Do additional training points always decrease uncertainty? #198

@nbgl

Description

Say I have a predictor p trained on data points x1, …, xn. This predictor has a predictive variance of v at a test point t.

Say I also have a predictor p' trained on x1, …, xn, xn+1. This has a predictive variance of v' at the same point t.

Is it necessarily true that v' ≤ v? My conjecture is yes, but I haven't been able to prove it.
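If p is a Gaussian-process regressor with *fixed* kernel and noise hyperparameters (an assumption — the issue doesn't specify the model), the conjecture holds: conditioning a jointly Gaussian model on an extra observation can never increase the posterior variance at t. Here is a minimal numpy sketch checking this numerically; the RBF kernel, noise level, and data values are illustrative choices, not taken from the issue:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel: k(a, b) = exp(-(a - b)^2 / (2 * ls^2))
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def posterior_var(X, t, noise=1e-2, ls=1.0):
    # GP posterior variance at test point t with fixed hyperparameters:
    # v(t) = k(t, t) - k_t^T (K + noise * I)^{-1} k_t
    K = rbf(X, X, ls) + noise * np.eye(len(X))
    k_t = rbf(X, np.array([t]), ls).ravel()
    return 1.0 - k_t @ np.linalg.solve(K, k_t)

X = np.array([-1.0, 0.3, 2.0])   # x1, ..., xn (illustrative)
t = 1.0
v = posterior_var(X, t)
v_prime = posterior_var(np.append(X, 0.9), t)  # add xn+1 = 0.9
assert v_prime <= v  # extra conditioning point never increases variance
```

Note the caveat: if adding xn+1 triggers a *re-fit* of the kernel hyperparameters (e.g. by marginal-likelihood optimization), the variance at t can go up, because v and v' are then computed under different kernels.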
