Prevent zero-variance instability in BaseProbaRegressor.predict_proba#956
Open
kindler-king wants to merge 5 commits into sktime:main from
Conversation
…nverter_store)

In `BaseProbaRegressor._check_C`, the censoring indicator `C` was being converted using `self._y_converter_store` instead of the dedicated `self._C_converter_store`. This could silently corrupt inverse-transform state when `y` and `C` have different mtypes (e.g. `pd.DataFrame` vs `ndarray`), causing both to share the same converter dictionary. Also fixes the stale copy-paste comment that said 'convert y to y_inner_mtype' inside `_check_C`.

Fixes: sktime#749
fkiraly
requested changes
Mar 21, 2026
Collaborator
I would say this is a hack. Instead of clipping, I would return a Delta distribution if the variance is below machine epsilon (possibly times a factor).
Also, code formatting tests are failing. Please look at the dev guide, and pre-commit.
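The reviewer's Delta-distribution suggestion could be sketched roughly as follows. This is an illustrative dispatch only, not skpro's actual API: the function name, the tuple encoding of the result, and the `eps_factor` parameter are all assumptions.

```python
import numpy as np

def choose_distribution(mu, var, eps_factor=1.0):
    """Illustrative dispatch: return a point mass ("delta") when the
    predicted variance is below machine epsilon times a safety factor,
    otherwise a Normal parameterisation ("normal", mu, sigma)."""
    threshold = np.finfo(float).eps * eps_factor
    if var < threshold:
        return ("delta", mu)  # degenerate: all mass at the prediction
    return ("normal", mu, float(np.sqrt(var)))
```

The advantage over clipping is that a zero-variance prediction is represented honestly as a point mass rather than as an almost-degenerate Normal.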
Reference Issues/PRs
Fixes #955
What does this implement/fix?
This PR fixes a numerical instability in `BaseProbaRegressor.predict_proba`. When `predict_var` returns 0, the fallback Normal distribution is constructed with `sigma=0`, which leads to divide-by-zero warnings and NaN values when evaluating `pdf` or `log_pdf`.

To prevent this, the predicted variance is clipped to machine epsilon before computing the standard deviation:

```python
pred_var = np.clip(pred_var, np.finfo(float).eps, None)
```

This ensures the resulting Normal distribution always has a strictly positive scale while leaving normal model outputs effectively unchanged.
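The failure mode and the fix can be demonstrated with plain numpy (the `normal_logpdf` helper below is illustrative, not code from the PR):

```python
import numpy as np

def normal_logpdf(x, mu, sigma):
    # plain-numpy Normal log-density; sigma=0 triggers divide-by-zero / NaN
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

pred_var = np.array([0.0, 1.0])

with np.errstate(divide="ignore", invalid="ignore"):
    bad = normal_logpdf(0.0, 0.0, np.sqrt(pred_var))  # NaN at zero variance

clipped = np.clip(pred_var, np.finfo(float).eps, None)
good = normal_logpdf(0.0, 0.0, np.sqrt(clipped))      # finite everywhere
```

With `sigma=0`, the log term diverges and the quadratic term evaluates to `0/0`, so the first entry of `bad` is NaN; after clipping, both entries of `good` are finite.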
Does your contribution introduce a new dependency?
No.
What should a reviewer concentrate their feedback on?
Did you add any tests for the change?
Yes.
A regression test was added that uses a mock regressor returning zero variance and verifies that:

- `predict_proba().pdf()` and `log_pdf()` remain finite
- no numerical warnings are raised
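The shape of that regression test could look roughly like this. The `ZeroVarRegressor` class and `predict_proba_logpdf` helper are stand-ins for the PR's actual mock and for the fixed `predict_proba` path, not the real test code:

```python
import warnings
import numpy as np

class ZeroVarRegressor:
    """Minimal mock: always predicts mean 0 with zero variance."""
    def predict(self, X):
        return np.zeros(len(X))
    def predict_var(self, X):
        return np.zeros(len(X))  # degenerate variance on purpose

def predict_proba_logpdf(reg, X, x=0.0):
    # mirrors the fixed behaviour: clip variance before building the Normal
    var = np.clip(reg.predict_var(X), np.finfo(float).eps, None)
    mu = reg.predict(X)
    sigma = np.sqrt(var)
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

X = np.zeros((3, 1))
with warnings.catch_warnings():
    warnings.simplefilter("error")  # turn any numerical warning into a failure
    logpdf = predict_proba_logpdf(ZeroVarRegressor(), X)
```

Elevating warnings to errors inside the test checks both conditions at once: any divide-by-zero warning fails the test immediately, and a finiteness assertion on `logpdf` covers the NaN case.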