Replies: 1 comment
-
It's not that uncommon for numerical optimizers like those in JAXopt to end up in parts of parameter space that produce NaNs, so I'm not totally shocked by this outcome. You may need to tune your optimizer's parameters to take smaller steps, or reparameterize your problem to exclude the invalid regions. It can also be useful to identify a set of inputs where the NaNs appear without running any optimization, to help diagnose what's going on. In the meantime, you could also share the full stack trace that accompanies that error message, but I expect tuning the algorithms will be the most fruitful approach!
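To illustrate the two suggestions above, here is a minimal sketch (with a made-up toy objective, not the actual MLE from the question): probing a grid of inputs to locate where the gradient turns NaN without running any optimizer, and reparameterizing with softplus so the optimizer can't step into the invalid region in the first place.

```python
import jax
import jax.numpy as jnp

# Toy objective with an invalid region: sqrt(x) produces NaN for x < 0,
# so an optimizer taking too-large steps can wander into NaN territory.
def objective(x):
    return (jnp.sqrt(x) - 1.0) ** 2  # minimum at x = 1

# Reparameterize: optimize an unconstrained u and map it into the valid
# region with softplus, so x = softplus(u) is always strictly positive.
def reparam_objective(u):
    x = jax.nn.softplus(u)
    return objective(x)

grad_naive = jax.grad(objective)
grad_reparam = jax.grad(reparam_objective)

# Probe a grid of inputs to find where NaNs appear *without* running the
# optimizer -- this isolates the bad region for diagnosis.
xs = jnp.linspace(-1.0, 2.0, 7)
naive_nans = jnp.isnan(jax.vmap(grad_naive)(xs))      # True for x < 0
reparam_nans = jnp.isnan(jax.vmap(grad_reparam)(xs))  # all False
```

The same grid-probing idea applies to the real likelihood: scan plausible parameter values, record where NaNs first show up, and that usually points at the term (a log, sqrt, or division) that needs reparameterizing.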
-
Hello, I am quite new to all of this, so I'm sorry if I'm doing something wrong :)
I am writing a pretty complex MLE, making quite intensive use of both JAX and JAXopt. Everything is working quite well, but when I try to compute the gradient of the final likelihood (which is itself found by running some optimization and bisection algorithms), I get some NaN values.
Setting
the NaN values don't disappear, and setting
I get the message
Note that this appears even after removing the jit decorator.
I am thus pretty stuck, and don't really know where these NaN values are coming from. What is a reasonable step to take after all these "canonical" steps have failed? Should I just give up on the idea of getting any sort of gradient out of such a complex function? (That would be something pretty amazing to have, but it's clearly not fundamental.)
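For reference, one of the "canonical" debugging switches alluded to above (the exact snippets were lost from the post) is presumably JAX's jax_debug_nans flag. A minimal sketch of what it does, with a stand-in loss rather than the actual likelihood:

```python
import jax
import jax.numpy as jnp

# With jax_debug_nans enabled, JAX raises FloatingPointError at the first
# primitive that produces a NaN, instead of silently propagating it to the
# final output -- the traceback then points at the offending operation.
jax.config.update("jax_debug_nans", True)

def loss(x):
    return jnp.sqrt(x)  # sqrt of a negative input yields NaN

caught = False
try:
    jax.grad(loss)(-1.0)  # the forward pass already hits sqrt(-1) -> NaN
except FloatingPointError:
    caught = True  # the raised error identifies the NaN-producing primitive
```

When this flag fires inside jit-compiled code, JAX re-runs the computation op by op outside of jit to localize the NaN, which is why removing the jit decorator yourself is the usual companion step.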