Replies: 1 comment
If TF2 works but PyTorch does not, then it seems to be an issue with the L-BFGS implementation in PyTorch.
Hello,
I have run into an issue when using the PyTorch backend. When I switch my optimizer from Adam to "L-BFGS-B", after a few thousand iterations my losses become NaN. After that, I cannot even use the best trained model for prediction: `model.predict` returns nothing when this happens. It's worth mentioning that I don't face this issue when I have TF2 as my backend. I was wondering whether anyone else has faced this problem before. Thanks.
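For reference, a minimal, self-contained sketch of the failure mode in plain PyTorch (a hypothetical toy model, not the poster's actual code): `torch.optim.LBFGS` with a closure, using the `strong_wolfe` line search (which is often more stable than the default fixed-step variant) and a finiteness check so training stops before a NaN loss corrupts the weights and makes prediction unusable.

```python
import torch

torch.manual_seed(0)

# Toy regression data (stand-in for the poster's actual problem).
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = torch.sin(3 * x)

model = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
)

# line_search_fn="strong_wolfe" is often more stable than the default
# (a fixed step of size lr) and can reduce NaN blow-ups.
optimizer = torch.optim.LBFGS(
    model.parameters(), lr=1.0, max_iter=20, line_search_fn="strong_wolfe"
)

def closure():
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    return loss

loss = closure()
for step in range(50):
    loss = optimizer.step(closure)
    if not torch.isfinite(loss):
        # Stop (or restore a saved checkpoint) instead of continuing to
        # train into NaN, which would leave the model unusable.
        break

print(float(loss))
```

Saving a checkpoint (`torch.save(model.state_dict(), ...)`) whenever the loss improves, and reloading it when a NaN appears, is a common workaround for keeping a usable "best" model even if L-BFGS later diverges.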