How to use learning rate scheduler #15744
Unanswered
talhaanwarch asked this question in code help: CV
Replies: 1 comment 1 reply
-
You can add an `lr_scheduler_step` method inside the LightningModule class, which PyTorch Lightning will call at each step of the training loop to update the optimizer's learning rate.
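A minimal sketch of what that can look like (the model, optimizer, and `CosineAnnealingLR` scheduler here are illustrative choices, not from the original post; the hook signature shown is the two-argument one from recent Lightning releases, while some older 1.x versions also pass an `optimizer_idx` argument):

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)
        # "interval": "step" makes Lightning step the scheduler every
        # training step instead of once per epoch.
        return {
            "optimizer": optimizer,
            "lr_scheduler": {"scheduler": scheduler, "interval": "step"},
        }

    # Lightning calls this hook whenever the scheduler is due to step.
    def lr_scheduler_step(self, scheduler, metric):
        if metric is None:
            scheduler.step()
        else:
            scheduler.step(metric)
```

With a standard `torch.optim.lr_scheduler` scheduler like this one, the default behavior already does the right thing; overriding `lr_scheduler_step` mainly matters for custom schedulers whose `step()` does not follow the standard interface.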
-
Hi. In a previous version of Lightning I was using the LR scheduler this way, but after installing the latest version I am getting an error. How can I figure it out? I could not find an example in the documentation; if anyone can provide replacement code, that would be great. I am not sure how to add `lr_scheduler_step`.