Hello,

first of all, thank you very much for your implementation. I noticed a small "incompatibility" when using the scheduler: in newer versions, PyTorch recommends using `.get_last_lr()` over `.get_lr()` and has added a warning to the latter:
```python
def get_lr(self):
    if not self._get_lr_called_within_step:
        warnings.warn("To get the last learning rate computed by the scheduler, "
                      "please use `get_last_lr()`.", UserWarning)
```
When plugging your scheduler into existing (newer) code bases, this raises an `AttributeError`, as some of them call `get_last_lr()`. A quick fix would be adding the following to your class:
```python
def get_last_lr(self):
    return self.get_lr()
```
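To illustrate, here is a minimal, self-contained sketch of how the shim slots into a scheduler that only defines `get_lr()`. `ConstantScheduler` is a hypothetical stand-in for this kind of third-party scheduler, not the actual implementation from this repository:

```python
import torch


class ConstantScheduler:
    """Toy stand-in for a custom scheduler that only defines get_lr()."""

    def __init__(self, optimizer):
        self.optimizer = optimizer

    def get_lr(self):
        # Return the current learning rate(s), one per param group,
        # matching the list convention of torch.optim schedulers.
        return [group["lr"] for group in self.optimizer.param_groups]

    def get_last_lr(self):
        # Compatibility shim: newer training loops call get_last_lr(),
        # so delegate to the existing get_lr().
        return self.get_lr()

    def step(self):
        pass  # a real scheduler would update the learning rate here


model = torch.nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = ConstantScheduler(optimizer)
print(scheduler.get_last_lr())  # [0.1], works with newer code paths
```

(If the scheduler already subclasses `torch.optim.lr_scheduler._LRScheduler` and calls the base `step()`, `get_last_lr()` comes for free from the base class, since it keeps `self._last_lr` up to date.)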
Best Regards,
L