Commit b9635b7
Hananeh Oliaei
Configured optimizer and learning rate scheduling with a configurable warm-up length.
1. Previously, the linear warm-up in the schedulers (LinearLR, CosineAnnealingLR, and SequentialLR) was hard-coded to a single step. This has been replaced with the configurable `warmup_length`, enabling a more flexible and consistent setup across all schedulers (see the sketch below).
2. Defined a `warmup_length` parameter to allow consistent control over the linear warm-up phase.

1 parent 30490c1
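A minimal sketch of how such a configurable warm-up could be wired up with PyTorch's built-in schedulers. The model, optimizer, and the concrete values of `warmup_length` and `total_steps` below are illustrative assumptions, not taken from the diff.

```python
# Sketch only: assumes PyTorch's torch.optim.lr_scheduler API.
import torch
from torch.optim.lr_scheduler import LinearLR, CosineAnnealingLR, SequentialLR

model = torch.nn.Linear(10, 2)                      # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

warmup_length = 500                                 # configurable warm-up steps (previously hard-coded to 1)
total_steps = 10_000                                # illustrative training length

# Linear warm-up for `warmup_length` steps, then cosine annealing for the remainder.
warmup = LinearLR(optimizer, start_factor=0.01, end_factor=1.0,
                  total_iters=warmup_length)
cosine = CosineAnnealingLR(optimizer, T_max=total_steps - warmup_length)
scheduler = SequentialLR(optimizer, schedulers=[warmup, cosine],
                         milestones=[warmup_length])

for step in range(total_steps):
    optimizer.step()                                # would normally follow a backward pass
    scheduler.step()                                # advance the warm-up / cosine schedule
```

Keeping `warmup_length` as a single parameter means all three schedulers stay in agreement: the warm-up duration and the `SequentialLR` milestone cannot drift apart.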
2 files changed, +8 −4 lines changed