llm.schedulers
Custom learning rate schedules.
LinearWarmupLR
LinearWarmupLR(
    optimizer: Optimizer,
    total_steps: int,
    warmup_steps: int = 0,
    last_epoch: int = -1,
)
Bases: _LRScheduler
Linear warmup and decay LR scheduler.
Source: ColossalAI
Parameters:

- optimizer (Optimizer) – Optimizer to adjust the learning rate of.
- total_steps (int) – Total number of training steps.
- warmup_steps (int, default: 0) – Number of steps over which to linearly warm up the learning rate.
- last_epoch (int, default: -1) – Optional index of the last epoch.
Source code in llm/schedulers.py
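A minimal usage sketch, assuming a standard PyTorch training loop; the toy model and SGD optimizer below are illustrative, not part of this API:

```python
import torch
from llm.schedulers import LinearWarmupLR

model = torch.nn.Linear(10, 2)  # toy model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Warm up linearly for the first 100 steps, then decay until step 1000.
scheduler = LinearWarmupLR(optimizer, total_steps=1000, warmup_steps=100)

for step in range(1000):
    # ... forward/backward pass omitted ...
    optimizer.step()
    scheduler.step()  # advance the schedule once per training step
```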
get_lr
Compute the current learning rate.
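Only the summary of get_lr is shown here; the full implementation lives in llm/schedulers.py. A sketch of the usual linear warmup-then-linear-decay rule, in the style of the ColossalAI scheduler this class credits (the exact formula below is an assumption, so check the source):

```python
def get_lr(self):
    # Assumed formula, mirroring common linear warmup/decay schedulers.
    # Warmup phase: scale each base LR linearly from near 0 up to base_lr.
    if self.last_epoch < self.warmup_steps:
        return [(self.last_epoch + 1) / (self.warmup_steps + 1) * lr
                for lr in self.base_lrs]
    # Decay phase: scale linearly from base_lr down to 0 at total_steps.
    return [(self.total_steps - self.last_epoch)
            / (self.total_steps - self.warmup_steps) * lr
            for lr in self.base_lrs]
```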