mxnet.lr_scheduler.CosineScheduler
class mxnet.lr_scheduler.CosineScheduler(max_update, base_lr=0.01, final_lr=0, warmup_steps=0, warmup_begin_lr=0, warmup_mode='linear')[source]

Reduce the learning rate according to a cosine function.
Calculate the new learning rate by:

final_lr + (base_lr - final_lr) * (1 + cos(pi * nup / max_nup)) / 2

if nup < max_nup, where nup is the number of updates so far and max_nup is max_update; after max_nup updates the learning rate stays at final_lr.
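The decay rule can be sketched in plain Python (a minimal illustration of the formula, not the library implementation; warmup is omitted here):

```python
import math

def cosine_lr(num_update, max_update, base_lr=0.01, final_lr=0.0):
    """Cosine decay from base_lr down to final_lr over max_update steps."""
    if num_update < max_update:
        return final_lr + (base_lr - final_lr) * \
            (1 + math.cos(math.pi * num_update / max_update)) / 2
    # After max_update, the learning rate stays at final_lr.
    return final_lr
```

At step 0 the cosine term is 1, so the rate equals base_lr; halfway through it has fallen to the midpoint of base_lr and final_lr; at max_update and beyond it equals final_lr.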
Parameters

- max_update (int) – maximum number of updates before the decay reaches final_lr
- base_lr (float) – base learning rate
- final_lr (float) – final learning rate after all steps
- warmup_steps (int) – number of warmup steps used before this scheduler starts decay
- warmup_begin_lr (float) – if using warmup, the learning rate from which it starts warming up
- warmup_mode (string) – warmup can be done in two modes: 'linear' mode gradually increases lr with each step in equal increments; 'constant' mode keeps lr at warmup_begin_lr for warmup_steps
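The two warmup modes described above can be sketched as a small standalone function (an illustrative re-statement of the documented behavior, not the library code):

```python
def warmup_lr(num_update, warmup_steps, warmup_begin_lr, base_lr,
              warmup_mode='linear'):
    """Learning rate during warmup, i.e. while num_update < warmup_steps."""
    if warmup_mode == 'linear':
        # Climb from warmup_begin_lr toward base_lr in equal increments.
        increase = (base_lr - warmup_begin_lr) * num_update / warmup_steps
        return warmup_begin_lr + increase
    # 'constant' mode: hold warmup_begin_lr for the whole warmup phase.
    return warmup_begin_lr
```

In 'linear' mode the rate reaches base_lr exactly when warmup ends; in 'constant' mode it jumps from warmup_begin_lr to the scheduled rate at that point.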
__init__(max_update, base_lr=0.01, final_lr=0, warmup_steps=0, warmup_begin_lr=0, warmup_mode='linear')[source]

Initialize self. See help(type(self)) for accurate signature.
Methods

__init__(max_update[, base_lr, final_lr, …]) – Initialize self.
get_warmup_lr(num_update)
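Putting warmup and decay together, the scheduler's call behavior can be sketched as a single dispatch (a hypothetical class for illustration, assuming the cosine decay runs over the max_update - warmup_steps updates that remain after warmup; not the library implementation):

```python
import math

class CosineSketch:
    """Sketch of the scheduler's behavior: warmup first, then cosine decay."""

    def __init__(self, max_update, base_lr=0.01, final_lr=0.0,
                 warmup_steps=0, warmup_begin_lr=0.0, warmup_mode='linear'):
        self.max_update = max_update
        self.base_lr = base_lr
        self.final_lr = final_lr
        self.warmup_steps = warmup_steps
        self.warmup_begin_lr = warmup_begin_lr
        self.warmup_mode = warmup_mode
        # Assumption: decay spans the steps left over after warmup.
        self.max_steps = max_update - warmup_steps

    def get_warmup_lr(self, num_update):
        if self.warmup_mode == 'linear':
            frac = num_update / self.warmup_steps
            return self.warmup_begin_lr + (self.base_lr - self.warmup_begin_lr) * frac
        return self.warmup_begin_lr  # 'constant' mode

    def __call__(self, num_update):
        if num_update < self.warmup_steps:
            return self.get_warmup_lr(num_update)
        if num_update <= self.max_update:
            # Cosine decay over the post-warmup window.
            t = (num_update - self.warmup_steps) / self.max_steps
            return self.final_lr + (self.base_lr - self.final_lr) * \
                (1 + math.cos(math.pi * t)) / 2
        return self.final_lr
```

The ordering matters: warmup is checked first, so the cosine curve only begins once num_update reaches warmup_steps, and the rate never drops below final_lr afterward.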