mxnet.optimizer
Optimizer API of MXNet.
Optimization methods

| AdaDelta | The AdaDelta optimizer. |
| AdaGrad | AdaGrad optimizer. |
| Adam | The Adam optimizer. |
| Adamax | The AdaMax optimizer. |
| DCASGD | The DCASGD optimizer. |
| FTML | The FTML optimizer. |
| Ftrl | The Ftrl optimizer. |
| LBSGD | The Large Batch SGD optimizer with momentum and weight decay. |
| NAG | Nesterov accelerated SGD. |
| Nadam | The Nesterov Adam optimizer. |
| Optimizer | The base class inherited by all optimizers. |
| RMSProp | The RMSProp optimizer. |
| SGD | The SGD optimizer with momentum and weight decay. |
| SGLD | Stochastic Gradient Riemannian Langevin Dynamics. |
| Signum | The Signum optimizer that takes the sign of gradient or momentum. |
| Updater | Updater for kvstore. |
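Any of the optimizers listed above can be constructed directly and handed to a gluon Trainer. The following is a minimal sketch; the tiny Dense network, random input, and hyperparameter values are illustrative assumptions, not part of this API reference:

import mxnet as mx
from mxnet import gluon, autograd

# Illustrative model and data (not part of the optimizer API).
net = gluon.nn.Dense(1)
net.initialize()

# An optimizer instance can be passed to a gluon Trainer
# (a name string such as 'adam' plus optimizer_params also works).
opt = mx.optimizer.Adam(learning_rate=0.001)
trainer = gluon.Trainer(net.collect_params(), opt)

x = mx.nd.random.uniform(shape=(4, 3))
with autograd.record():
    loss = net(x).sum()
loss.backward()
trainer.step(batch_size=4)  # applies one Adam update to all parameters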
Helper functions

| create | Instantiates an optimizer with a given name and kwargs. |
| get_updater | Returns a closure of the updater needed for kvstore. |
| register | Registers a new optimizer. |
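Taken together, these helpers let you register a custom optimizer, construct it by name, and wrap it in the updater closure used by kvstore. A minimal sketch follows; MyPlainSGD and its hyperparameters are hypothetical, introduced only to illustrate the calls:

import mxnet as mx

# Hypothetical optimizer for illustration; not part of mxnet.optimizer.
@mx.optimizer.register
class MyPlainSGD(mx.optimizer.Optimizer):
    """Plain gradient descent with weight decay and no momentum."""

    def create_state(self, index, weight):
        return None  # no per-parameter state is needed

    def update(self, index, weight, grad, state):
        lr = self._get_lr(index)
        wd = self._get_wd(index)
        weight[:] = weight - lr * (grad + wd * weight)

# create() looks an optimizer up by its registered (lowercased) name.
opt = mx.optimizer.create('myplainsgd', learning_rate=0.05)

# get_updater() wraps the optimizer in the closure used by kvstore; it keeps
# per-index state and applies updates in place.
updater = mx.optimizer.get_updater(opt)

weight = mx.nd.ones((3,))
grad = mx.nd.ones((3,)) * 0.2
updater(0, grad, weight)  # weight is updated in place for parameter index 0
print(weight.asnumpy())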