mxnet.initializer.MSRAPrelu
class mxnet.initializer.MSRAPrelu(factor_type='avg', slope=0.25)
Initialize the weight according to the MSRA paper.
This initializer implements the scheme described in Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification, available at https://arxiv.org/abs/1502.01852.
This initializer is intended for weights used with ReLU-family activations; it makes some changes on top of the Xavier method.
Parameters
- factor_type (str, optional) – Can be 'avg', 'in', or 'out'.
- slope (float, optional) – Initial slope of any PReLU (or similar) nonlinearities.
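A minimal usage sketch, assuming MXNet 1.x with the Gluon API (the network and input shape below are only illustrative):

    import mxnet as mx
    from mxnet.gluon import nn

    net = nn.Dense(10)
    # Pass the initializer to initialize(); the weights are allocated and
    # filled on the first forward pass, once the input shape is known.
    net.initialize(mx.init.MSRAPrelu(factor_type='avg', slope=0.25))
    out = net(mx.nd.ones((2, 20)))
    print(net.weight.data())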
__init__(factor_type='avg', slope=0.25)
Initialize self. See help(type(self)) for accurate signature.
Methods
- __init__([factor_type, slope]) – Initialize self.
- dumps() – Saves the initializer to a string.
- set_verbosity([verbose, print_func]) – Switch verbose mode on or off.
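A short sketch of the auxiliary methods listed above (assuming MXNet 1.x; the exact string returned by dumps() may vary across versions):

    import mxnet as mx

    init = mx.init.MSRAPrelu(factor_type='in', slope=0.1)

    # dumps() serializes the initializer (its name and keyword arguments)
    # to a string, which can be passed wherever MXNet accepts an
    # initializer specification.
    print(init.dumps())

    # set_verbosity() toggles logging of initialization details.
    init.set_verbosity(verbose=True)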