mxnet.test_utils.numeric_grad

mxnet.test_utils.numeric_grad(executor, location, aux_states=None, eps=0.0001, use_forward_train=True, dtype=numpy.float32)

Calculates a numeric gradient via the finite-difference method.
Based on Theano’s theano.gradient.numeric_grad [1].
Parameters
executor (Executor) – Executor that computes the forward pass.
location (list of numpy.ndarray or dict of str to numpy.ndarray) – Argument values used as the location at which to compute the gradient. Maps the name of each argument to the corresponding numpy.ndarray. Values of all the arguments must be provided.
aux_states (None or list of numpy.ndarray or dict of str to numpy.ndarray, optional) – Auxiliary state values used as the location at which to compute the gradient. Maps the name of each auxiliary state to the corresponding numpy.ndarray. Values of all the auxiliary states must be provided.
eps (float, optional) – Epsilon for the finite-difference method.
use_forward_train (bool, optional) – Whether to run the forward pass with is_train=True.
dtype (np.float16 or np.float32 or np.float64) – Datatype for mx.nd.array.
References

[1] https://github.com/Theano/Theano/blob/master/theano/gradient.py
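Example

The finite-difference idea behind this helper can be sketched in plain NumPy. The sketch below is illustrative only: numeric_grad itself perturbs the arrays in location and re-runs the executor’s forward pass rather than calling a Python function, and its exact perturbation scheme may differ.

    import numpy as np

    def numeric_grad_sketch(f, x, eps=1e-4):
        """Approximate the gradient of a scalar-valued f at x by central differences.

        Plain-NumPy illustration of the finite-difference method; not the
        actual mxnet.test_utils.numeric_grad implementation.
        """
        x = np.asarray(x, dtype=np.float64)
        grad = np.zeros_like(x)
        it = np.nditer(x, flags=['multi_index'])
        while not it.finished:
            idx = it.multi_index
            orig = x[idx]
            x[idx] = orig + eps / 2      # perturb one element up
            f_plus = f(x)
            x[idx] = orig - eps / 2      # perturb the same element down
            f_minus = f(x)
            x[idx] = orig                # restore the original value
            grad[idx] = (f_plus - f_minus) / eps
            it.iternext()
        return grad

    # Usage: the gradient of f(x) = sum(x ** 2) is 2 * x.
    x0 = np.array([1.0, 2.0, 3.0])
    print(numeric_grad_sketch(lambda x: np.sum(x ** 2), x0))  # approximately [2. 4. 6.]

A smaller eps reduces truncation error but increases floating-point cancellation error, which is why gradient checks of this kind are usually compared against analytic gradients with a tolerance rather than for exact equality.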