ClipGradByValue

class paddle.nn.ClipGradByValue(max, min=None) [source]

Limit the value of multi-dimensional Tensor \(X\) to the range [min, max].

  • Any values less than min are set to min.

  • Any values greater than max are set to max.
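
To illustrate the element-wise rule above, the short sketch below applies the same [min, max] saturation to a plain tensor using paddle.clip; the sample values and bounds are assumptions chosen only for demonstration and are not part of the gradient-clipping machinery itself.

>>> import paddle
>>> g = paddle.to_tensor([-2.5, -0.3, 0.8, 3.0])
>>> # Values below -1.0 become -1.0, values above 1.0 become 1.0,
>>> # values already inside the range are unchanged.
>>> clipped = paddle.clip(g, min=-1.0, max=1.0)  # -> [-1.0, -0.3, 0.8, 1.0]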

The multi-dimensional Tensor \(X\) is not passed to this class directly; instead, it refers to the gradients of all parameters set in the optimizer. If need_clip of a specific parameter is False in its ParamAttr, the gradients of that parameter will not be clipped.

Gradient clipping takes effect after being set in the optimizer; see the optimizer documentation (for example: SGD).

Note

need_clip of ClipGradByValue HAS BEEN DEPRECATED since 2.0. Please use need_clip in ParamAttr to specify the clip scope.

Parameters
  • max (float) – The maximum value to clip by.

  • min (float, optional) – The minimum value to clip by. If not set by the user, it will be set to -max automatically; in this case, max must be greater than \(0\).
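
As a minimal sketch of that default (with an assumed bound of 5.0), the two instances below describe the same clipping range:

>>> import paddle
>>> clip_a = paddle.nn.ClipGradByValue(max=5.0)            # min defaults to -max, i.e. the range [-5.0, 5.0]
>>> clip_b = paddle.nn.ClipGradByValue(min=-5.0, max=5.0)  # equivalent explicit form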

Examples

>>> import paddle
>>> x = paddle.uniform([10, 10], min=-1.0, max=1.0, dtype='float32')
>>> linear = paddle.nn.Linear(in_features=10, out_features=10,
...                           weight_attr=paddle.ParamAttr(need_clip=True),
...                           bias_attr=paddle.ParamAttr(need_clip=False))
>>> out = linear(x)
>>> loss = paddle.mean(out)
>>> loss.backward()

>>> clip = paddle.nn.ClipGradByValue(min=-1, max=1)
>>> sgd = paddle.optimizer.SGD(learning_rate=0.1, parameters=linear.parameters(), grad_clip=clip)
>>> sgd.step()
