clip_grad_norm_

paddle.nn.utils.clip_grad_norm_(parameters, max_norm, norm_type=2.0, error_if_nonfinite=False)

Clips the gradient norm of an iterable of parameters.

The norm is computed over all gradients together, as if they were concatenated into a single vector. Gradients are modified in place.
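As an illustration, here is a minimal sketch of that global-norm computation (the helper global_grad_norm is hypothetical, not part of the API):

>>> import paddle
>>> def global_grad_norm(parameters, norm_type=2.0):
...     # Flatten every available gradient into a 1-D vector.
...     grads = [p.grad.flatten() for p in parameters if p.grad is not None]
...     # Concatenate them into one vector and take its p-norm.
...     return paddle.linalg.norm(paddle.concat(grads), p=norm_type)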

This API can only run in dynamic graph mode, not static graph mode.

Parameters
  • parameters (Iterable[paddle.Tensor] or paddle.Tensor) – an iterable of Tensors or a single Tensor whose gradients will be normalized

  • max_norm (float or int) – maximum allowed norm of the gradients

  • norm_type (float or int) – type of the p-norm to use. Can be float('inf') for the infinity norm. Default: 2.0

  • error_if_nonfinite (bool) – if True, an error is raised if the total norm of the gradients from parameters is nan, inf, or -inf. Default: False. (See the sketch after this list.)
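A hedged sketch of the error_if_nonfinite behavior described above (the NaN-producing loss is contrived for illustration, and the exact exception type is an assumption, hence the broad except):

>>> import paddle
>>> linear = paddle.nn.Linear(10, 10)
>>> # Multiplying the loss by NaN makes every gradient NaN.
>>> (paddle.mean(linear(paddle.uniform([4, 10]))) * float('nan')).backward()
>>> try:
...     paddle.nn.utils.clip_grad_norm_(linear.parameters(), max_norm=1.0,
...                                     error_if_nonfinite=True)
... except Exception:
...     # The total norm is non-finite, so an error is raised instead of clipping.
...     print('non-finite gradient norm detected')
non-finite gradient norm detected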

Returns

Total norm of the parameter gradients (treated as a single vector).

Example

>>> import paddle

>>> x = paddle.uniform([10, 10], min=-1.0, max=1.0, dtype='float32')
>>> max_norm = 5.0
>>> linear = paddle.nn.Linear(in_features=10, out_features=10)
>>> out = linear(x)
>>> loss = paddle.mean(out)
>>> loss.backward()

>>> # Clip in place so the global gradient norm is at most max_norm.
>>> paddle.nn.utils.clip_grad_norm_(linear.parameters(), max_norm)

>>> sgd = paddle.optimizer.SGD(learning_rate=0.1, parameters=linear.parameters())
>>> sgd.step()
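The call also returns the total gradient norm, which can be captured for logging. A hedged continuation of the example above (the clear/forward/backward/clip/step pattern is a common sketch, not mandated by the API):

>>> # Clear old gradients, then run another step while logging the norm.
>>> sgd.clear_grad()
>>> loss = paddle.mean(linear(x))
>>> loss.backward()
>>> total_norm = paddle.nn.utils.clip_grad_norm_(linear.parameters(), max_norm)
>>> sgd.step()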