GroupNorm

class paddle.nn.GroupNorm(num_groups, num_channels, epsilon=1e-05, weight_attr=None, bias_attr=None, data_format='NCHW', name=None) [source]

This interface constructs a callable object of the GroupNorm class. It implements the Group Normalization layer described in Group Normalization. For usage, refer to the code examples below.

Parameters
  • num_groups (int) – The number of groups into which the channels are divided.

  • num_channels (int) – The number of channels of input.

  • epsilon (float, optional) – The small value added to the variance to prevent division by zero. Default: 1e-05.

  • weight_attr (ParamAttr|bool, optional) – The parameter attribute for the learnable scale \(g\). If it is set to False, no scale will be applied to the output units. If it is set to None, the scale is initialized as one. Default: None.

  • bias_attr (ParamAttr|bool, optional) – The parameter attribute for the learnable bias \(b\). If it is set to False, no bias will be added to the output units. If it is set to None, the bias is initialized as zero. Default: None.

  • data_format (str, optional) – Specify the input data format. Only NCHW is supported. Default: NCHW.

  • name (str, optional) – Name for the GroupNorm. Default: None. For more information, please refer to Name.
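The underlying computation can be sketched in NumPy (a minimal reference for the math, not Paddle's implementation; the helper name `group_norm` is illustrative): the channels of each sample are split into num_groups groups, each group is standardized with its own mean and variance, and the result is optionally scaled by the per-channel \(g\) and shifted by the per-channel \(b\).

```python
import numpy as np

def group_norm(x, num_groups, g=None, b=None, epsilon=1e-5):
    """Illustrative group normalization for an NCHW array."""
    n, c, h, w = x.shape
    assert c % num_groups == 0, "num_channels must be divisible by num_groups"
    # Reshape so the channels of one group share a single mean/variance per sample.
    grouped = x.reshape(n, num_groups, c // num_groups, h, w)
    mean = grouped.mean(axis=(2, 3, 4), keepdims=True)
    var = grouped.var(axis=(2, 3, 4), keepdims=True)
    normed = (grouped - mean) / np.sqrt(var + epsilon)
    out = normed.reshape(n, c, h, w)
    # Per-channel affine parameters (defaults correspond to g=1, b=0).
    if g is not None:
        out = out * g.reshape(1, c, 1, 1)
    if b is not None:
        out = out + b.reshape(1, c, 1, 1)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 6, 2, 2)).astype('float32')
y = group_norm(x, num_groups=3)
# Each (sample, group) slice now has mean ~ 0 and variance ~ 1.
```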

Shape:
  • x: 4-D tensor with shape: (batch, num_features, height, width).

  • output: 4-D tensor with same shape as input x.

Returns

None

Examples

import paddle
import numpy as np

paddle.disable_static()  # enable dynamic-graph (imperative) mode
np.random.seed(123)
# Random NCHW input: batch=2, channels=6, height=2, width=2.
x_data = np.random.random(size=(2, 6, 2, 2)).astype('float32')
x = paddle.to_tensor(x_data)
# One group per channel: each channel is normalized independently.
group_norm = paddle.nn.GroupNorm(num_channels=6, num_groups=6)
group_norm_out = group_norm(x)

print(group_norm_out.numpy())  # same shape as the input: (2, 6, 2, 2)
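Two boundary cases help build intuition about num_groups (a NumPy sketch independent of Paddle; the helper `group_norm` is illustrative): num_groups=1 standardizes each sample over all of C, H, W (a layer-norm-style reduction), while num_groups equal to num_channels standardizes each channel of each sample on its own (an instance-norm-style reduction).

```python
import numpy as np

def group_norm(x, num_groups, epsilon=1e-5):
    # Minimal group norm without affine parameters (illustrative sketch).
    n, c, h, w = x.shape
    grouped = x.reshape(n, num_groups, -1)
    mean = grouped.mean(axis=2, keepdims=True)
    var = grouped.var(axis=2, keepdims=True)
    return ((grouped - mean) / np.sqrt(var + epsilon)).reshape(n, c, h, w)

rng = np.random.default_rng(123)
x = rng.standard_normal((2, 6, 2, 2)).astype('float32')

# num_groups == 1: one mean/variance per sample, over all C*H*W values.
y1 = group_norm(x, num_groups=1)
per_sample = (x - x.mean(axis=(1, 2, 3), keepdims=True)) / np.sqrt(
    x.var(axis=(1, 2, 3), keepdims=True) + 1e-5)

# num_groups == num_channels: one mean/variance per (sample, channel).
y6 = group_norm(x, num_groups=6)
per_channel = (x - x.mean(axis=(2, 3), keepdims=True)) / np.sqrt(
    x.var(axis=(2, 3), keepdims=True) + 1e-5)
```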
forward ( *inputs, **kwargs )

Defines the computation performed at every call. Should be overridden by all subclasses.

Parameters
  • *inputs (tuple) – unpacked tuple arguments

  • **kwargs (dict) – unpacked dict arguments

extra_repr ( )

Extra representation of this layer. You can override this method in your own layer to customize its printed representation.
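The mechanism can be illustrated without Paddle (a hypothetical minimal base class, not Paddle's actual Layer code): the base class's __repr__ embeds whatever extra_repr returns, so a subclass only needs to override extra_repr to surface its configuration.

```python
class Layer:
    """Hypothetical minimal base class illustrating the extra_repr hook."""
    def extra_repr(self):
        return ''  # subclasses override this

    def __repr__(self):
        # The base repr embeds whatever the subclass reports.
        return f'{type(self).__name__}({self.extra_repr()})'

class GroupNorm(Layer):
    def __init__(self, num_groups, num_channels):
        self.num_groups = num_groups
        self.num_channels = num_channels

    def extra_repr(self):
        # Surface the layer's configuration in its printed form.
        return f'num_groups={self.num_groups}, num_channels={self.num_channels}'

print(GroupNorm(6, 6))  # GroupNorm(num_groups=6, num_channels=6)
```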