GroupNorm

class paddle.fluid.dygraph.GroupNorm(name_scope, groups, epsilon=1e-05, param_attr=None, bias_attr=None, act=None, data_layout='NCHW')[source]

This interface is used to construct a callable object of the GroupNorm class. For more details, refer to the code examples. It implements the function of the Group Normalization Layer. Refer to Group Normalization.

Parameters
  • name_scope (str) – The name of this class.

  • groups (int) – The number of groups into which the channels are divided.

  • epsilon (float, optional) – The small value added to the variance to prevent division by zero. Default: 1e-05.

  • param_attr (ParamAttr, optional) – The parameter attribute for the learnable scale \(g\). If it is set to False, no scale will be applied to the output units. If it is set to None, the scale is initialized to one. Default: None.

  • bias_attr (ParamAttr, optional) – The parameter attribute for the learnable bias \(b\). If it is set to False, no bias will be added to the output units. If it is set to None, the bias is initialized to zero. Default: None.

  • act (str, optional) – Activation to be applied to the output of group normalization. Default: None.

  • data_layout (str, optional) – Specify the input data format. Only NCHW (num_batches, channels, height, width) is supported. Default: NCHW.

Returns

None
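The normalization the layer performs can be sketched as a NumPy reference (an illustrative sketch of the Group Normalization math, not Paddle's implementation; the helper name `group_norm_ref` is hypothetical):

```python
import numpy as np

def group_norm_ref(x, groups, epsilon=1e-5, scale=None, bias=None):
    """Illustrative NumPy sketch of GroupNorm over an NCHW tensor."""
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    # Split channels into groups and normalize each group independently
    # over its channels and spatial positions.
    g = x.reshape(n, groups, c // groups, h, w)
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    g = (g - mean) / np.sqrt(var + epsilon)
    out = g.reshape(n, c, h, w)
    # Optional per-channel learnable scale g and bias b.
    if scale is not None:
        out = out * scale.reshape(1, c, 1, 1)
    if bias is not None:
        out = out + bias.reshape(1, c, 1, 1)
    return out

# NCHW input: batch=8, channels=32 (divisible by groups=4).
x = np.random.random((8, 32, 16, 16)).astype('float32')
y = group_norm_ref(x, groups=4)
```

After normalization, each group of channels in each sample has (approximately) zero mean and unit variance, up to the epsilon term.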

Examples

import paddle.fluid as fluid
import numpy as np

with fluid.dygraph.guard():
    # NCHW input: batch=8, channels=32 (divisible by groups=4)
    x = np.random.random((8, 32, 16, 16)).astype('float32')
    groupNorm = fluid.dygraph.nn.GroupNorm('GroupNorm', groups=4)
    ret = groupNorm(fluid.dygraph.base.to_variable(x))

forward(input)

Defines the computation performed at every call. Should be overridden by all subclasses.

Parameters
  • *inputs (tuple) – unpacked tuple arguments

  • **kwargs (dict) – unpacked dict arguments
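Subclasses override forward, but users invoke the layer by calling the instance itself; the layer machinery routes the call to forward. A minimal plain-Python sketch of that dispatch pattern (illustrative only; `SimpleLayer` and `Doubler` are hypothetical names, and the real fluid.dygraph.Layer also handles parameter tracking and more):

```python
class SimpleLayer:
    """Sketch of the call-to-forward dispatch used by dygraph layers."""

    def __call__(self, *inputs, **kwargs):
        # All positional and keyword arguments are forwarded unchanged.
        return self.forward(*inputs, **kwargs)

    def forward(self, *inputs, **kwargs):
        raise NotImplementedError("subclasses must override forward")

class Doubler(SimpleLayer):
    def forward(self, x):
        return 2 * x

layer = Doubler()
result = layer(21)  # dispatches to Doubler.forward
```

This is why the example above calls `groupNorm(...)` directly rather than `groupNorm.forward(...)`.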