LocalResponseNorm

class paddle.nn.LocalResponseNorm(size, alpha=0.0001, beta=0.75, k=1.0, data_format='NCHW', name=None) [source]

Local Response Normalization performs a type of “lateral inhibition” by normalizing over local input regions. For more information, please refer to ImageNet Classification with Deep Convolutional Neural Networks.
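For reference, the normalization in the cited paper takes the form below (a sketch of the paper's formulation; see local_response_norm for the exact variant implemented):

$$
b^{i}_{x,y} = a^{i}_{x,y} \left( k + \alpha \sum_{j=\max(0,\; i - n/2)}^{\min(N-1,\; i + n/2)} \left( a^{j}_{x,y} \right)^{2} \right)^{-\beta}
$$

where $a^{i}_{x,y}$ is the input activation of channel $i$ at spatial position $(x, y)$, $n$ corresponds to size, $N$ is the total number of channels, and $k$, $\alpha$, $\beta$ match the parameters of the same names below.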

See more details in local_response_norm.

Parameters
  • size (int) – The number of channels to sum over.

  • alpha (float, optional) – The scaling parameter, positive. Default: 1e-4.

  • beta (float, optional) – The exponent, positive. Default: 0.75.

  • k (float, optional) – An offset, positive. Default: 1.0

  • data_format (str, optional) – Specify the data format of the input; the data format of the output is consistent with that of the input. For a 3-D Tensor, the string can be “NCL” or “NLC”. When it is “NCL”, the data is stored in the order [batch_size, input_channels, feature_length]. For a 4-D Tensor, the string can be “NCHW” or “NHWC”. When it is “NCHW”, the data is stored in the order [batch_size, input_channels, input_height, input_width]. For a 5-D Tensor, the string can be “NCDHW” or “NDHWC”. When it is “NCDHW”, the data is stored in the order [batch_size, input_channels, input_depth, input_height, input_width]. Default: “NCHW”. See the channels-last sketch after this parameter list.

  • name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
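As a quick illustration of data_format (a minimal sketch; the shapes are chosen arbitrarily), the same layer can consume channels-last input:

>>> import paddle

>>> # 4-D channels-last input: [batch_size, input_height, input_width, input_channels]
>>> x = paddle.rand(shape=(2, 56, 56, 3), dtype="float32")
>>> m = paddle.nn.LocalResponseNorm(size=5, data_format="NHWC")
>>> y = m(x)
>>> print(y.shape)
[2, 56, 56, 3]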

Shape:
  • input: 3-D/4-D/5-D tensor.

  • output: 3-D/4-D/5-D tensor, the same shape as input.

Examples

>>> import paddle

>>> x = paddle.rand(shape=(3, 3, 112, 112), dtype="float32")
>>> m = paddle.nn.LocalResponseNorm(size=5)
>>> y = m(x)
>>> print(y.shape)
[3, 3, 112, 112]
forward(input)

Defines the computation performed at every call. Should be overridden by all subclasses.

Parameters
  • *inputs (tuple) – Unpacked tuple arguments.

  • **kwargs (dict) – Unpacked dict arguments.
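Calling the layer instance invokes forward. As context (a minimal sketch, assuming only the public paddle.nn.Layer API; ScaledLRN is a hypothetical subclass for illustration):

>>> import paddle

>>> class ScaledLRN(paddle.nn.Layer):  # hypothetical subclass, for illustration only
...     def __init__(self, size, scale=2.0):
...         super().__init__()
...         self.lrn = paddle.nn.LocalResponseNorm(size=size)
...         self.scale = scale
...     def forward(self, x):
...         # The computation performed at every call.
...         return self.lrn(x) * self.scale

>>> m = ScaledLRN(size=5)
>>> y = m(paddle.rand(shape=(3, 3, 8, 8), dtype="float32"))  # calls forward under the hood
>>> print(y.shape)
[3, 3, 8, 8]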

extra_repr()

Extra representation of this layer. You can provide a custom implementation in your own layer.
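For example (a minimal sketch; MyLayer is hypothetical, and the exact repr layout may vary by Paddle version), overriding extra_repr lets printing a layer show its configuration:

>>> import paddle

>>> class MyLayer(paddle.nn.Layer):  # hypothetical layer, for illustration only
...     def __init__(self, size):
...         super().__init__()
...         self.size = size
...     def forward(self, x):
...         return x
...     def extra_repr(self):
...         # The returned string is embedded into repr(layer).
...         return f'size={self.size}'

>>> print(MyLayer(size=5))
MyLayer(size=5)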