local_response_norm
- paddle.nn.functional.local_response_norm(x, size, alpha=0.0001, beta=0.75, k=1.0, data_format='NCHW', name=None) [source]
Local Response Normalization performs a type of “lateral inhibition” by normalizing over local input regions. For more information, please refer to ImageNet Classification with Deep Convolutional Neural Networks.

The formula is as follows:

\[Output(i, x, y) = Input(i, x, y) / \left(k + \alpha \sum\limits^{\min(C-1, i + size/2)}_{j = \max(0, i - size/2)}(Input(j, x, y))^2\right)^{\beta}\]

In the above equation:

- \(C\) : The number of channels of the input.
- \(size\) : The number of channels to sum over.
- \(k\) : The offset, which avoids division by zero.
- \(\alpha\) : The scaling parameter.
- \(\beta\) : The exponent parameter.
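To make the summation bounds concrete, the snippet below is a minimal transcription of the equation above for a 4-D NCHW tensor, written with plain Paddle ops. It is only an illustration of what the formula says (here \(size/2\) is taken as integer division); it is not the library's kernel, whose boundary handling or internal scaling may differ.

import paddle

def lrn_reference(x, size, alpha=1e-4, beta=0.75, k=1.0):
    # x: 4-D NCHW Tensor; direct transcription of the formula above, for illustration only.
    c = x.shape[1]
    outs = []
    for i in range(c):
        lo = max(0, i - size // 2)       # j = max(0, i - size/2)
        hi = min(c - 1, i + size // 2)   # j = min(C-1, i + size/2)
        sq_sum = paddle.sum(x[:, lo:hi + 1] ** 2, axis=1)  # sum of squares over the channel window
        outs.append(x[:, i] / (k + alpha * sq_sum) ** beta)
    return paddle.stack(outs, axis=1)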
 - Parameters
           - x (Tensor) – The input 3-D/4-D/5-D tensor. The data type is float32. 
- size (int) – The number of channels to sum over. 
- alpha (float, optional) – The scaling parameter, positive. Default: 1e-4
- beta (float, optional) – The exponent, positive. Default: 0.75
- k (float, optional) – An offset, positive. Default: 1.0 
- data_format (str, optional) – Specify the data format of the input; the data format of the output will be consistent with that of the input. The value depends on the rank of x: if x is a 3-D Tensor, it can be “NCL” or “NLC”, where “NCL” means the data is stored in the order [batch_size, input_channels, feature_length]; if x is a 4-D Tensor, it can be “NCHW” or “NHWC”, where “NCHW” means the order [batch_size, input_channels, input_height, input_width]; if x is a 5-D Tensor, it can be “NCDHW” or “NDHWC”, where “NCDHW” means the order [batch_size, input_channels, input_depth, input_height, input_width]. Default: “NCHW”. See the sketch after this parameter list.
- name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name. 
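As noted in the data_format entry above, the channel axis does not have to come second. A minimal sketch for a 3-D input stored as [batch_size, feature_length, channels] (shapes and size chosen arbitrarily for illustration):

import paddle

x = paddle.rand(shape=(4, 100, 32), dtype="float32")  # [batch_size, feature_length, channels]
y = paddle.nn.functional.local_response_norm(x, size=5, data_format="NLC")
print(y.shape)
# [4, 100, 32] -> the output layout matches the input layout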
 
- Returns
A tensor storing the transformation result, with the same shape and data type as the input.
Examples:

import paddle

x = paddle.rand(shape=(3, 3, 112, 112), dtype="float32")
y = paddle.nn.functional.local_response_norm(x, size=5)
print(y.shape)
# [3, 3, 112, 112]
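For use inside a model built from layers, the same operation is also available in layer form; the sketch below assumes a companion class paddle.nn.LocalResponseNorm with the same parameters as the functional form (the class is not documented on this page, so treat its exact name and signature as an assumption):

import paddle

# Assumed layer-style usage; paddle.nn.LocalResponseNorm is taken to mirror the functional API.
lrn = paddle.nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=1.0)
x = paddle.rand(shape=(3, 3, 112, 112), dtype="float32")
y = lrn(x)
print(y.shape)
# [3, 3, 112, 112]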
