Pad2D

class paddle.nn.Pad2D(padding, mode='constant', value=0.0, data_format='NCHW', name=None) [source]

This interface is used to construct a callable object of the Pad2D class. It pads the input tensor according to padding, mode and value. If mode is 'reflect', padding[0] and padding[1] must be no greater than width - 1; padding[2] and padding[3] must likewise be no greater than height - 1.

Parameters
  • padding (Tensor|list[int]|int) – The padding size, with data type int. If it is an int, the same padding is used on all four sides. Otherwise, it must contain four values in the form (pad_left, pad_right, pad_top, pad_bottom).

  • mode (str, optional) –

    Four modes are supported: 'constant' (default), 'reflect', 'replicate' and 'circular'.

    • 'constant' mode: pads the input tensor with a constant value.

    • 'reflect' mode: pads the input tensor with a reflection of the input boundary (the boundary value itself is not repeated).

    • 'replicate' mode: pads the input tensor by replicating the input boundary values.

    • 'circular' mode: pads the input tensor by wrapping the input around (circular padding).

  • value (float, optional) – The value used to fill the padded areas. Only used when mode is 'constant'. Default is 0.0.

  • data_format (str, optional) – A string from: "NCHW", "NHWC", specifying the data format of the input data. Default is "NCHW".

  • name (str, optional) – For details, please refer to Name. Generally, no setting is required. Default: None.

Returns

None

Examples

import paddle
import paddle.nn as nn

input_shape = (1, 1, 2, 3)
pad = [1, 0, 1, 2]
mode = "constant"
data = paddle.arange(paddle.prod(paddle.to_tensor(input_shape)), dtype="float32").reshape(input_shape) + 1
my_pad = nn.Pad2D(padding=pad, mode=mode)
result = my_pad(data)
print(result)
# [[[[0. 0. 0. 0.]
#    [0. 1. 2. 3.]
#    [0. 4. 5. 6.]
#    [0. 0. 0. 0.]
#    [0. 0. 0. 0.]]]]
forward(x)

Defines the computation performed at every call. Should be overridden by all subclasses.

Parameters
  • x (Tensor) – The input tensor to be padded, with a layout matching data_format.

extra_repr()

Extra representation of this layer; you can provide a custom implementation in your own layer.