dropout2d( x, p=0.5, training=True, data_format='NCHW', name=None )
Randomly zero out entire channels (in a batched 4-D input tensor with shape NCHW, a channel is a 2-D feature map with shape HW). Each channel is zeroed out independently on every forward call with probability p, using samples from a Bernoulli distribution.
See paddle.nn.functional.dropout for more details.
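The channel-wise behavior described above can be sketched in plain NumPy. This is an illustrative approximation, not the library implementation; it assumes the "upscale_in_train" convention used by paddle.nn.functional.dropout, where surviving values are scaled by 1/(1-p) during training and inference is the identity:

```python
import numpy as np

def dropout2d_sketch(x, p=0.5, training=True, rng=None):
    """NumPy sketch of channel-wise dropout on an NCHW tensor.

    Assumes upscale_in_train scaling: kept channels are multiplied
    by 1/(1-p) during training; at inference the input is returned
    unchanged. One Bernoulli draw is made per (sample, channel).
    """
    if not training or p == 0.0:
        return x
    rng = np.random.default_rng() if rng is None else rng
    n, c = x.shape[:2]
    # Shape (n, c, 1, 1) so the keep/drop decision broadcasts
    # over the whole H x W feature map of each channel.
    keep = rng.random((n, c, 1, 1)) >= p
    return x * keep / (1.0 - p)

x = np.ones((2, 3, 4, 5), dtype='float32')
y = dropout2d_sketch(x, p=0.5)
# With p=0.5, each (i, j) channel of y is either all zeros
# or all 2.0 (the survivors are scaled by 1/(1-0.5)).
```

Because the mask has shape (n, c, 1, 1), every element of a given feature map shares the same keep/drop decision, which is exactly what distinguishes dropout2d from element-wise dropout.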
x (Tensor) – The input is a 4-D Tensor with shape [N, C, H, W] or [N, H, W, C]. The data type is float32 or float64.
p (float) – Probability of setting units to zero. Default 0.5.
training (bool) – A flag indicating whether it is in the training phase or not. Default True.
data_format (str, optional) – Specify the data format of the input; the data format of the output will be consistent with that of the input. An optional string from NCHW or NHWC. The default is NCHW. When it is NCHW, the data is stored in the order of: [batch_size, input_channels, input_height, input_width]; when it is NHWC, the order is [batch_size, input_height, input_width, input_channels].
name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
Returns a Tensor representing the result of dropout2d, with the same shape and data type as x.
import paddle
import numpy as np

x = np.random.random(size=(2, 3, 4, 5)).astype('float32')
x = paddle.to_tensor(x)
y_train = paddle.nn.functional.dropout2d(x)  # train
y_test = paddle.nn.functional.dropout2d(x, training=False)  # test
for i in range(2):
    for j in range(3):
        print(x.numpy()[i, j, :, :])
        print(y_train.numpy()[i, j, :, :])  # a channel may be all 0
        print(y_test.numpy()[i, j, :, :])