alpha_dropout

paddle.nn.functional.alpha_dropout(x, p=0.5, training=True, name=None) [source]

Alpha Dropout is a variant of Dropout that maintains the self-normalizing property: for an input with zero mean and unit standard deviation, the output of Alpha Dropout preserves that mean and standard deviation. It pairs well with the SELU activation function because, instead of zeroing activations, it randomly sets them to SELU's negative saturation value and then applies an affine correction.
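The mean/standard-deviation preservation can be illustrated with a framework-independent sketch. This is a minimal NumPy reference of the standard alpha-dropout formulation (drop to the SELU saturation value alpha' = -scale * alpha, then rescale with an affine transform), not Paddle's actual kernel; the function name alpha_dropout_ref is illustrative.

```python
import numpy as np

def alpha_dropout_ref(x, p=0.5, rng=None):
    # Sketch of alpha dropout on a NumPy array; assumes the usual
    # SELU constants. Not Paddle's implementation.
    rng = np.random.default_rng() if rng is None else rng
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    alpha_p = -scale * alpha          # SELU negative saturation value
    q = 1.0 - p                       # keep probability
    # Affine correction chosen so that zero mean / unit variance
    # inputs keep zero mean / unit variance in expectation.
    a = (q + alpha_p ** 2 * q * p) ** -0.5
    b = -a * p * alpha_p
    mask = rng.random(x.shape) < q    # True = keep, False = drop
    return a * np.where(mask, x, alpha_p) + b

# For a large standard-normal input, mean stays near 0 and std near 1:
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = alpha_dropout_ref(x, p=0.5, rng=rng)
```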

Parameters
  • x (Tensor) – The input tensor. The data type is float32 or float64.

  • p (float | int) – Probability of a unit being dropped, i.e. set to the negative saturation value. Default 0.5.

  • training (bool) – A flag indicating whether it is in the training phase. If False, the input is returned unchanged. Default True.

  • name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.

Returns

A Tensor representing the result of alpha dropout, with the same shape and data type as x.

Examples

import paddle

x = paddle.to_tensor([[-1, 1], [-1, 1]]).astype(paddle.float32)
y_train = paddle.nn.functional.alpha_dropout(x, 0.5)
y_test = paddle.nn.functional.alpha_dropout(x, 0.5, training=False)
print(y_train)
# Tensor(shape=[2, 2], dtype=float32, place=Place(cpu), stop_gradient=True,
#        [[-0.10721093, -0.77919382],
#         [-0.10721093,  1.66559887]]) (randomly)
print(y_test)
# Tensor(shape=[2, 2], dtype=float32, place=Place(cpu), stop_gradient=True,
#        [[-1.,  1.],
#         [-1.,  1.]])