# rrelu¶

paddle.nn.functional.rrelu(x, lower=0.125, upper=0.3333333333333333, training=True, name=None) [source]

rrelu activation.

Applies the randomized leaky rectified linear unit function to improve generalization performance, as described in the paper: Empirical Evaluation of Rectified Activations in Convolutional Network

During training, a negative slope is randomly sampled for each activation value as described below:

$$rrelu(x)= \begin{cases} x, & \text{if } x \ge 0 \\ a * x, & \text{otherwise} \end{cases}$$

where $x$ is the input tensor and $a$ is randomly sampled from a uniform distribution on the interval ($lower$, $upper$).

In the test phase, the negative slope is fixed to the average of $lower$ and $upper$:

$$rrelu(x)= \begin{cases} x, & \text{if } x \ge 0 \\ (lower + upper) * 0.5 * x, & \text{otherwise} \end{cases}$$

where $x$ is the input tensor, and $lower$ and $upper$ are the bounds of the uniform distribution.
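The two formulas above can be checked with a minimal NumPy sketch (a reference for the math only, not Paddle's actual kernel; the function name `rrelu_reference` is illustrative):

```python
import numpy as np

def rrelu_reference(x, lower=0.125, upper=1 / 3, training=True, rng=None):
    # NumPy sketch of the rrelu formulas; not Paddle's implementation.
    x = np.asarray(x, dtype=np.float64)
    if training:
        if rng is None:
            rng = np.random.default_rng()
        # training: one slope per element, sampled from U(lower, upper)
        a = rng.uniform(lower, upper, size=x.shape)
    else:
        # test phase: fixed slope equal to the mean of the bounds
        a = (lower + upper) * 0.5
    return np.where(x >= 0, x, a * x)

# eval mode is deterministic: negative values are scaled by (0.125 + 1/3) / 2
out = rrelu_reference([-2.0, 3.0], training=False)
```

In training mode each negative input gets its own slope, so repeated calls on the same tensor give different results; in test mode the output is deterministic.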

Parameters
• x (Tensor) – The input Tensor with data type float16, float32, float64.

• lower (float, optional) – The lower bound of uniform distribution. Default: 0.125.

• upper (float, optional) – The upper bound of uniform distribution. Default: 0.3333333333333333 (1/3).

• training (bool, optional) – Whether the call is in training mode. If False, the deterministic test-phase formula is used. Default: True.

• name (str, optional) – For details, please refer to Name. Generally, no setting is required. Default: None.

Returns

A Tensor with the same data type and shape as x.

Examples

import paddle
import paddle.nn.functional as F

input_tensor = paddle.to_tensor([[[[-2.0,  3.0, -4.0,  5.0],
                                   [ 3.0, -4.0,  5.0, -6.0],
                                   [-7.0, -8.0,  8.0,  9.0]],
                                  [[ 1.0, -2.0, -3.0,  4.0],
                                   [-5.0,  6.0,  7.0, -8.0],
                                   [ 6.0,  7.0,  8.0,  9.0]]]], dtype='float32')

out = F.rrelu(input_tensor, 0.1, 0.3)
print(out)
# negative entries are scaled by a randomly sampled slope, so their exact
# values vary from run to run; one possible output:
#[[[[-0.20000899  3.         -0.8810822   5.        ]
#   [ 3.         -0.55175185  5.         -1.0776101 ]
#   [-1.0680687  -1.9896201   8.          9.        ]]
#  [[ 1.         -0.5238267  -0.65515125  4.        ]
#   [-1.3766339   6.          7.         -2.3465784 ]
#   [ 6.          7.          8.          9.        ]]]]