leaky_relu

paddle.nn.functional.leaky_relu(x, negative_slope=0.01, name=None) [source]

The leaky_relu activation function.

\[leaky\_relu(x)=
    \begin{cases}
        x, & \text{if } x \geq 0 \\
        negative\_slope * x, & \text{otherwise}
    \end{cases}\]
Parameters
  • x (Tensor) – The input Tensor with data type float32 or float64.

  • negative_slope (float, optional) – Slope of the activation function at \(x < 0\). Default is 0.01.

  • name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.

Returns

A Tensor with the same data type and shape as x.

Examples

import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([-2., 0., 1.])
out = F.leaky_relu(x) # [-0.02, 0., 1.]
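
A larger negative_slope lets more of the negative signal pass through. The snippet below is an illustrative sketch (not part of the official example) that passes negative_slope explicitly:

import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([-2., 0., 1.])
# Negative inputs are scaled by negative_slope instead of being zeroed.
out = F.leaky_relu(x, negative_slope=0.2)  # [-0.4, 0., 1.]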