elu

paddle.fluid.layers.elu(x, alpha=1.0, name=None)[source]

ELU Activation Operator.

Applies the following element-wise computation to the input, as described in https://arxiv.org/abs/1511.07289.

\(out = \max(0, x) + \min(0, \alpha * (e^x - 1))\)
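The formula above can be checked with a minimal NumPy sketch (`elu_ref` is a hypothetical reference helper, not part of the Paddle API):

```python
import numpy as np

def elu_ref(x, alpha=1.0):
    # Element-wise ELU: identity for x > 0, alpha * (exp(x) - 1) for x <= 0
    return np.maximum(0, x) + np.minimum(0, alpha * (np.exp(x) - 1))

x = np.array([[-1.0, 6.0], [1.0, 15.6]])
print(elu_ref(x, alpha=0.2))
# Matches the Paddle example below: only the negative entry is scaled.
```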

Parameters
  • x (Variable) – The input is a multi-dimensional Tensor. The data type is float32 or float64.

  • alpha (float, optional) – The alpha value of ELU. Default: 1.0.

  • name (str, optional) – The default value is None. Normally there is no need for users to set this property. For more information, please refer to Name.

Returns

The output is a multi-dimensional Tensor with the same shape and data type as x.

Return type

Variable

Examples

import paddle.fluid as fluid
import numpy as np

input_elu = np.array([[-1, 6], [1, 15.6]])  # dtype float64, which elu supports
with fluid.dygraph.guard():
    x = fluid.dygraph.to_variable(input_elu)
    y = fluid.layers.elu(x, alpha=0.2)
    print(y.numpy())
    # [[-0.12642411  6.        ]
    # [ 1.          15.6       ]]