celu

paddle.nn.functional.celu(x, alpha=1.0, name=None)

celu activation.

Applies the Continuously Differentiable Exponential Linear Unit (CELU) activation element-wise to the input Tensor:

\[\operatorname{celu}(x) = \max(0, x) + \min(0, \alpha * (\mathrm{e}^{x/\alpha}-1))\]
Parameters
  • x (Tensor) – The input Tensor with data type float16, float32, or float64.

  • alpha (float, optional) – The alpha value in the CELU formula. Default is 1.0.

  • name (str, optional) – Name for the operation. Generally, no setting is required; for details, please refer to Name. Default: None.

Returns

A Tensor with the same data type and shape as x.

Examples

>>> import paddle
>>> import paddle.nn.functional as F

>>> x = paddle.to_tensor([[-1., 6.], [1., 15.6]])
>>> out = F.celu(x, alpha=0.2)
>>> print(out)
Tensor(shape=[2, 2], dtype=float32, place=Place(cpu), stop_gradient=True,
[[-0.19865242,  6.        ],
 [ 1.        , 15.60000038]])
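
The formula above can be checked against this example with plain NumPy. The sketch below is a reference implementation of the CELU expression only, not Paddle's actual kernel:

```python
import numpy as np

def celu_ref(x, alpha=1.0):
    # celu(x) = max(0, x) + min(0, alpha * (exp(x / alpha) - 1))
    # np.expm1 computes exp(v) - 1 with better precision for small v.
    x = np.asarray(x, dtype=np.float64)
    return np.maximum(0.0, x) + np.minimum(0.0, alpha * np.expm1(x / alpha))

out = celu_ref([[-1.0, 6.0], [1.0, 15.6]], alpha=0.2)
print(out)  # the (0, 0) entry is approximately -0.19865242, matching F.celu
```

For x = -1 and alpha = 0.2, the max term is 0 and the min term is 0.2 * (e^{-5} - 1), which gives the -0.19865242 seen in the example output; positive inputs pass through unchanged.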