# silu

paddle.nn.functional.silu(x, name=None)

silu (Sigmoid Linear Unit) activation.

$silu(x) = \frac{x}{1 + e^{-x}}$

Where $x$ is the input Tensor.
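The formula is equivalent to $x \cdot \mathrm{sigmoid}(x)$. As a minimal pure-Python sketch of the math (not the Paddle implementation, which operates elementwise on Tensors):

```python
import math

def silu(x):
    # silu(x) = x / (1 + exp(-x)) = x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

print(silu(1.0))  # ≈ 0.7310585786300049
print(silu(0.0))  # 0.0
```

Note that silu is smooth and non-monotonic: it is zero at the origin, approaches the identity for large positive inputs, and approaches zero for large negative inputs.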

Parameters
• x (Tensor) – The input Tensor with data type float32, float64.

• name (str, optional) – For details, please refer to Name. Generally, no setting is required. Default: None.

Returns

A Tensor with the same data type and shape as x.

Examples

    import paddle
    import paddle.nn.functional as F

    x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
    out = F.silu(x)
    # out: [0.731059, 1.761594, 2.857722, 3.928055]