silu

paddle.nn.functional.silu(x, name=None) [source]

silu activation, also known as the Sigmoid Linear Unit (SiLU):

\[silu(x) = \frac{x}{1 + e^{-x}}\]

where \(x\) is the input Tensor. This is equivalent to \(x \cdot \mathrm{sigmoid}(x)\).
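As a sanity check, the formula can be evaluated directly in plain Python (a standalone sketch using only the standard library, not the Paddle API):

```python
import math

def silu(x):
    # silu(x) = x / (1 + e^{-x})
    return x / (1.0 + math.exp(-x))

# Matches the values in the Examples section below.
print([round(silu(v), 6) for v in [1.0, 2.0, 3.0, 4.0]])
# [0.731059, 1.761594, 2.857722, 3.928055]
```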

Parameters
  • x (Tensor) – The input Tensor with data type bfloat16, float16, float32, float64.

  • name (str, optional) – For details, please refer to Name. Generally, no setting is required. Default: None.

Returns

A Tensor with the same data type and shape as x.

Examples

import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
out = F.silu(x)  # [0.731059, 1.761594, 2.857722, 3.928055]