silu

paddle.nn.functional.silu(x, name=None) [source]

silu activation.

    silu(x) = x / (1 + e^{-x})

Parameters:

    x (Tensor): The input Tensor with data type float32, float64.

    name (str, optional): Name for the operation (optional, default is None). For more information, please refer to Name.

Returns:

A Tensor with the same data type and shape as x.

Examples:


import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([1.0, 2.0, 3.0, 4.0])
out = F.silu(x)  # [0.731059, 1.761594, 2.857722, 3.928055]
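As a cross-check on the formula above, silu(x) = x / (1 + e^{-x}), which is equivalent to x * sigmoid(x). A minimal pure-Python sketch (using only the standard math module, no paddle dependency) reproduces the values shown in the example:

```python
import math

def silu(x):
    # silu(x) = x / (1 + e^{-x}), i.e. x * sigmoid(x)
    return x / (1.0 + math.exp(-x))

# Matches the paddle example inputs [1.0, 2.0, 3.0, 4.0]
print([round(silu(v), 6) for v in [1.0, 2.0, 3.0, 4.0]])
# [0.731059, 1.761594, 2.857722, 3.928055]
```

Note that silu(x) approaches x for large positive x and approaches 0 for large negative x, which is why it behaves like a smooth variant of ReLU.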