gelu

paddle.nn.functional.gelu(x, approximate=False, name=None) [source]

GELU activation.

The GELU activation function is applied element-wise. For more information, refer to Gaussian Error Linear Units.

If approximate is True:

\[gelu(x) = 0.5 * x * (1 + tanh(\sqrt{\frac{2}{\pi}} * (x + 0.044715x^{3})))\]

else:

\[gelu(x) = 0.5 * x * (1 + erf(\frac{x}{\sqrt{2}}))\]
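
Both definitions can be reproduced with basic tensor ops. The following is a minimal sketch (a check written for this page, not part of the API) that evaluates each form directly and compares it against F.gelu:

>>> import math
>>> import paddle
>>> import paddle.nn.functional as F

>>> x = paddle.to_tensor([[-1, 0.5], [1, 1.5]])
>>> # exact form: 0.5 * x * (1 + erf(x / sqrt(2)))
>>> exact = 0.5 * x * (1 + paddle.erf(x / math.sqrt(2.0)))
>>> assert paddle.allclose(exact, F.gelu(x))
>>> # tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
>>> approx = 0.5 * x * (1 + paddle.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
>>> assert paddle.allclose(approx, F.gelu(x, approximate=True))
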
Parameters
  • x (Tensor) – The input Tensor with data type float32 or float64.

  • approximate (bool, optional) – Whether to enable approximation. Default is False.

  • name (str, optional) – For details, please refer to Name. Generally, no setting is required. Default: None.

Returns

A Tensor with the same data type and shape as x.

Examples

>>> import paddle
>>> import paddle.nn.functional as F

>>> x = paddle.to_tensor([[-1, 0.5], [1, 1.5]])
>>> out1 = F.gelu(x)
>>> print(out1)
Tensor(shape=[2, 2], dtype=float32, place=Place(cpu), stop_gradient=True,
[[-0.15865529,  0.34573123],
 [ 0.84134471,  1.39978933]])
>>> out2 = F.gelu(x, approximate=True)
>>> print(out2)
Tensor(shape=[2, 2], dtype=float32, place=Place(cpu), stop_gradient=True,
[[-0.15880796,  0.34571400],
 [ 0.84119201,  1.39957154]])
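
For use inside model definitions, the layer counterpart paddle.nn.GELU exposes the same approximate switch and produces the same result as the functional form above:

>>> import paddle

>>> layer = paddle.nn.GELU(approximate=True)
>>> x = paddle.to_tensor([[-1, 0.5], [1, 1.5]])
>>> print(layer(x))
Tensor(shape=[2, 2], dtype=float32, place=Place(cpu), stop_gradient=True,
[[-0.15880796,  0.34571400],
 [ 0.84119201,  1.39957154]])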