soft_relu

paddle.fluid.layers.soft_relu(x, threshold=40.0, name=None)[source]

SoftRelu Activation Operator.

$out = \ln(1 + \exp(\max(\min(x, threshold), -threshold)))$
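The formula can be checked with a plain NumPy sketch; `soft_relu` here is a local helper written for illustration, not the Paddle operator itself:

```python
import numpy as np

def soft_relu(x, threshold=40.0):
    # ln(1 + exp(x)) after clipping x into [-threshold, threshold];
    # log1p(exp(.)) matches the formula above.
    clipped = np.clip(x, -threshold, threshold)
    return np.log1p(np.exp(clipped))

x = np.array([[0.0, 1.0], [2.0, 3.0]], dtype=np.float32)
print(soft_relu(x))  # approx [[0.693, 1.313], [2.127, 3.049]]
```

Note that for `x = 0` the result is `ln(2) ≈ 0.6931`, which matches the first value in the example output below.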

Parameters
  • x (Variable) – Input of the soft_relu operator. Its data type can be float32 or float64.

  • threshold (float, optional) – The threshold value of soft_relu. Default: 40.0.

  • name (str, optional) – The default value is None. Normally there is no need for users to set this property. For more information, please refer to Name .

Returns

Output of the soft_relu operator, with the same shape and LoD as the input.

Return type

Variable (Tensor|LoDTensor)

Examples

import paddle.fluid as fluid
import numpy as np

# Declare a [2, 2] input without the implicit batch dimension, so a
# plain [2, 2] array can be fed directly.
inputs = fluid.layers.data(
    name="x", shape=[2, 2], dtype="float32", append_batch_size=False)
output = fluid.layers.soft_relu(inputs, threshold=20.0)

exe = fluid.Executor(fluid.CPUPlace())
exe.run(fluid.default_startup_program())

img = np.array([[0, 1], [2, 3]]).astype(np.float32)

res = exe.run(fluid.default_main_program(), feed={'x': img}, fetch_list=[output])
print(res)  # [array([[0.6931472, 1.3132616], [2.126928 , 3.0485873]], dtype=float32)]