leaky_relu

paddle.fluid.layers.leaky_relu(x, alpha=0.02, name=None)[source]

LeakyRelu Activation Operator.

$$out = max(x, alpha * x)$$
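The elementwise formula above can be illustrated with a small NumPy sketch (a hedged stand-in for the operator, not the Paddle implementation itself):

```python
import numpy as np

def leaky_relu_ref(x, alpha=0.02):
    # Elementwise: out = max(x, alpha * x).
    # For x >= 0 this returns x; for x < 0 it returns alpha * x.
    return np.maximum(x, alpha * x)

x = np.array([[-1.0, 2.0], [3.0, -4.0]], dtype=np.float32)
out = leaky_relu_ref(x, alpha=0.1)
```

With `alpha=0.1`, the negative entries are scaled by 0.1 while the positive entries pass through unchanged, matching the fetched result in the Examples section below.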

Parameters
  • x (Variable) – Input LoDTensor or Tensor holding preactivation values. Supported data types: float32, float64

  • alpha (float, optional) – Slope of the activation function for x < 0. Default: 0.02

  • name (str, optional) – The default value is None. Normally there is no need for the user to set this property. For more information, please refer to Name

Returns

A LoDTensor or Tensor with the same data type and shape as x

Return type

output(Variable)

Examples

import paddle.fluid as fluid
import numpy as np

# Graph Organizing
x = fluid.layers.data(name="x", shape=[2], dtype="float32")
res = fluid.layers.leaky_relu(x, alpha=0.1)

# Create an executor (CPU is used here as an example)
exe = fluid.Executor(fluid.CPUPlace())

# Execute
x_i = np.array([[-1, 2], [3, -4]]).astype(np.float32)
res_val, = exe.run(fluid.default_main_program(), feed={'x':x_i}, fetch_list=[res])
print(res_val) # [[-0.1, 2], [3, -0.4]]