Relu Activation Operator.
$$out = \max(x, 0)$$
- Parameters
    x (Variable) – Input of Relu operator, an N-D Tensor with data type float32, float64, or float16.
    name (str, optional) – The default value is None. Normally there is no need for the user to set this property. For more information, please refer to Name.
- Returns
    Output of Relu operator, a Tensor with the same shape as the input.
- Return type
    Variable
- Examples

    import paddle.fluid as fluid
    import numpy as np

    in1 = np.array([[-1, 0], [1, 2.6]])
    with fluid.dygraph.guard():
        x1 = fluid.dygraph.to_variable(in1)
        out1 = fluid.layers.relu(x1)
        print(out1.numpy())
        # [[0.  0. ]
        #  [1.  2.6]]
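The elementwise formula $out = \max(x, 0)$ can be checked independently of Paddle with plain NumPy; this is a minimal sketch using the same input values as the example above:

```python
import numpy as np

# Elementwise ReLU: every negative entry becomes 0, non-negative entries pass through.
x = np.array([[-1.0, 0.0], [1.0, 2.6]])
out = np.maximum(x, 0)
print(out)
# [[0.  0. ]
#  [1.  2.6]]
```

`np.maximum(x, 0)` broadcasts the scalar 0 against the array, which matches the operator's elementwise definition.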