selu
paddle.nn.functional.selu(x, scale=1.0507009873554805, alpha=1.6732632423543772, name=None)
selu activation

\[
selu(x) = scale *
\begin{cases}
x, & \text{if } x > 0 \\
alpha * e^{x} - alpha, & \text{if } x \le 0
\end{cases}
\]
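To make the piecewise definition concrete, here is a minimal scalar reference in plain Python. It is a sketch for illustration only, not part of the Paddle API; the default constants are copied from the signature above.

```python
import math

def selu_reference(x, scale=1.0507009873554805, alpha=1.6732632423543772):
    # Piecewise definition from the formula above:
    #   x > 0  -> scale * x
    #   x <= 0 -> scale * (alpha * e^x - alpha)
    if x > 0:
        return scale * x
    return scale * (alpha * math.exp(x) - alpha)

print(selu_reference(1.0))   # ~1.050701, matching the example output below
print(selu_reference(-1.0))  # ~-1.111330
```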
Parameters
- x (Tensor) – The input Tensor with data type float32 or float64.
- scale (float, optional) – The scale value for selu; must be greater than 1.0 (see the sketch after this list). Default is 1.0507009873554804934193349852946.
- alpha (float, optional) – The alpha value for selu; must be no less than zero. Default is 1.6732632423543772848170429916717.
- name (str, optional) – Name for the operation. Generally, no setting is required. For details, please refer to Name. Default: None.
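Both constants can be overridden when calling the function. The following sketch assumes only the signature documented above; the printed values follow from the formula, since every input here is non-negative.

```python
import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([[0.0, 1.0], [2.0, 3.0]])
# scale must be greater than 1.0 and alpha no less than zero
out = F.selu(x, scale=1.1, alpha=2.0)
print(out)
# positive entries are multiplied by `scale`; selu(0) = 0 in either branch,
# so the result is approximately [[0.0, 1.1], [2.2, 3.3]]
```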
 
Returns
A Tensor with the same data type and shape as x.
Examples

```python
import paddle
import paddle.nn.functional as F

x = paddle.to_tensor([[0.0, 1.0], [2.0, 3.0]])
out = F.selu(x)
print(out)
# [[0, 1.050701], [2.101402, 3.152103]]
```
