PRelu

class paddle.fluid.dygraph.PRelu(name_scope, mode, param_attr=None)[source]

This interface is used to construct a callable object of the PRelu class. For more details, refer to the code examples. It implements the PRelu activation function with three weight-sharing modes.

Equation:

\[y = \max(0, x) + \alpha * \min(0, x)\]
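
As a quick illustration of the equation, the following minimal NumPy sketch assumes a single scalar alpha shared by all elements; it is only a reference computation, not part of the Paddle API:

import numpy as np

def prelu_ref(x, alpha):
    # y = max(0, x) + alpha * min(0, x), applied element-wise
    return np.maximum(0.0, x) + alpha * np.minimum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5], dtype='float32')
print(prelu_ref(x, alpha=0.25))  # [-0.5, -0.125, 0.0, 1.5]
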
Parameters
  • name_scope (str) – The name of this class.

  • mode (str) – The mode for weight sharing. It supports 'all', 'channel', and 'element'. 'all': all elements share the same weight; 'channel': elements in a channel share the same weight; 'element': each element has its own weight.

  • param_attr (ParamAttr, optional) – The parameter attribute for the learnable weight (alpha). Default: None.

Attribute:

weight (Parameter): the learnable weights of this layer.

Returns

None

Examples

import paddle.fluid as fluid
from paddle.fluid.dygraph.base import to_variable
import numpy as np

inp_np = np.ones([5, 200, 100, 100]).astype('float32')
with fluid.dygraph.guard():
    inp_np = to_variable(inp_np)
    mode = 'channel'
    # One learnable alpha per channel, initialized to 1.0.
    prelu = fluid.dygraph.PRelu(
        'prelu',
        mode=mode,
        param_attr=fluid.ParamAttr(initializer=fluid.initializer.Constant(1.0)))
    dy_rlt = prelu(inp_np)
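
As a follow-up usage sketch (run inside the same fluid.dygraph.guard() block as above; the exact weight shape may vary by Paddle version), the learnable alpha can be inspected through the weight attribute described under Attribute:

    alpha = prelu.weight   # the learnable alpha Parameter
    print(alpha.shape)     # with mode='channel', expected one alpha per input channel (200)
    print(dy_rlt.shape)    # same shape as the input, [5, 200, 100, 100]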
forward(input)

Defines the computation performed at every call. Should be overridden by all subclasses.

Parameters
  • input (Variable) – The input tensor to apply the PRelu activation to.
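
Because forward is the method every dygraph Layer overrides, a minimal custom-layer sketch may help. This is a hypothetical example (the DoubleThenRelu class is not part of Paddle) written against the same fluid 1.x dygraph conventions:

import numpy as np
import paddle.fluid as fluid
from paddle.fluid.dygraph.base import to_variable

class DoubleThenRelu(fluid.dygraph.Layer):
    def __init__(self, name_scope):
        super(DoubleThenRelu, self).__init__(name_scope)

    def forward(self, input):
        # The computation run on every call: scale the input by 2, then apply ReLU.
        return fluid.layers.relu(fluid.layers.scale(input, scale=2.0))

with fluid.dygraph.guard():
    x = to_variable(np.array([[-1.0, 0.5]]).astype('float32'))
    layer = DoubleThenRelu('double_then_relu')
    y = layer(x)   # calling the instance invokes forward(x)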