fused_elemwise_activation
- paddle.fluid.contrib.layers.nn.fused_elemwise_activation(x, y, functor_list, axis=-1, scale=0.0, save_intermediate_out=True)
Fused elementwise_add/mul and activation layer. This function computes an elementwise_add/mul combined with an activation, in one of two fused forms:

\[out = Unary(Binary(x, y))\]

or

\[out = Binary(x, Unary(y))\]

Unary operators can be: scale, relu, tanh. Binary operators can be: elementwise_add, elementwise_mul.
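To make the two fused forms concrete, here is a reference sketch of the computation in plain NumPy; the relu helper and the array values are illustrative, not part of the API:

```python
import numpy as np

def relu(t):
    # Unary functor: element-wise max(t, 0).
    return np.maximum(t, 0.0)

x = np.array([[1.0, -2.0], [3.0, -4.0]])
y = np.array([[-1.0, 2.0], [-3.0, 4.0]])

# functor_list=['elementwise_add', 'relu']  ->  out = Binary(x, Unary(y))
out_add_relu = x + relu(y)   # [[1., 0.], [3., 0.]]

# functor_list=['relu', 'elementwise_add']  ->  out = Unary(Binary(x, y))
out_relu_add = relu(x + y)   # [[0., 0.], [0., 0.]]
```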
- Parameters
- x (Variable) – left operand of the binary operator.
- y (Variable) – right operand of the binary operator.
- functor_list (list of str) – the operators to execute in this layer, listed from outermost to innermost. For example, ['elementwise_add', 'relu'] computes out = elementwise_add(x, relu(y)), while ['relu', 'elementwise_add'] computes out = relu(elementwise_add(x, y)) (see the reference sketch above).
- axis (int32, default -1) – the axis argument passed to the underlying elementwise op, controlling how y is broadcast against x (see the broadcasting sketch after this list).
- scale (float32, default 0.0) – parameter of the scale op, used when scale is one of the functors.
- save_intermediate_out (bool, default True) – whether to save the intermediate result, i.e. Unary(y) or Binary(x, y).
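The axis argument is forwarded to the fused elementwise op and should follow the same broadcasting rule as fluid.layers.elementwise_add/elementwise_mul. A hedged sketch, assuming the Fluid 1.x static-graph API; the names, shapes, and the applicability of broadcasting to the fused kernel are assumptions for illustration:

```python
import paddle.fluid as fluid

# x has shape [N, 3, 8, 8]; y has shape [3].  axis=1 aligns y with
# dimension 1 of x, as in fluid.layers.elementwise_add.
x = fluid.layers.data(name='x4d', shape=[3, 8, 8], dtype='float32')
y = fluid.layers.data(name='y1d', shape=[3], dtype='float32',
                      append_batch_size=False)

# out = relu(elementwise_add(x, y)), with y broadcast along axis 1.
out = fluid.contrib.layers.fused_elemwise_activation(
    x, y, functor_list=['relu', 'elementwise_add'], axis=1)
```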
 
- Returns
           The computation result. 
- Return type
           Variable 
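- Examples

A minimal usage sketch, assuming the Fluid 1.x static-graph API; the variable names and shapes are illustrative:

```python
import paddle.fluid as fluid

x = fluid.layers.data(name='x', shape=[32], dtype='float32')
y = fluid.layers.data(name='y', shape=[32], dtype='float32')

# out = elementwise_add(x, relu(y)): the first functor in the list is
# the outer (binary) op, the second is the unary op applied to y.
out = fluid.contrib.layers.fused_elemwise_activation(
    x, y, functor_list=['elementwise_add', 'relu'])
```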
 
