glu
api_attr: declarative programming (static graph)
paddle.fluid.nets.glu(input, dim=-1)

The Gated Linear Units (GLU) layer is composed of split, sigmoid and elementwise_mul. Specifically, GLU splits the input into two equal-sized parts, \(a\) and \(b\), along the given dimension, and then computes:
\[\mathrm{GLU}(a, b) = a \otimes \sigma(b)\]

Refer to Language Modeling with Gated Convolutional Networks.
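The snippet below is a minimal sketch of this composition written with the individual fluid.layers ops (split, sigmoid, elementwise_mul). It is an illustrative equivalent built in static graph mode, not the internal implementation of this API, and it assumes the split dimension has an even size.

import paddle.fluid as fluid

x = fluid.data(name="x", shape=[-1, 6, 3, 9], dtype="float32")
# Split x into two equal halves a and b along dim=1 (size 6 -> 3 and 3).
a, b = fluid.layers.split(x, num_or_sections=2, dim=1)
# GLU(a, b) = a * sigmoid(b), applied elementwise.
out = fluid.layers.elementwise_mul(a, fluid.layers.sigmoid(b))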
- Parameters
input (Variable) – The input variable, which is a Tensor or LoDTensor. Supported data types are float32, float64 and float16 (float16 is supported on GPU only).
dim (int, optional) – The dimension along which to split. If \(dim < 0\), the dimension to split along is \(rank(input) + dim\). Default -1.
- Returns
Variable with half the size of the input along the split dimension and the same data type as the input.
- Return type
Variable
Examples
import paddle.fluid as fluid

data = fluid.data(
    name="words", shape=[-1, 6, 3, 9], dtype="float32")
# shape of output: [-1, 3, 3, 9]
output = fluid.nets.glu(input=data, dim=1)
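As a further illustration of the negative dim rule described above, the hypothetical snippet below (the input name and shape are assumptions for illustration) splits along the last dimension, which must have an even size.

import paddle.fluid as fluid

# Hypothetical input whose last dimension is even so it can be halved.
data = fluid.data(name="feat", shape=[-1, 6, 3, 8], dtype="float32")
# dim=-1 resolves to rank(input) + dim = 4 + (-1) = 3.
# shape of output: [-1, 6, 3, 4]
output = fluid.nets.glu(input=data, dim=-1)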