# gumbel_softmax

paddle.nn.functional.gumbel_softmax(x, temperature=1.0, hard=False, axis=-1, name=None) [source]

Samples from the Gumbel-Softmax distribution and optionally discretizes the result. The temperature is denoted by t. The calculation proceeds as follows:

First, generate Gumbel noise:

$G_i = -\log(-\log(U_i)), \quad U_i \sim U(0, 1)$

Second, add noise to x:

$v = [x_1 + G_1, \dots, x_n + G_n]$

Finally, calculate gumbel_softmax and generate samples:

$\mathrm{gumbel\_softmax}(v_i) = \frac{e^{v_i / t}}{\sum_{j=1}^{n} e^{v_j / t}}, \quad i = 1, 2, \dots, n$

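These steps can be reproduced with plain Paddle ops. The following is a minimal illustrative sketch, not the library's implementation; the shape, temperature value, and the small lower bound used to avoid log(0) are assumptions:

```python
import paddle

x = paddle.rand([4, 6])   # a batch of 4 rows of unnormalized log-probabilities
t = 0.5                   # temperature

# 1. Draw U ~ Uniform(0, 1) and form Gumbel noise G = -log(-log(U)).
u = paddle.uniform(x.shape, min=1e-10, max=1.0)   # lower bound avoids log(0)
g = -paddle.log(-paddle.log(u))

# 2. Add the noise to x.
v = x + g

# 3. Temperature-scaled softmax along the last axis.
y = paddle.nn.functional.softmax(v / t, axis=-1)  # soft Gumbel-Softmax sample
```
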
Parameters
• x (Tensor) – An N-D Tensor. The first N - 1 dimensions index into a batch of independent distributions and the last dimension represents a vector of probabilities, with datatype float32 or float64.

• temperature (float, optional) – non-negative scalar temperature. Default is 1.0.

• hard (bool, optional) – if True, the returned samples will be discretized as one-hot vectors, but will be differentiated in autograd as if they were the soft samples; see the sketch after this parameter list. Default is False.

• axis (int, optional) – The axis along which the softmax is calculated. Default is -1.

• name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
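When hard=True, the op returns one-hot samples in the forward pass while letting gradients flow through the soft samples (a straight-through estimator). The sketch below shows one common way to reproduce this behavior with standard Paddle ops, assuming axis=-1; the helper name straight_through_sample is illustrative and not part of the API:

```python
import paddle
import paddle.nn.functional as F

def straight_through_sample(x, temperature=1.0):
    # Soft, differentiable Gumbel-Softmax sample along the last axis.
    y_soft = F.gumbel_softmax(x, temperature=temperature, axis=-1)
    # Discretize to a one-hot vector along the last axis.
    y_hard = F.one_hot(y_soft.argmax(axis=-1), num_classes=y_soft.shape[-1])
    y_hard = y_hard.astype(y_soft.dtype)
    # Forward pass returns the one-hot sample; backward pass uses the soft gradient,
    # since argmax/one_hot contribute no gradient path of their own.
    return y_hard + y_soft - y_soft.detach()
```
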

Returns

A sampled tensor with the same shape as x, drawn from the Gumbel-Softmax distribution. If hard = True, the returned samples are one-hot; otherwise they are probability distributions that sum to 1 across axis.

Examples

import paddle

out = paddle.nn.functional.gumbel_softmax(paddle.rand([4, 6]))
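
A slightly fuller, self-contained usage sketch (the shapes and temperature are illustrative) comparing the soft and hard outputs:

```python
import paddle
import paddle.nn.functional as F

logits = paddle.rand([4, 6])

# Soft samples: each row is a probability distribution that sums to 1.
soft = F.gumbel_softmax(logits, temperature=0.5)
print(soft.sum(axis=-1))   # every entry is (numerically) 1.0

# Hard samples: each row is a one-hot vector, but gradients behave like the soft samples.
hard = F.gumbel_softmax(logits, temperature=0.5, hard=True)
print(hard)
```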