log_softmax

paddle.fluid.layers.log_softmax(input, axis=None, dtype=None, name=None)[source]

This operator implements the log_softmax layer. The calculation process is as follows:

\[Out[i, j] = \log\big(\mathrm{softmax}(X)[i, j]\big) = \log\frac{\exp(X[i, j])}{\sum_j \exp(X[i, j])}\]
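As a sanity check, the formula can be translated directly into NumPy (a reference sketch, independent of Paddle's implementation; subtracting the row maximum does not change the result but avoids overflow in `exp`):

```python
import numpy as np

def log_softmax(x, axis=-1):
    # log(exp(x) / sum_j exp(x)) along the given axis.
    # Subtracting the max first keeps exp() from overflowing.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

x = np.array([[1.0, 2.0, 3.0]])
out = log_softmax(x)
# exp(out) is a valid softmax: each row sums to 1.
```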
Parameters
  • input (Variable) – The input variable. A multi-dimensional Tensor with data type float32 or float64.

  • axis (int, optional) – The index of the dimension along which to perform the softmax calculation. It should be in the range \([-1, rank-1]\), where \(rank\) is the rank of the input variable. Default: None. Both None and -1 mean the last dimension.

  • dtype (np.dtype|core.VarDesc.VarType|str, optional) – The desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed, which is useful for preventing data type overflows. Supported dtypes: float32, float64. Default: None.

  • name (str, optional) – Normally there is no need for the user to set this property. For more information, please refer to Name. Default: None.

Returns

A Tensor holding the output of log_softmax. Its data type and shape are the same as input.

Return type

Variable

Examples

import paddle.fluid as fluid
import numpy as np

data = np.array([[[-2.0, 3.0, -4.0, 5.0],
                  [3.0, -4.0, 5.0, -6.0],
                  [-7.0, -8.0, 8.0, 9.0]],
                 [[1.0, -2.0, -3.0, 4.0],
                  [-5.0, 6.0, 7.0, -8.0],
                  [6.0, 7.0, 8.0, 9.0]]]).astype('float32')
with fluid.dygraph.guard():
    data = fluid.dygraph.to_variable(data)
    res = fluid.layers.log_softmax(data, -1)
    # [[[ -7.1278396   -2.1278396   -9.127839    -0.12783948]
    #   [ -2.1270514   -9.127051    -0.12705144 -11.127051  ]
    #   [-16.313261   -17.313261    -1.3132617   -0.31326184]]
    #  [[ -3.0518122   -6.051812    -7.051812    -0.051812  ]
    #   [-12.313267    -1.3132664   -0.3132665  -15.313267  ]
    #   [ -3.4401896   -2.4401896   -1.4401896   -0.44018966]]]
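The values in the comments above can be reproduced with plain NumPy (a sketch of the same math, not Paddle's implementation), for example for the first row of the input:

```python
import numpy as np

def ref_log_softmax(x, axis=-1):
    # Numerically stable log-softmax: subtract the max before exponentiating.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

row = np.array([-2.0, 3.0, -4.0, 5.0], dtype='float32')
print(ref_log_softmax(row))
# Agrees with the first row of the output above, up to float32 rounding.
```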