# cross_entropy

`paddle.fluid.layers.cross_entropy(input, label, soft_label=False, ignore_index=-100)`

This operator computes the cross entropy between input and label. It supports both hard-label and soft-label cross entropy computation.

1. Hard-label cross entropy: if soft_label=False, $label[i_1, i_2, ..., i_k]$ is the hard label of each sample.

$$output[i_1, i_2, ..., i_k] = -\log(input[i_1, i_2, ..., i_k, j]), \quad label[i_1, i_2, ..., i_k] = j, \; j \ne ignore\_index$$
2. Soft-label cross entropy: if soft_label=True, $label[i_1, i_2, ..., i_k, j]$ is the soft label of each sample corresponding to the j-th class.

$$output[i_1, i_2, ..., i_k] = -\sum_{j} label[i_1, i_2, ..., i_k, j] \log(input[i_1, i_2, ..., i_k, j])$$
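As a concrete illustration, the two formulas above can be evaluated directly in NumPy (a hand-rolled sketch for intuition, not Paddle's implementation):

```python
import numpy as np

# input: N=2 samples, D=3 classes of (already softmax-normalized) probabilities
probs = np.array([[0.1, 0.7, 0.2],
                  [0.3, 0.3, 0.4]])

# Hard label: take -log of the probability at the labeled class index.
hard_label = np.array([1, 2])                 # one int index per sample, in [0, D)
hard_ce = -np.log(probs[np.arange(2), hard_label])

# Soft label: weighted sum of -log probabilities over all classes.
soft_label = np.array([[0.0, 1.0, 0.0],
                       [0.0, 0.0, 1.0]])      # per-class weights, each row sums to 1
soft_ce = -(soft_label * np.log(probs)).sum(axis=1)

# With one-hot soft labels, the two definitions coincide.
```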
## Parameters
• input (Variable) – a multidimensional Tensor with shape $[N_1, N_2, ..., N_k, D]$, where the last dimension D is the class number. The data type should be float32 or float64.

• label (Variable) – label value corresponding to input. If soft_label=False, the shape of label should be $[N_1, N_2, ..., N_k]$ or $[N_1, N_2, ..., N_k, 1]$, its data type should be int64, and each value must lie in [0, D). If soft_label=True, the shape and data type of label should be the same as input, and the soft label values of each sample should sum to 1.

• soft_label (bool) – indicates whether label is a soft label. Default: False, meaning the label is hard. If soft_label=True, the label is soft.

• ignore_index (int) – specifies an ignorable label value. Samples carrying this label value are omitted from the computation. If it is a negative integer, no label is ignored. Only valid when soft_label=False. Default: -100.
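The effect of ignore_index can be sketched in NumPy (a minimal hand-rolled illustration assuming the default value of -100, not Paddle's implementation): ignored samples contribute zero loss.

```python
import numpy as np

probs = np.array([[0.2, 0.8],
                  [0.6, 0.4]])
labels = np.array([1, -100])   # second sample carries the ignore_index value

# Compute -log(prob of labeled class) only where the label is not ignored.
mask = labels != -100
ce = np.zeros(len(labels))
ce[mask] = -np.log(probs[np.arange(len(labels))[mask], labels[mask]])
```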

## Returns

A Variable holding a Tensor that represents the cross entropy, whose data type is the same as input. If soft_label=False, the shape of the output is the same as label. If soft_label=True, the shape of the output is $[N_1, N_2, ..., N_k, 1]$.

## Examples

```python
import paddle.fluid as fluid

class_num = 7
# Input features with a variable batch size.
x = fluid.data(name='x', shape=[None, 3, 10], dtype='float32')
# Hard labels: one int64 class index per sample, in [0, class_num).
label = fluid.data(name='label', shape=[None, 1], dtype='int64')
# Project to class probabilities with a softmax-activated FC layer.
predict = fluid.layers.fc(input=x, size=class_num, act='softmax')
cost = fluid.layers.cross_entropy(input=predict, label=label)
```