multi_label_soft_margin_loss
- paddle.nn.functional.multi_label_soft_margin_loss(input: Tensor, label: Tensor, weight: Tensor | None = None, reduction: _ReduceMode = 'mean', name: str | None = None) → Tensor [source]
Calculate the multi-label soft margin loss (a max-entropy, one-versus-all loss) between input x (a 2D mini-batch Tensor) and label y (a 2D Tensor of binary labels with the same shape as x). For each sample in the mini-batch:
loss(x, y) = −(1/C) · ∑ᵢ [ y[i] · log((1 + exp(−x[i]))⁻¹) + (1 − y[i]) · log( exp(−x[i]) / (1 + exp(−x[i])) ) ]

where i ∈ {0, ⋯, x.nElement() − 1} and y[i] ∈ {0, 1}.
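As a sanity check, the per-sample loss above can be reproduced term by term with NumPy. This is a hedged sketch of the formula only, not Paddle's implementation; the function name is illustrative:

```python
import numpy as np

def per_sample_loss(x, y):
    """Per-sample multi-label soft margin loss, following the formula above."""
    # y[i] * log((1 + exp(-x[i]))^-1), i.e. y[i] * log(sigmoid(x[i]))
    term_pos = y * np.log(1.0 / (1.0 + np.exp(-x)))
    # (1 - y[i]) * log(exp(-x[i]) / (1 + exp(-x[i]))), i.e. (1 - y[i]) * log(1 - sigmoid(x[i]))
    term_neg = (1 - y) * np.log(np.exp(-x) / (1.0 + np.exp(-x)))
    C = x.shape[-1]  # number of classes
    return -(term_pos + term_neg).sum(axis=-1) / C

x = np.array([[1.0, -2.0, 3.0]])
y = np.array([[0.0, 1.0, 0.0]])  # binary labels, per the formula's y[i] in {0, 1}
print(per_sample_loss(x, y))     # one loss value per mini-batch row
```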
- Parameters
input (Tensor) – Input tensor, the data type is float32 or float64. The shape is (N, C), where N is the batch size and C is the number of classes; for input of more than 2 dimensions, the shape is (N, C, D1, D2, …, Dk), k >= 1.
label (Tensor) – Label tensor, the data type is float32 or float64. The shape of label is the same as the shape of input.
weight (Tensor, optional) – A manual rescaling weight given to each class. If given, it has to be a Tensor of size C with data type float32 or float64. Default: None.
reduction (str, optional) – Indicates how to reduce the loss over the batch; the candidates are 'none' | 'mean' | 'sum'. If reduction is 'none', the unreduced loss is returned; if reduction is 'mean', the mean of the loss is returned; if reduction is 'sum', the summed loss is returned. Default: 'mean'.
name (str|None, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
- Shape:
input: N-D Tensor, the shape is [N, *], where N is the batch size and * means the number of classes; available dtypes are float32, float64. The sum operation operates over all the elements.
label: N-D Tensor, same shape as the input.
weight: N-D Tensor, the shape is [N, 1].
output: scalar. If reduction is 'none', then same shape as the input.
- Returns
Tensor. The tensor storing the multi_label_soft_margin_loss of input and label.
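The weight and reduction parameters can be tied together in a small NumPy sketch. The placement of the per-class weight (applied to each class term before averaging over C) is my assumption about the intended semantics, and the function name is illustrative:

```python
import numpy as np

def mlsm_loss(x, y, weight=None, reduction='mean'):
    """Hedged sketch of a multi-label soft margin loss with weight and reduction."""
    log_sig = -np.log1p(np.exp(-x))            # log(sigmoid(x)) = log((1 + exp(-x))^-1)
    log_one_minus = -x - np.log1p(np.exp(-x))  # log(1 - sigmoid(x)) = log(exp(-x) / (1 + exp(-x)))
    per_class = -(y * log_sig + (1 - y) * log_one_minus)
    if weight is not None:
        per_class = per_class * weight         # per-class rescaling (assumed placement)
    per_sample = per_class.mean(axis=-1)       # average over the C classes
    if reduction == 'none':
        return per_sample
    return per_sample.mean() if reduction == 'mean' else per_sample.sum()

x = np.array([[1.0, -2.0, 3.0], [0.0, -1.0, 2.0]])
y = np.array([[0.0, 1.0, 0.0], [1.0, 1.0, 1.0]])
print(mlsm_loss(x, y, reduction='none'))  # one value per sample
print(mlsm_loss(x, y))                    # scalar mean over the batch
```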
Examples
>>> import paddle
>>> import paddle.nn.functional as F
>>> input = paddle.to_tensor([[1, -2, 3], [0, -1, 2], [1, 0, 1]], dtype=paddle.float32)
>>> # label elements in {1., -1.}
>>> label = paddle.to_tensor([[-1, 1, -1], [1, 1, 1], [1, -1, 1]], dtype=paddle.float32)
>>> loss = F.multi_label_soft_margin_loss(input, label, reduction='none')
>>> print(loss)
Tensor(shape=[3], dtype=float32, place=Place(cpu), stop_gradient=True,
       [3.49625897, 0.71111226, 0.43989015])
>>> loss = F.multi_label_soft_margin_loss(input, label, reduction='mean')
>>> print(loss)
Tensor(shape=[], dtype=float32, place=Place(cpu), stop_gradient=True,
       1.54908717)