MultiLabelSoftMarginLoss
- class paddle.nn.MultiLabelSoftMarginLoss(weight=None, reduction='mean', name=None) [source]
- 
         Creates a criterion that measures a multi-label soft margin loss based on max-entropy between input \(x\) (a 2D mini-batch Tensor of shape \((N, C)\)) and label \(y\) (a Tensor of the same shape whose entries are 1 or -1). For each sample in the mini-batch: \[\text{loss}(x, y) = -\frac{1}{C}\sum_{i}\left(y[i]\log\frac{1}{1 + e^{-x[i]}} + (1 - y[i])\log\frac{e^{-x[i]}}{1 + e^{-x[i]}}\right)\]where \(C\) is the number of classes and \(i \in \left\{0, \; \cdots , \; C - 1\right\}\). \(x\) and \(y\) must have the same shape. - Parameters
- 
           weight (Tensor, optional) – a manual rescaling weight given to each class. If given, it has to be a Tensor of size C and the data type is float32 or float64. Default is None.
           reduction (str, optional) – indicates how to reduce the loss; the candidates are 'none' | 'mean' | 'sum'. If reduction is 'none', the per-sample loss is returned; if 'mean', the mean of the loss is returned; if 'sum', the summed loss is returned. Default is 'mean'.
           name (str, optional) – name for the operation. Default is None.
 - Call parameters:
- 
           input (Tensor): Input tensor, the data type is float32 or float64. The shape is (N, C), where C is the number of classes; if the input has more than 2 dimensions, the shape is (N, C, D1, D2, ..., Dk), k >= 1.
           label (Tensor): Label tensor containing 1 or -1, the data type is float32 or float64. The shape of label is the same as the shape of input.
- Shape:
- 
           input: N-D Tensor, the shape is [N, *], where N is batch size and * means number of classes; available dtypes are float32 and float64. The sum operation operates over all the elements.
           label: N-D Tensor, same shape as the input.
           output: scalar. If reduction is 'none', then same shape as the input.
 - Returns
- 
           A callable object of MultiLabelSoftMarginLoss. 
  Examples

            import paddle
            import paddle.nn as nn

            input = paddle.to_tensor([[1, -2, 3], [0, -1, 2], [1, 0, 1]], dtype=paddle.float32)
            label = paddle.to_tensor([[-1, 1, -1], [1, 1, 1], [1, -1, 1]], dtype=paddle.float32)

            multi_label_soft_margin_loss = nn.MultiLabelSoftMarginLoss(reduction='none')
            loss = multi_label_soft_margin_loss(input, label)
            print(loss)
            # Tensor([3.49625897, 0.71111226, 0.43989015])

            multi_label_soft_margin_loss = nn.MultiLabelSoftMarginLoss(reduction='mean')
            loss = multi_label_soft_margin_loss(input, label)
            print(loss)
            # Tensor([1.54908717])
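The example output above can be reproduced outside Paddle with plain NumPy, which makes the formula concrete. Note that `multilabel_soft_margin_loss` below is a hypothetical helper written for illustration, not part of the paddle API; it applies the per-element formula literally to the {-1, 1} labels and then averages over the class dimension.

```python
import numpy as np

def multilabel_soft_margin_loss(x, y, reduction='mean'):
    # Hypothetical reference implementation for illustration only.
    # Per-element term: -(y * log(sigmoid(x)) + (1 - y) * log(1 - sigmoid(x))),
    # applied literally to labels in {-1, 1}, then averaged over C classes.
    s = 1.0 / (1.0 + np.exp(-x))                        # sigmoid(input)
    per_elem = -(y * np.log(s) + (1 - y) * np.log(1 - s))
    per_sample = per_elem.mean(axis=-1)                  # average over classes
    if reduction == 'none':
        return per_sample
    if reduction == 'sum':
        return per_sample.sum()
    return per_sample.mean()                             # 'mean' (default)

x = np.array([[1, -2, 3], [0, -1, 2], [1, 0, 1]], dtype=np.float64)
y = np.array([[-1, 1, -1], [1, 1, 1], [1, -1, 1]], dtype=np.float64)
print(multilabel_soft_margin_loss(x, y, reduction='none'))
# ≈ [3.4963, 0.7111, 0.4399], matching the Paddle example above
print(multilabel_soft_margin_loss(x, y, reduction='mean'))
# ≈ 1.5491
```

Because the labels are -1 rather than 0 for negative classes, the \((1 - y[i])\) factor becomes 2 on negative entries, which is why the per-sample losses are larger than ordinary binary cross-entropy would give.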
            forward(input, label)
- 
            Defines the computation performed at every call. Should be overridden by all subclasses. - Parameters
-
              - *inputs (tuple) – unpacked tuple arguments
              - **kwargs (dict) – unpacked dict arguments
 
 
 
