soft_margin_loss

paddle.nn.functional.soft_margin_loss(input, label, reduction='mean', name=None)

The API measures the soft margin loss between the input predictions input and the target labels label. It can be described as:

\[Out = \log(1 + \exp(-label * input))\]
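For a single prediction and label, the formula can be checked directly; a minimal sketch (the values 0.5 and 1 here are illustrative, not part of the API):

>>> import math
>>> # Elementwise loss from the formula above for input = 0.5, label = 1:
>>> manual = math.log(1 + math.exp(-1 * 0.5))
>>> print(round(manual, 6))
0.474077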
Parameters
  • input (Tensor) – The input predictions tensor with shape [N, *], where N is the batch size and * means any number of additional dimensions. The input values range from -inf to inf. Available dtypes are float32 and float64.

  • label (Tensor) – The target labels tensor with the same shape as input. Its values should be -1 or 1. Available dtypes are int32, int64, float32 and float64.

  • reduction (str, optional) – Indicates how to reduce the loss; the candidates are 'none' | 'mean' | 'sum'. If reduction is 'none', the unreduced elementwise loss is returned; if reduction is 'mean', the mean of the elementwise loss is returned; if reduction is 'sum', the summed loss is returned. Default is 'mean'. A short sketch comparing the three modes follows this list.

  • name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name.
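As referenced in the reduction entry above, a minimal sketch comparing the three reduction modes on the same inputs (the tensor values here are illustrative):

>>> import paddle
>>> x = paddle.to_tensor([[0.5, 0.6], [0.3, 0.2]], 'float32')
>>> y = paddle.to_tensor([[1.0, -1.0], [-1.0, 1.0]], 'float32')
>>> none_out = paddle.nn.functional.soft_margin_loss(x, y, reduction='none')  # shape [2, 2]
>>> mean_out = paddle.nn.functional.soft_margin_loss(x, y, reduction='mean')  # scalar: none_out.mean()
>>> sum_out = paddle.nn.functional.soft_margin_loss(x, y, reduction='sum')    # scalar: none_out.sum()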

Returns

If reduction is 'none', the shape of the output is the same as input; otherwise the output is a scalar tensor with shape [].

Return type

Tensor

Examples

>>> import paddle
>>> paddle.seed(2023)

>>> input = paddle.to_tensor([[0.5, 0.6, 0.7],[0.3, 0.5, 0.2]], 'float32')
>>> label = paddle.to_tensor([[1.0, -1.0, 1.0],[-1.0, 1.0, 1.0]], 'float32')
>>> output = paddle.nn.functional.soft_margin_loss(input, label)
>>> print(output)
Tensor(shape=[], dtype=float32, place=Place(cpu), stop_gradient=True,
       0.64022040)

>>> input = paddle.uniform(shape=(5, 5), dtype="float32", min=0.1, max=0.8)
>>> label = paddle.randint(0, 2, shape=(5, 5), dtype="int64")
>>> label[label==0] = -1

>>> output = paddle.nn.functional.soft_margin_loss(input, label, reduction='none')
>>> print(output)
Tensor(shape=[5, 5], dtype=float32, place=Place(cpu), stop_gradient=True,
       [[1.10725629, 0.48778144, 0.56217247, 1.12581408, 0.51430041],
        [0.90375793, 0.37761253, 0.43007556, 0.95089805, 0.43288314],
        [1.16043591, 0.63015938, 0.51362717, 0.43617544, 0.57783306],
        [0.81927848, 0.52558368, 0.59713912, 0.83100700, 0.50811619],
        [0.82684207, 1.02064908, 0.50296998, 1.13461733, 0.93222517]])
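
The elementwise result above can also be reproduced directly from the formula; a minimal check built from paddle.log and paddle.exp (this mirrors the mathematical definition, not necessarily the internal kernel):

>>> # label is int64 here, so cast it before the elementwise arithmetic.
>>> manual = paddle.log(1 + paddle.exp(-label.astype('float32') * input))
>>> # manual matches the reduction='none' output above elementwise.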