soft_margin_loss
paddle.nn.functional.soft_margin_loss(input, label, reduction='mean', name=None) [source]
The API measures the soft margin loss between the input predictions input and the target labels label. It can be described as:

\[Out = \log(1 + \exp(-label * input))\]
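For example, a single prediction input = 0.5 scored against label = 1 gives log(1 + exp(-0.5)) ≈ 0.4741, while the same prediction scored against label = -1 gives log(1 + exp(0.5)) ≈ 0.9741; predictions whose sign disagrees with the label therefore incur a larger loss.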
Parameters
- input (Tensor) – The input predictions tensor with shape [N, *], where N is the batch size and * means any number of additional dimensions. The values of input range from -inf to inf. Available dtypes are float32 and float64.
- label (Tensor) – The target labels tensor with the same shape as input. The label values should be -1 or 1. Available dtypes are int32, int64, float32 and float64.
- reduction (str, optional) – Indicates how to reduce the loss; the candidates are 'none' | 'mean' | 'sum'. If reduction is 'none', the unreduced loss is returned; if reduction is 'mean', the mean of the loss is returned; if reduction is 'sum', the summed loss is returned. Default is 'mean'.
- name (str, optional) – Name for the operation (optional, default is None). For more information, please refer to Name. 
 
Returns

If reduction is 'none', the shape of the output is the same as input; otherwise the shape of the output is [1].
Return type

Output (Tensor)
Examples

import paddle

input = paddle.to_tensor([[0.5, 0.6, 0.7], [0.3, 0.5, 0.2]], 'float32')
label = paddle.to_tensor([[1.0, -1.0, 1.0], [-1.0, 1.0, 1.0]], 'float32')
output = paddle.nn.functional.soft_margin_loss(input, label)
print(output)
# Tensor(shape=[1], dtype=float32, place=Place(gpu:0), stop_gradient=True,
#        [0.64022040])

input = paddle.uniform(shape=(5, 5), dtype="float32", min=0.1, max=0.8)
label = paddle.randint(0, 2, shape=(5, 5), dtype="int64")
label[label == 0] = -1
output = paddle.nn.functional.soft_margin_loss(input, label, reduction='none')
print(output)
# Tensor(shape=[5, 5], dtype=float32, place=Place(gpu:0), stop_gradient=True,
#        [[1.09917796, 0.52613139, 0.56263304, 0.82736146, 0.38776723],
#         [1.07179427, 1.11924267, 0.49877715, 1.10026348, 0.46184641],
#         [0.84367639, 0.74795729, 0.44629076, 0.55123353, 0.77659678],
#         [0.39465919, 0.76651484, 0.54485321, 0.76609844, 0.77166790],
#         [0.51283568, 0.84757161, 0.78913331, 1.05268764, 0.45318675]])
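As a sanity check, the loss can also be evaluated directly from the formula above. The minimal sketch below (assuming the standard paddle.exp, paddle.log and paddle.allclose APIs) compares the element-wise computation with the output of soft_margin_loss under the default reduction='mean'.

import paddle
import paddle.nn.functional as F

input = paddle.to_tensor([[0.5, 0.6, 0.7], [0.3, 0.5, 0.2]], 'float32')
label = paddle.to_tensor([[1.0, -1.0, 1.0], [-1.0, 1.0, 1.0]], 'float32')

# Element-wise formula Out = log(1 + exp(-label * input)), followed by mean reduction.
manual = paddle.log(1 + paddle.exp(-label * input)).mean()
api = F.soft_margin_loss(input, label)  # reduction='mean' by default

# The two values should agree up to floating-point rounding (about 0.6402).
print(paddle.allclose(manual, api))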
