cosine_embedding_loss¶
paddle.nn.functional.cosine_embedding_loss(input1, input2, label, margin=0, reduction='mean', name=None) [source]
This operator computes the cosine embedding loss of Tensor input1, input2 and label as follows.

If label = 1, then the loss value can be calculated as follows:

\[Out = 1 - cos(input1, input2)\]

If label = -1, then the loss value can be calculated as follows:

\[Out = max(0, cos(input1, input2) - margin)\]

The operator cos can be described as follows:
\[cos(x1, x2) = \frac{x1 \cdot{} x2}{\Vert x1 \Vert_2 * \Vert x2 \Vert_2}\]
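To make the two branches concrete, here is a minimal sketch (an illustration only, not the library's implementation; the helper name manual_cosine_embedding_loss is invented here) that reproduces the per-sample losses with basic tensor ops, assuming 2-D inputs of shape [N, M] and an integer label of shape [N]:

import paddle

def manual_cosine_embedding_loss(x1, x2, label, margin=0.0):
    # cosine similarity along the last (feature) axis
    cos = paddle.sum(x1 * x2, axis=-1) / (
        paddle.linalg.norm(x1, p=2, axis=-1) * paddle.linalg.norm(x2, p=2, axis=-1))
    pos = 1.0 - cos                            # branch used where label == 1
    neg = paddle.clip(cos - margin, min=0.0)   # branch used where label == -1
    return paddle.where(label == 1, pos, neg)  # per-sample losses (reduction='none')

Applying .mean() or .sum() to the returned tensor corresponds to the 'mean' and 'sum' reduction modes described below.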
Parameters:
- input1 (Tensor): Tensor with shape [N, M] or [M], where 'N' is the batch size and 'M' is the length of the input array (a shape illustration follows this list). Available dtypes are float32, float64.
- input2 (Tensor): Tensor with shape [N, M] or [M], where 'N' is the batch size and 'M' is the length of the input array. Available dtypes are float32, float64.
- label (Tensor): Tensor with shape [N] or [1]. The target label values should be -1 or 1. Available dtypes are int32, int64, float32, float64.
- margin (float, optional): Should be a number from \(-1\) to \(1\); \(0\) to \(0.5\) is suggested. If margin is missing, the default value is \(0\).
- reduction (str, optional): Specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'. 'none': no reduction will be applied; 'mean': the sum of the output will be divided by the number of elements in the output; 'sum': the output will be summed. Default is 'mean'.
- name (str, optional): Name for the operation (optional, default is None). For more information, please refer to Name.
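As an illustration of the [M] / [1] shape variant (a hypothetical snippet, not taken from the official examples), a single pair of vectors can be compared directly:

import paddle

x1 = paddle.to_tensor([1.6, 1.2, -0.5], 'float32')  # shape [M]
x2 = paddle.to_tensor([0.5, 0.5, -1.8], 'float32')  # shape [M]
label = paddle.to_tensor([1], 'int64')               # shape [1], values must be -1 or 1
loss = paddle.nn.functional.cosine_embedding_loss(x1, x2, label, margin=0.0, reduction='mean')
# With label = 1 and margin = 0, the result is 1 - cos(x1, x2).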
Returns:
- Tensor, the cosine embedding loss of Tensor input1, input2 and label. If reduction is 'none', the shape of the output loss is [N], the same as label. If reduction is 'mean' or 'sum', the shape of the output loss is [1].
Examples
import paddle

input1 = paddle.to_tensor([[1.6, 1.2, -0.5], [3.2, 2.6, -5.8]], 'float32')
input2 = paddle.to_tensor([[0.5, 0.5, -1.8], [2.3, -1.4, 1.1]], 'float32')
label = paddle.to_tensor([1, -1], 'int64')

output = paddle.nn.functional.cosine_embedding_loss(input1, input2, label, margin=0.5, reduction='mean')
print(output)  # [0.21155193]

output = paddle.nn.functional.cosine_embedding_loss(input1, input2, label, margin=0.5, reduction='sum')
print(output)  # [0.42310387]

output = paddle.nn.functional.cosine_embedding_loss(input1, input2, label, margin=0.5, reduction='none')
print(output)  # [0.42310387, 0.        ]
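As a quick sanity check of these values: for the first pair, \(cos = 2.3 / (\sqrt{4.25} \cdot \sqrt{3.74}) \approx 0.5769\), so its loss is \(1 - 0.5769 \approx 0.4231\); for the second pair the cosine is negative, so \(max(0, cos - 0.5) = 0\). Hence 'none' gives [0.4231, 0.], 'sum' gives 0.4231, and 'mean' gives \(0.4231 / 2 \approx 0.2116\).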