kl_divergence

paddle.distribution.kl_divergence(p, q) [source]

Computes the Kullback-Leibler divergence between distributions p and q.

\[KL(p||q) = \int p(x) \log\frac{p(x)}{q(x)} \,\mathrm{d}x\]
Parameters
  • p (Distribution) – Distribution object.

  • q (Distribution) – Distribution object.

Returns

Batchwise KL divergence between distributions p and q.

Return type

Tensor

Examples

import paddle

p = paddle.distribution.Beta(alpha=0.5, beta=0.5)
q = paddle.distribution.Beta(alpha=0.3, beta=0.7)

print(paddle.distribution.kl_divergence(p, q))
# Tensor(shape=[1], dtype=float32, place=CUDAPlace(0), stop_gradient=True,
#        [0.21193528])
# (the printed place is CPUPlace(0) on a CPU-only build)
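The value above can be verified independently: for two Beta distributions the KL divergence has a closed form in terms of the log-Beta function and the digamma function. The sketch below is a pure-Python cross-check that does not require paddle; the `digamma` helper (recurrence plus asymptotic series) is an illustrative implementation, not part of the paddle API.

```python
import math

def digamma(x):
    """Digamma via the recurrence psi(x) = psi(x+1) - 1/x, then an
    asymptotic series once x >= 6. Accurate to ~1e-7 for x > 0."""
    result = 0.0
    while x < 6.0:
        result -= 1.0 / x
        x += 1.0
    inv2 = 1.0 / (x * x)
    # psi(x) ~ ln x - 1/(2x) - 1/(12x^2) + 1/(120x^4) - 1/(252x^6)
    result += math.log(x) - 0.5 / x - inv2 * (1.0 / 12 - inv2 * (1.0 / 120 - inv2 / 252))
    return result

def log_beta(a, b):
    """log B(a, b) via lgamma."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def kl_beta(a1, b1, a2, b2):
    """Closed-form KL(Beta(a1, b1) || Beta(a2, b2))."""
    return (log_beta(a2, b2) - log_beta(a1, b1)
            + (a1 - a2) * digamma(a1)
            + (b1 - b2) * digamma(b1)
            + (a2 - a1 + b2 - b1) * digamma(a1 + b1))

print(kl_beta(0.5, 0.5, 0.3, 0.7))  # ~0.2119, matching the Tensor value above
```

Note that KL divergence is not symmetric: `kl_beta(0.3, 0.7, 0.5, 0.5)` generally differs from `kl_beta(0.5, 0.5, 0.3, 0.7)`.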