kl_divergence

paddle.distribution.kl_divergence(p, q) [source]

Kullback-Leibler divergence between distributions p and q.

\[KL(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, \mathrm{d}x\]
Parameters
  • p (Distribution) – Distribution object; an instance of a subclass of the Distribution base class.

  • q (Distribution) – Distribution object; an instance of a subclass of the Distribution base class.

Returns

Tensor, batchwise KL divergence between distributions p and q.

Examples

>>> import paddle

>>> p = paddle.distribution.Beta(alpha=0.5, beta=0.5)
>>> q = paddle.distribution.Beta(alpha=0.3, beta=0.7)

>>> print(paddle.distribution.kl_divergence(p, q))
Tensor(shape=[], dtype=float32, place=Place(cpu), stop_gradient=True,
       0.21193528)
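
The divergence is computed batchwise: when p and q carry batched parameters, the result has the distributions' batch shape. The following is a minimal sketch, not taken from the original docs, assuming Beta accepts tensor-valued alpha and beta; only the output shape is shown.

>>> import paddle

>>> # Batched parameters (assumed tensor-valued alpha/beta) yield an
>>> # elementwise KL divergence with the same batch shape.
>>> p = paddle.distribution.Beta(alpha=paddle.to_tensor([0.5, 2.0]),
...                              beta=paddle.to_tensor([0.5, 3.0]))
>>> q = paddle.distribution.Beta(alpha=paddle.to_tensor([0.3, 1.0]),
...                              beta=paddle.to_tensor([0.7, 1.0]))
>>> kl = paddle.distribution.kl_divergence(p, q)
>>> print(kl.shape)
[2]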