kl_divergence
- paddle.distribution.kl_divergence(p, q) [source]
         Kullback-Leibler divergence between distributions p and q.

         \[KL(p||q) = \int p(x) \log \frac{p(x)}{q(x)} \, \mathrm{d}x\]

         A Monte Carlo check of this integral is sketched after the examples below.

- Parameters
- 
            - p (Distribution) – Distribution object. Inherits from the Distribution base class.
            - q (Distribution) – Distribution object. Inherits from the Distribution base class.
 
- Returns
- 
           Tensor, batchwise KL divergence between distributions p and q.
 Examples

    import paddle

    p = paddle.distribution.Beta(alpha=0.5, beta=0.5)
    q = paddle.distribution.Beta(alpha=0.3, beta=0.7)

    print(paddle.distribution.kl_divergence(p, q))
    # Tensor(shape=[1], dtype=float32, place=CUDAPlace(0), stop_gradient=True,
    #        [0.21193528])
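 The following is a minimal sketch that approximates the integral above by Monte Carlo: it draws samples from p, averages log p(x) - log q(x), and compares the estimate with the value returned by kl_divergence. It assumes Beta exposes sample() and log_prob() through the Distribution base API; the sample count and seed are arbitrary choices, not part of the documented interface.

    import paddle

    paddle.seed(0)

    p = paddle.distribution.Beta(alpha=0.5, beta=0.5)
    q = paddle.distribution.Beta(alpha=0.3, beta=0.7)

    # KL(p||q) = E_{x ~ p}[log p(x) - log q(x)], estimated from samples of p.
    # Assumes sample() and log_prob() are available on Beta (Distribution base API).
    x = p.sample([100000])
    mc_kl = (p.log_prob(x) - q.log_prob(x)).mean()

    print(float(mc_kl))                                     # Monte Carlo estimate, ~0.21
    print(float(paddle.distribution.kl_divergence(p, q)))   # analytic value, ~0.2119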
