[ Only the API call style differs ] torch.distributions.kl.kl_divergence
torch.distributions.kl.kl_divergence
torch.distributions.kl.kl_divergence(p, q)

paddle.distribution.kl_divergence
paddle.distribution.kl_divergence(p, q)
Conversion example
# PyTorch style (m and n are distribution objects, e.g. torch.distributions.Normal)
result = torch.distributions.kl.kl_divergence(m, n)

# Paddle style (m and n are the corresponding paddle.distribution objects)
result = paddle.distribution.kl_divergence(m, n)
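Both calls compute the KL divergence KL(m ‖ n) between two registered distributions. As a dependency-free illustration of the quantity being computed (the standard closed-form result for two univariate normals, not either framework's implementation), a minimal sketch:

```python
import math

def kl_normal(mu1, sigma1, mu2, sigma2):
    # Closed-form KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) ):
    #   log(sigma2/sigma1) + (sigma1^2 + (mu1 - mu2)^2) / (2 * sigma2^2) - 1/2
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# KL of a distribution against itself is zero
print(kl_normal(0.0, 1.0, 0.0, 1.0))  # → 0.0

# KL( N(0, 1) || N(1, 4) )
print(round(kl_normal(0.0, 1.0, 1.0, 2.0), 4))  # → 0.4431
```

Calling either framework's `kl_divergence` on two `Normal` objects with these parameters should agree with this formula up to floating-point precision.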