Uniform

class paddle.fluid.layers.distributions.Uniform(low, high) [source]

Uniform distribution with low and high parameters.

Mathematical Details

The probability density function (pdf) is,

\[pdf(x; a, b) = \frac{1}{Z}, \quad a \le x < b\]
\[Z = b - a\]

In the above equation:

  • \(low = a\),

  • \(high = b\),

  • \(Z\) is the normalizing constant.
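
As a quick numerical check of the pdf above (a plain NumPy sketch, not Paddle code): for \(a = 0\), \(b = 2\), the density is the constant \(1/Z = 0.5\) everywhere inside the support and 0 outside.

```python
import numpy as np

a, b = 0.0, 2.0                  # low and high
Z = b - a                        # normalizing constant
x = np.array([0.5, 1.0, 1.9])   # points inside [a, b)

# pdf(x) = 1/Z inside the support, 0 outside
pdf = np.where((x >= a) & (x < b), 1.0 / Z, 0.0)
print(pdf)  # every in-support point has density 1/Z = 0.5
```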

The parameters low and high must be shaped in a way that supports broadcasting (e.g., high - low is a valid operation).
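
The broadcasting requirement can be checked with plain NumPy (a sketch; the shapes follow the same NumPy-style rules):

```python
import numpy as np

low = np.float32(3.0)                                # scalar low
high = np.array([5.0, 6.0, 7.0], dtype=np.float32)   # vector high

# high - low must be a valid broadcast; its shape is the batch
# shape of the distribution (here, 3 independent uniforms).
batch = high - low
print(batch.shape)  # (3,)
```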

Parameters
  • low (float|list|numpy.ndarray|Variable) – The lower boundary of the uniform distribution. The data type is float32.

  • high (float|list|numpy.ndarray|Variable) – The higher boundary of the uniform distribution. The data type is float32.

Examples

import numpy as np
from paddle.fluid import layers
from paddle.fluid.layers import Uniform

# Without broadcasting, a single uniform distribution [3, 4]:
u1 = Uniform(low=3.0, high=4.0)
# 2 distributions [1, 3], [2, 4]
u2 = Uniform(low=[1.0, 2.0],
             high=[3.0, 4.0])
# 4 distributions
u3 = Uniform(low=[[1.0, 2.0],
                  [3.0, 4.0]],
             high=[[1.5, 2.5],
                   [3.5, 4.5]])

# With broadcasting:
u4 = Uniform(low=3.0, high=[5.0, 6.0, 7.0])

# Complete example
value_npdata = np.array([0.8], dtype="float32")
value_tensor = layers.create_tensor(dtype="float32")
layers.assign(value_npdata, value_tensor)

uniform = Uniform([0.], [2.])

sample = uniform.sample([2])
# a random tensor drawn from the uniform distribution, with shape: [2, 1]
entropy = uniform.entropy()
# [0.6931472] with shape: [1]
lp = uniform.log_prob(value_tensor)
# [-0.6931472] with shape: [1]
sample(shape, seed=0)

Generate samples of the specified shape.

Parameters
  • shape (list) – 1D int32. Shape of the generated samples.

  • seed (int) – Python integer seed for the random number generator.

Returns

A tensor with the prepended dimensions shape. The data type is float32.

Return type

Variable
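
The "prepended dimensions" behaviour can be illustrated in NumPy (a sketch of the semantics, not the Paddle implementation): sampling with shape=[2] from a distribution whose parameters have shape [1] yields a [2, 1] tensor, matching the complete example above.

```python
import numpy as np

low = np.array([0.0], dtype=np.float32)   # parameter shape: [1]
high = np.array([2.0], dtype=np.float32)

shape = [2]                               # requested sample shape
# samples have shape = shape + parameter shape
samples = np.random.uniform(low, high, size=shape + list(low.shape))
print(samples.shape)  # (2, 1)
```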

log_prob(value)

Log probability density/mass function.

Parameters

value (Variable) – The input tensor.

Returns

The log probability. The data type is the same as that of value.

Return type

Variable
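
Inside the support, the log probability is simply \(-\log(b - a)\); a NumPy reference computation (a sketch, not the Paddle implementation) reproduces the value from the example above:

```python
import numpy as np

low, high = 0.0, 2.0
value = np.array([0.8], dtype=np.float32)

# log pdf(x) = log(1 / (b - a)) = -log(b - a) for x inside [low, high)
lp = -np.log(high - low) * np.ones_like(value)
print(lp)  # [-0.6931472], matching uniform.log_prob in the example
```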

entropy()

Shannon entropy in nats.

Returns

Shannon entropy of the uniform distribution. The data type is float32.

Return type

Variable
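
The entropy of \(U(a, b)\) is \(\log(b - a)\) nats; a one-line NumPy check (a sketch outside Paddle) matches the output shown in the example above:

```python
import numpy as np

low, high = 0.0, 2.0
# Shannon entropy of U(low, high) in nats: log(high - low)
entropy = np.log(high - low)
print(entropy)  # 0.6931472, i.e. log(2)
```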

kl_divergence(other)

The KL-divergence between this distribution and other.
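
For two uniforms where the support of self is contained in the support of other, the standard closed form is \(KL(U(a,b)\,\|\,U(c,d)) = \log\frac{d-c}{b-a}\); outside that containment the divergence is infinite. This is a textbook result rather than a statement about Paddle's implementation, sketched here in NumPy:

```python
import numpy as np

a, b = 0.0, 2.0    # self:  U(a, b)
c, d = -1.0, 3.0   # other: U(c, d)

# KL(U(a,b) || U(c,d)) = log((d - c) / (b - a)) when [a, b] is
# contained in [c, d]; otherwise the divergence is infinite.
assert c <= a and b <= d
kl = np.log((d - c) / (b - a))
print(kl)  # log(4 / 2) = log(2) ~= 0.6931
```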