Continuous uniform distribution

The characteristic function of the distribution is

\varphi(t) = \frac{\mathrm{e}^{itb}-\mathrm{e}^{ita}}{it(b-a)}

In probability theory and statistics, the continuous uniform distribution or rectangular distribution is a family of symmetric probability distributions such that for each member of the family, all intervals of the same length on the distribution's support are equally probable. The support is defined by the two parameters, a and b, which are its minimum and maximum values. The distribution is often abbreviated U(a,b). It is the maximum entropy probability distribution for a random variate X under no constraint other than that it is contained in the distribution's support.[1]


Probability density function

The probability density function of the continuous uniform distribution is:

 f(x) = \begin{cases}
 \frac{1}{b - a} & \text{for } a \le x \le b, \\[8pt]
 0 & \text{for } x < a \text{ or } x > b
 \end{cases}

The values of f(x) at the two boundaries a and b are usually unimportant, because they do not alter the value of the integral of f(x) dx over any interval, nor of x f(x) dx or any higher moment. Sometimes they are chosen to be zero, and sometimes chosen to be 1/(b − a). The latter is appropriate in the context of estimation by the method of maximum likelihood. In the context of Fourier analysis, one may take the value of f(a) or f(b) to be 1/(2(b − a)), since then the inverse transform of many integral transforms of this uniform function will yield back the function itself, rather than a function which is equal "almost everywhere", i.e. except on a set of points with zero measure. This convention is also consistent with the sign function, which has no such ambiguity.

In terms of mean μ and variance σ2, the probability density may be written as:

f(x) = \begin{cases}
 \frac{1}{2 \sigma \sqrt{3}} & \text{for } -\sigma\sqrt{3} \le x-\mu \le \sigma\sqrt{3} \\[8pt]
 0 & \text{otherwise}
\end{cases}

Cumulative distribution function

The cumulative distribution function is:

 F(x)= \begin{cases}
 0 & \text{for }x < a \\[8pt]
 \frac{x-a}{b-a} & \text{for }a \le x < b \\[8pt]
 1 & \text{for }x \ge b
 \end{cases}

Its inverse is:

F^{-1}(p) = a + p (b - a) \,\,\text{ for } 0 \le p \le 1

In mean and variance notation, the cumulative distribution function is:

F(x)= \begin{cases}
 0 & \text{for } x-\mu < -\sigma\sqrt{3} \\[8pt]
 \frac{1}{2} \left( \frac{x-\mu}{\sigma \sqrt{3}} + 1 \right) & \text{for } -\sigma\sqrt{3} \le x-\mu < \sigma\sqrt{3} \\[8pt]
 1 & \text{for } x-\mu \ge \sigma\sqrt{3}
\end{cases}

and the inverse is:

F^{-1}(p) = \sigma\sqrt{3}(2p-1) +\mu\,\, \text{ for }0 \le p \le 1

Generating functions

Moment-generating function

The moment-generating function is:[2]

M_X(t) = E(e^{tX}) = \frac{e^{tb}-e^{ta}}{t(b-a)} \quad \text{for } t \neq 0, \,\!

from which we may calculate the raw moments m_k:

m_1=\frac{a+b}{2}, \,\!
m_2=\frac{a^2+ab+b^2}{3}, \,\!
m_k=\frac{1}{k+1}\sum_{i=0}^k a^ib^{k-i}. \,\!

For a random variable following this distribution, the expected value is then m1 = (a + b)/2 and the variance is m2 − m12 = (b − a)2/12.

Cumulant-generating function

For n ≥ 2, the nth cumulant of the uniform distribution on the interval [0, 1] is B_n/n, where B_n is the nth Bernoulli number.


Moments and parameters

The first two moments of the distribution are:

E(X) = \frac{a+b}{2}, \qquad V(X) = \frac{(b-a)^2}{12}.

Solving these two equations for parameters a and b, given known moments E(X) and V(X), yields:

a = E(X) - \sqrt{3 V(X)}, \qquad b = E(X) + \sqrt{3 V(X)}.
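This method-of-moments recovery of a and b from the mean and variance can be sketched in Python (a minimal illustration; the function name is ours):

```python
import math

def uniform_params_from_moments(mean, var):
    """Recover (a, b) of U(a, b) from its mean E(X) and variance V(X).

    Inverts E(X) = (a + b)/2 and V(X) = (b - a)^2 / 12, so that
    (b - a)/2 = sqrt(3 V(X)).
    """
    half_width = math.sqrt(3.0 * var)
    return mean - half_width, mean + half_width

# U(2, 10) has mean 6 and variance (10 - 2)^2 / 12 = 16/3
a, b = uniform_params_from_moments(6.0, 16.0 / 3.0)
```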
Order statistics

Let X1, ..., Xn be an i.i.d. sample from U(0,1). Let X(k) be the kth order statistic from this sample. Then the probability distribution of X(k) is a Beta distribution with parameters k and n − k + 1. The expected value is

\operatorname{E}(X_{(k)}) = {k \over n+1}.

This fact is useful when making Q-Q plots.

The variances are

\operatorname{Var}(X_{(k)}) = {k (n-k+1) \over (n+1)^2 (n+2)} .
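The formula for the expected value of the order statistic can be checked empirically; a small Monte Carlo sketch (function name and parameters are illustrative):

```python
import random

def kth_order_statistic_mean(n, k, trials=20000, seed=1):
    """Monte Carlo estimate of E(X_(k)) for samples of size n from U(0,1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = sorted(rng.random() for _ in range(n))
        total += sample[k - 1]          # k-th smallest, 1-indexed
    return total / trials

# Theory gives E(X_(k)) = k / (n + 1); e.g. n = 4, k = 2 yields 2/5 = 0.4
est = kth_order_statistic_mean(4, 2)
```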


The probability that a uniformly distributed random variable falls within any interval of fixed length is independent of the location of the interval itself (but it is dependent on the interval size), so long as the interval is contained in the distribution's support.

To see this, if X ~ U(a,b) and [x, x+d] is a subinterval of [a,b] with fixed d > 0, then

 P\left(X\in\left [ x,x+d \right ]\right) 
 = \int_{x}^{x+d} \frac{\mathrm{d}y}{b-a}\,
 = \frac{d}{b-a} \,\!

which is independent of x. This fact motivates the distribution's name.

Generalization to Borel sets

This distribution can be generalized to more complicated sets than intervals. If S is a Borel set of positive, finite measure, the uniform probability distribution on S can be specified by defining the pdf to be zero outside S and constantly equal to 1/K on S, where K is the Lebesgue measure of S.

Standard uniform

Setting a = 0 and b = 1, the resulting distribution U(0,1) is called the standard uniform distribution.

One interesting property of the standard uniform distribution is that if u1 has a standard uniform distribution, then so does 1-u1. This property can be used for generating antithetic variates, among other things.
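A minimal sketch of antithetic sampling built on this property (the names are illustrative): each pair (u, 1 − u) consists of two standard uniform values that are negatively correlated, which can reduce Monte Carlo variance.

```python
import random

def antithetic_pairs(n, seed=0):
    """Return n pairs (u, 1 - u); both coordinates are standard uniform."""
    rng = random.Random(seed)
    return [(u, 1.0 - u) for u in (rng.random() for _ in range(n))]

# Estimate E(u^2) = 1/3 by averaging f(u) and f(1 - u) within each pair
pairs = antithetic_pairs(10000)
est = sum((u * u + v * v) / 2.0 for u, v in pairs) / len(pairs)
```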

Related distributions

Relationship to other functions

As long as the same conventions are followed at the transition points, the probability density function may also be expressed in terms of the Heaviside step function:

f(x)=\frac{\operatorname{H}(x-a)-\operatorname{H}(x-b)}{b-a}, \,\!

or in terms of the rectangle function

f(x)=\frac{1}{b-a}\,\operatorname{rect}\left(\frac{x-\left(\frac{a+b}{2}\right)}{b-a}\right) .

There is no ambiguity at the transition point of the sign function. Using the half-maximum convention at the transition points, the uniform distribution may be expressed in terms of the sign function as:

f(x)=\frac{ \sgn{(x-a)}-\sgn{(x-b)}} {2(b-a)}.


In statistics, when a p-value is used as a test statistic for a simple null hypothesis, and the distribution of the test statistic is continuous, then the p-value is uniformly distributed between 0 and 1 if the null hypothesis is true.

Sampling from a uniform distribution

There are many applications in which it is useful to run simulation experiments. Many programming languages have the ability to generate pseudo-random numbers which are effectively distributed according to the standard uniform distribution.

If u is a value sampled from the standard uniform distribution, then the value a + (ba)u follows the uniform distribution parametrised by a and b, as described above.
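This rescaling can be sketched as follows (a minimal illustration, not library code):

```python
import random

def uniform_ab(a, b, rng=random):
    """Draw from U(a, b) by rescaling a standard uniform draw:
    x = a + (b - a) * u, with u ~ U(0, 1)."""
    return a + (b - a) * rng.random()

rng = random.Random(42)
xs = [uniform_ab(-3.0, 5.0, rng) for _ in range(10000)]
```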

Sampling from an arbitrary distribution

The uniform distribution is useful for sampling from arbitrary distributions. A general method is the inverse transform sampling method, which uses the cumulative distribution function (CDF) of the target random variable. This method is very useful in theoretical work. Since simulations using this method require inverting the CDF of the target variable, alternative methods have been devised for cases where the CDF is not known in closed form. One such method is rejection sampling.

The normal distribution is an important example where the inverse transform method is not efficient. However, there is an exact method, the Box–Muller transformation, which uses the inverse transform to convert two independent uniform random variables into two independent normally distributed random variables.
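A minimal sketch of the Box–Muller transformation (illustrative; `1 - random()` maps the draw into (0, 1] so the logarithm is always defined):

```python
import math
import random

def box_muller(rng=random):
    """Convert two independent U(0,1) draws into two independent
    standard normal draws via the Box-Muller transformation."""
    u1 = 1.0 - rng.random()             # in (0, 1], avoids log(0)
    u2 = rng.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

rng = random.Random(7)
zs = [z for _ in range(5000) for z in box_muller(rng)]
```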

Quantization error

Main article: Quantization error

In analog-to-digital conversion a quantization error occurs. This error is either due to rounding or truncation. When the original signal is much larger than one least significant bit (LSB), the quantization error is not significantly correlated with the signal, and has an approximately uniform distribution. The RMS error therefore follows from the variance of this distribution.
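Under this uniform-error model, a rounding error is U(−LSB/2, LSB/2), so the RMS follows directly from the variance formula (b − a)²/12; a small sketch:

```python
import math

def quantization_rms(lsb):
    """RMS of rounding quantization error modeled as U(-LSB/2, LSB/2).

    The variance of U(a, b) is (b - a)^2 / 12, so the RMS error is
    LSB / sqrt(12)."""
    return lsb / math.sqrt(12.0)

rms = quantization_rms(1.0)   # about 0.2887 for a unit LSB
```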


Estimation of maximum

Main article: German tank problem

Given a uniform distribution on [0, N] with unknown N, the UMVU estimator for the maximum is given by

\hat{N}=\frac{k+1}{k} m = m + \frac{m}{k}

where m is the sample maximum and k is the sample size, sampling without replacement (though this distinction almost surely makes no difference for a continuous distribution). This follows for the same reasons as estimation for the discrete distribution, and can be seen as a very simple case of maximum spacing estimation. This problem is commonly known as the German tank problem, due to application of maximum estimation to estimates of German tank production during World War II.
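The estimator translates directly into code (the sample values below are made up for illustration):

```python
def umvu_max_estimate(sample):
    """UMVU estimate of N for U(0, N): scale the sample maximum up by
    the average gap between order statistics, m * (k + 1) / k."""
    m, k = max(sample), len(sample)
    return m * (k + 1) / k

# e.g. four observed values with maximum 60
n_hat = umvu_max_estimate([19, 40, 42, 60])   # 60 * 5/4 = 75.0
```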

Estimation of midpoint

The midpoint of the distribution (a + b) / 2 is both the mean and the median of the uniform distribution. Although both the sample mean and the sample median are unbiased estimators of the midpoint, neither is as efficient as the sample mid-range, i.e. the arithmetic mean of the sample maximum and the sample minimum, which is the UMVU estimator of the midpoint (and also the maximum likelihood estimate).
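The mid-range estimator in code (sample values are illustrative):

```python
def midrange(sample):
    """Sample mid-range: the UMVU (and maximum likelihood) estimator
    of the midpoint (a + b)/2 of a uniform distribution."""
    return (min(sample) + max(sample)) / 2.0

est = midrange([2.1, 9.8, 4.4, 7.0])   # (2.1 + 9.8) / 2 = 5.95
```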

Confidence interval for the maximum

Let X1, X2, X3, ..., Xn be a sample from U( 0, L ) where L is the population maximum. Then X(n) = max( X1, X2, X3, ..., Xn ) has the density[3]

f_{X_{(n)}}(x) = \frac{n}{L} \left( \frac{x}{L} \right)^{n-1} = \frac{n x^{n-1}}{L^n}, \qquad 0 < x < L

The confidence interval for the estimated population maximum is then ( X(n), X(n) / α^(1/n) ), where 100(1 − α)% is the confidence level sought. In symbols

X_{(n)} \le L \le X_{(n)} / \alpha^{ 1 / n }
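The interval can be computed as follows (a sketch; the sample values are illustrative):

```python
def max_confidence_interval(sample, alpha=0.05):
    """One-sided 100(1 - alpha)% confidence interval for the population
    maximum L of U(0, L): (x_max, x_max / alpha**(1/n))."""
    n = len(sample)
    x_max = max(sample)
    return x_max, x_max / alpha ** (1.0 / n)

# Five draws; the upper bound inflates the observed maximum 4.1
lo, hi = max_confidence_interval([0.3, 2.7, 1.9, 4.1, 3.3], alpha=0.05)
```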

