A probability distribution assigns a probability to every interval of the real numbers, in such a way that the probability axioms are satisfied. In technical terms, a probability distribution is a probability measure whose domain is the Borel algebra on the reals.
Every random variable gives rise to a probability distribution, and this distribution contains most of the important information about the variable. If X is a random variable, the corresponding probability distribution assigns to the interval [a, b] the probability Pr[a ≤ X ≤ b], i.e. the probability that the variable X will take a value in the interval [a, b].
The probability distribution of the variable X can be uniquely described by its cumulative distribution function F(x), which is defined by
- <math>
F(x) = {\rm Pr} \left[ X \le x \right]
</math>
for any x in R.
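In particular, the probability that X falls in a half-open interval can be recovered from F alone: for a < b,
- <math>
{\rm Pr} \left[ a < X \le b \right] = F(b) - F(a) .
</math>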
A distribution is called discrete if its cumulative distribution function consists of a sequence of finite jumps, which means that it is the distribution of a discrete random variable X: a variable which can attain values only from a certain finite or countable set.
A distribution is called continuous if its cumulative distribution function is continuous, which means that it is the distribution of a random variable X for which Pr[ X = x ] = 0 for all x in R.
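For instance, a single toss of a fair coin (X = 1 for heads, X = 0 for tails, each with probability 1/2) has a discrete distribution with the step-shaped cumulative distribution function
- <math>
F(x) = \begin{cases} 0 & x < 0 \\ 1/2 & 0 \le x < 1 \\ 1 & x \ge 1 , \end{cases}
</math>
while the uniform distribution on [0,1], with F(x) = x for 0 ≤ x ≤ 1, is continuous.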
The so-called absolutely continuous distributions can be expressed by a probability density function: a non-negative Lebesgue integrable function f defined on the reals such that
- <math>
{\rm Pr} \left[ a \le X \le b \right] = \int_a^b f(x)\,dx
</math>
for all a and b. That discrete distributions do not admit such a density is unsurprising, but there are also continuous distributions, such as the devil's staircase (the Cantor distribution), that do not admit a density.
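As a concrete example, the exponential distribution with rate λ > 0 has a density that is zero for negative x, and for 0 ≤ a ≤ b one finds
- <math>
{\rm Pr} \left[ a \le X \le b \right] = \int_a^b \lambda e^{-\lambda x}\,dx = e^{-\lambda a} - e^{-\lambda b} .
</math>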
The support of a distribution is the smallest closed set whose complement has probability zero.
Several probability distributions are so important that they have been given specific names:
- Discrete distributions
  - With finite support
    - The degenerate distribution at x0, where X is certain to take the value x0. This does not look random, but it satisfies the definition of random variable. This is useful because it puts deterministic variables and random variables in the same formalism.
    - The discrete uniform distribution, where all elements of a finite set are equally likely. This is supposed to be the distribution of a balanced coin, an unbiased die, a casino roulette wheel or a well-shuffled deck. Also, one can use measurements of quantum states to generate uniform random variables. All these are "physical" or "mechanical" devices, subject to design flaws or perturbations. In digital computers, pseudo-random number generators are used to produce a statistically random discrete uniform distribution.
    - The Bernoulli distribution, which takes value 1 with probability p and value 0 with probability q = 1 - p.
    - The binomial distribution, which describes the number of successes in a series of independent Yes/No experiments.
    - The hypergeometric distribution, which describes the number of successes in the first m of a series of n independent Yes/No experiments, if the total number of successes is known.
  - With infinite support
    - The geometric distribution, a discrete distribution which describes the number of attempts needed to get the first success in a series of independent Yes/No experiments.
    - The negative binomial distribution, a generalization of the geometric distribution to the nth success.
    - The Poisson distribution, which describes the number of rare events that happen in a certain time interval.
    - The Maxwell-Boltzmann distribution, a discrete distribution important in physics, which describes the probabilities of the various energy levels of a system.
    - The zeta distribution, which has uses in applied statistics and may be of interest to number theorists.
- Continuous distributions
  - Supported on a finite interval
    - The uniform distribution on [a,b], where all points in a finite interval are equally likely.
    - The Beta distribution on [0,1], of which the uniform distribution is a special case, and which is useful in estimating success probabilities.
  - Supported on semi-infinite intervals, usually [0,∞)
    - The exponential distribution, which describes the time between rare random events.
    - The Gamma distribution, which describes the time until n rare random events occur.
    - The Log-normal distribution, describing variables which can be modelled as the product of many small independent positive variables.
    - The Weibull distribution, of which the exponential distribution is a special case, is used to model the lifetime of technical devices.
    - The chi-square distribution, which is the distribution of the sum of the squares of n independent standard Gaussian random variables. It is a special case of the Gamma distribution, and it is used in goodness-of-fit tests in statistics.
  - Supported on the whole real line
    - The normal distribution, also called the Gaussian or the bell curve. It is ubiquitous in nature and statistics due to the central limit theorem: every variable that can be modelled as a sum of many small independent variables is approximately normal (see the numerical sketch after this list).
    - Student's t-distribution, useful for estimating unknown means of Gaussian populations.
    - The Cauchy distribution, an example of a distribution which has neither an expected value nor a variance. In physics it is usually called a Lorentzian, and it is the distribution of the energy of an unstable state in quantum mechanics. In particle physics, the extremely short-lived particles associated with unstable states are called resonances.
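The following is a minimal numerical sketch (in Python, standard library only; the choice of uniform summands and the sample sizes are arbitrary illustrations) of the central limit theorem mentioned above for the normal distribution: the sum of many small independent variables is approximately normal.

```python
# A rough check of the central limit theorem: the sum of many independent
# uniform(0, 1) variables should look approximately normal.
# The number of summands (30) and the number of samples (10 000) are
# arbitrary choices for illustration.
import random
import statistics

def summed_sample(n_terms):
    """Sum of n_terms independent uniform(0, 1) random variables."""
    return sum(random.random() for _ in range(n_terms))

n_terms = 30
samples = [summed_sample(n_terms) for _ in range(10_000)]

# The sum of n uniforms has mean n/2 and variance n/12.
mu = n_terms / 2
sigma = (n_terms / 12) ** 0.5

print("sample mean :", statistics.fmean(samples), " expected:", mu)
print("sample stdev:", statistics.stdev(samples), " expected:", sigma)

# For a normal distribution, about 68% of the probability mass lies within
# one standard deviation of the mean; the empirical fraction should be close.
within_one_sigma = sum(abs(x - mu) <= sigma for x in samples) / len(samples)
print("fraction within one stdev:", within_one_sigma, " (normal: about 0.68)")
```

Replacing the uniform summands with, say, Bernoulli variables gives the binomial distribution listed above, which is likewise approximately normal when the number of trials is large.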
See also
probability applications -- random variable -- cumulative distribution function -- probability density function -- likelihood