In probability theory, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant. It is named after the Russian mathematician Andrey Markov.
Markov's inequality (and other similar inequalities) relates probabilities to expectations, and provides frequently loose but still useful bounds for the distribution function of a random variable.
Definition
Markov's inequality states that if X is a random variable and a is some positive constant, then
- <math>\textrm{Pr}(|X| \geq a) \leq \frac{\textrm{E}(|X|)}{a}.</math>
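For example, if X is a non-negative random variable with finite, positive mean, then taking a = 2E(X) shows that X can exceed twice its mean with probability at most one half:
- <math>\textrm{Pr}(X \geq 2\,\textrm{E}(X)) \leq \frac{\textrm{E}(X)}{2\,\textrm{E}(X)} = \frac{1}{2}.</math>
In particular, no more than half of a population can have an income of at least twice the mean income.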
A Generalisation
Markov's inequality is actually just one of a wider class of inequalities relating probabilities and expectations, all of which are instances of a single theorem.
Let X be a random variable and a be some positive constant (a > 0). If
- <math>h:\mathbb{R} \rightarrow [0,\infty),</math>
then
- <math>\textrm{Pr}(h(X) \geq a) \leq \frac{\textrm{E}(h(X))}{a}.</math>
Let A be the set {x : h(x) ≥ a}, and let I_A(x) be the indicator function of A (that is, I_A(x) = 1 if x ∈ A, and I_A(x) = 0 otherwise). Then, for every real number x,
- <math>aI_A(x) \leq h(x).</math>
The theorem follows by taking the expectation of both sides of this inequality, and observing that
- <math>\textrm{E}(I_A(X)) = \textrm{Pr}(h(X) \geq a).</math>
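Explicitly, taking expectations gives
- <math>a\,\textrm{Pr}(h(X) \geq a) = \textrm{E}(aI_A(X)) \leq \textrm{E}(h(X)),</math>
and dividing both sides by the positive constant a yields the stated bound.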
- Markov's inequality is recovered by setting h(x) = |x|.
- If h(x) = x², we obtain a version of Chebyshev's inequality, as shown below.
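For instance, applying the theorem with h(x) = x² to the centred random variable X − E(X), and taking a = k² for some k > 0, gives
- <math>\textrm{Pr}(|X - \textrm{E}(X)| \geq k) = \textrm{Pr}\left((X - \textrm{E}(X))^2 \geq k^2\right) \leq \frac{\textrm{E}\left((X - \textrm{E}(X))^2\right)}{k^2} = \frac{\textrm{Var}(X)}{k^2},</math>
which is Chebyshev's inequality.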