Markov's inequality

In probability theory, Markov's inequality gives an upper bound for the probability that a non-negative function of a random variable is greater than or equal to some positive constant. It is named after the Russian mathematician Andrey Markov.

Markov's inequality (and other similar inequalities) relates probabilities to expectations, providing bounds on the distribution function of a random variable that are often loose but still useful.

Definition

Markov's inequality states that if X is a random variable and a is some positive constant, then

<math>\textrm{Pr}(|X| \geq a) \leq \frac{\textrm{E}(|X|)}{a}.</math>
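As a purely illustrative example (the value of the expectation is chosen only for concreteness), if X is a random variable with E(|X|) = 1, then taking a = 5 gives

<math>\textrm{Pr}(|X| \geq 5) \leq \frac{\textrm{E}(|X|)}{5} = \frac{1}{5},</math>

no matter what the distribution of X is.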

A Generalisation

Markov's inequality is one member of a wider class of inequalities relating probabilities and expectations, all of which are instances of a single theorem.

Theorem

Let X be a random variable and let a > 0 be a constant. If h is a non-negative (measurable) function,

<math>h:\mathbb{R} \rightarrow [0,\infty),</math>
then
<math>\textrm{Pr}(h(X) \geq a) \leq \frac{\textrm{E}(h(X))}{a}.</math>

Proof

Let A be the set {x : h(x) ≥ a}, and let I_A(x) be the indicator function of A. (That is, I_A(x) = 1 if x ∈ A, and I_A(x) = 0 otherwise.) Then,

<math>aI_A(x) \leq h(x).</math>
The theorem follows by taking the expectation of both sides of this inequality, using the fact that expectation preserves inequalities, and observing that
<math>\textrm{E}(I_A(X)) = \textrm{Pr}(h(X) \geq a).</math>
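Written out, the final step combines these two facts:

<math>a\,\textrm{Pr}(h(X) \geq a) = \textrm{E}(aI_A(X)) \leq \textrm{E}(h(X)),</math>

and dividing both sides by the positive constant a gives the stated bound.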

Examples

  • Markov's inequality is recovered by setting h(x) = |x|.
  • If h(x) = x², we obtain a version of Chebyshev's inequality; applying it to X - E(X) gives the usual form, as spelled out below.
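To make the second example explicit, applying the theorem to the random variable X - E(X) with h(x) = x² and threshold k² (for any k > 0) gives

<math>\textrm{Pr}(|X - \textrm{E}(X)| \geq k) = \textrm{Pr}\left((X - \textrm{E}(X))^2 \geq k^2\right) \leq \frac{\textrm{E}\left((X - \textrm{E}(X))^2\right)}{k^2} = \frac{\textrm{Var}(X)}{k^2},</math>

which is Chebyshev's inequality.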


