Covariance

In probability theory and statistics, the covariance between two random variables <math>X</math> and <math>Y</math>, with respective expected values <math>\mu</math> and <math>\nu</math>, is defined as:

<math>\operatorname{cov}(X, Y) = E((X - \mu) (Y - \nu))</math>
This is equivalent to:
<math>\operatorname{cov}(X, Y) = E(X Y) - \mu \nu</math>
a formula that is commonly used in calculations.
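
As a quick numerical illustration of the equivalence above, here is a minimal Python sketch (the sample data, variable names, and use of NumPy are assumptions for illustration, not part of the article). It estimates both expressions from the same sample; since the sample mean plays the role of the expectation in both, the two estimates agree up to floating-point rounding.

 import numpy as np
 
 # Hypothetical sample standing in for draws of X and Y.
 rng = np.random.default_rng(0)
 x = rng.normal(size=10_000)
 y = 2.0 * x + rng.normal(size=10_000)   # Y depends linearly on X
 
 mu = x.mean()   # sample estimate of E(X)
 nu = y.mean()   # sample estimate of E(Y)
 
 # Definition: cov(X, Y) = E((X - mu)(Y - nu))
 cov_definition = np.mean((x - mu) * (y - nu))
 
 # Shortcut:   cov(X, Y) = E(XY) - mu * nu
 cov_shortcut = np.mean(x * y) - mu * nu
 
 print(cov_definition, cov_shortcut)   # both estimates are close to 2, the true covariance here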

If <math>X</math> and <math>Y</math> are independent, then their covariance is zero. The converse, however, is not true, as the example below illustrates. The covariance is sometimes called a measure of "linear dependence" between the two random variables. That phrase does not mean the same thing that it usually means in mathematics, as explicated in the Wikipedia article titled linear dependence, although the meaning is not unrelated. The correlation is a closely related concept used to measure the degree of linear dependence between two variables.
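
A standard counterexample to the converse (assuming, purely for illustration, that <math>X</math> is uniform on <math>\{-1, 0, 1\}</math> and <math>Y = X^2</math>): here <math>Y</math> is completely determined by <math>X</math>, yet <math>E(X) = 0</math> and <math>E(XY) = E(X^3) = 0</math>, so <math>\operatorname{cov}(X, Y) = E(XY) - \mu \nu = 0</math>. The short Python sketch below checks this by averaging over the three equally likely values of <math>X</math>.

 import numpy as np
 
 # Hypothetical counterexample: X uniform on {-1, 0, 1}, Y = X**2.
 x = np.array([-1.0, 0.0, 1.0])   # the three equally likely values of X
 y = x ** 2                       # Y is a deterministic function of X
 
 mu, nu = x.mean(), y.mean()      # E(X) = 0, E(Y) = 2/3
 print(np.mean(x * y) - mu * nu)  # cov(X, Y) = E(XY) - mu*nu = 0.0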


