Covariance

In probability theory and statistics, the covariance between two random variables <math>X</math> and <math>Y</math>, with respective expected values <math>\mu</math> and <math>\nu</math>, is:

<math>\operatorname{cov}(X, Y) = E((X - \mu) (Y - \nu))</math>
This is equivalent to:
<math>\operatorname{cov}(X, Y) = E(X Y) - \mu \nu</math>
a formula that is often more convenient for calculation.
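
The equivalence follows from the linearity of expectation: expanding the product inside the expectation gives

<math>E((X - \mu)(Y - \nu)) = E(XY) - \nu E(X) - \mu E(Y) + \mu \nu = E(XY) - \mu \nu,</math>

since <math>E(X) = \mu</math> and <math>E(Y) = \nu</math>.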

If <math>X</math> and <math>Y</math> are independent, then their covariance is zero. The converse, however, is not true. The covariance is sometimes called a measure of "linear dependence" between the two random variables. That phrase does not mean the same thing that it usually means in mathematics, as explicated in the article on linear dependence, although the two meanings are not unrelated. The correlation is a closely related concept used to measure the degree of linear dependence between two variables.
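
To illustrate why the converse fails, the following sketch (a hypothetical example using NumPy; the variable names are chosen only for illustration) draws a random variable X that is symmetric about zero and sets Y = X^2, which is completely determined by X yet has covariance approximately zero:

import numpy as np

rng = np.random.default_rng(0)

# X is symmetric about zero; Y = X^2 is completely determined by X,
# so X and Y are clearly dependent.
x = rng.standard_normal(1_000_000)
y = x ** 2

# Sample estimate of cov(X, Y) = E(XY) - E(X)E(Y).
# It is close to zero because E(X^3) = 0 for a symmetric distribution.
cov_xy = np.mean(x * y) - np.mean(x) * np.mean(y)
print(cov_xy)  # prints a value near 0

Here the near-zero covariance reflects only the absence of a linear trend between X and Y, not independence.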


