Covariance

In probability theory and statistics, the covariance between two random variables <math>X</math> and <math>Y</math>, with respective expected values <math>\mu</math> and <math>\nu</math>, is:

<math>\operatorname{cov}(X, Y) = E((X - \mu) (Y - \nu))</math>
This is equivalent to:
<math>\operatorname{cov}(X, Y) = E(X Y) - \mu \nu</math>
a formula which is commonly used for calculation.
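The following is a minimal numerical sketch (not part of the original article) showing that the two formulas agree when estimated from a sample; the data and variable names are hypothetical.

 import numpy as np
 
 rng = np.random.default_rng(0)
 x = rng.normal(size=100_000)
 y = 0.5 * x + rng.normal(size=100_000)   # y is linearly related to x
 
 # Sample estimate of E((X - mu)(Y - nu))
 cov_centered = np.mean((x - x.mean()) * (y - y.mean()))
 
 # Sample estimate of E(XY) - mu * nu, the computational form
 cov_shortcut = np.mean(x * y) - x.mean() * y.mean()
 
 print(cov_centered, cov_shortcut)   # the two estimates agree (roughly 0.5 here)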

If <math>X</math> and <math>Y</math> are independent, then their covariance is zero. The converse, however, is not true: for example, if <math>X</math> is symmetric about zero and <math>Y = X^2</math>, then <math>\operatorname{cov}(X, Y) = 0</math> even though <math>Y</math> is completely determined by <math>X</math>. Covariance is sometimes called a measure of "linear dependence" between the two random variables. That phrase does not mean the same thing that it usually means in mathematics, as explicated in the Wikipedia article titled linear dependence, although the meaning is not unrelated. The correlation is a closely related concept used to measure the degree of linear dependence between two variables.
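Below is a hedged sketch (not from the article) of the counterexample mentioned above: <math>Y = X^2</math> is fully dependent on a symmetric <math>X</math>, yet their sample covariance is near zero.

 import numpy as np
 
 rng = np.random.default_rng(1)
 x = rng.normal(size=1_000_000)   # symmetric about zero
 y = x ** 2                       # completely determined by x, hence dependent
 
 # Off-diagonal entry of the 2x2 sample covariance matrix
 print(np.cov(x, y)[0, 1])        # close to 0: zero covariance without independence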


