Covariance

In probability theory and statistics, the covariance between two random variables <math>X</math> and <math>Y</math>, with respective expected values <math>\mu</math> and <math>\nu</math>, is:

<math>\operatorname{cov}(X, Y) = E((X - \mu) (Y - \nu))</math>
This is equivalent to:
<math>\operatorname{cov}(X, Y) = E(X Y) - \mu \nu</math>
a formula that is commonly used for calculation. The equivalence follows from the linearity of expectation: expanding the product gives <math>E(X Y) - \nu E(X) - \mu E(Y) + \mu \nu</math>, which reduces to <math>E(X Y) - \mu \nu</math>.
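A short numerical sketch can make the two formulas concrete. The example below is plain Python with a small, invented discrete joint distribution (chosen only for illustration); it computes the covariance both from the definition <math>E((X - \mu)(Y - \nu))</math> and from the shortcut <math>E(X Y) - \mu \nu</math>, and the two agree up to floating-point rounding.

 # A small, made-up joint distribution: (x, y) -> P(X = x, Y = y).
 joint = {
     (0, 0): 0.1,
     (0, 1): 0.2,
     (1, 0): 0.3,
     (1, 2): 0.4,
 }

 def expectation(f):
     """E[f(X, Y)] under the joint distribution above."""
     return sum(p * f(x, y) for (x, y), p in joint.items())

 mu = expectation(lambda x, y: x)   # E(X)
 nu = expectation(lambda x, y: y)   # E(Y)

 # Covariance from the definition: E((X - mu)(Y - nu))
 cov_definition = expectation(lambda x, y: (x - mu) * (y - nu))

 # Covariance from the shortcut: E(XY) - mu * nu
 cov_shortcut = expectation(lambda x, y: x * y) - mu * nu

 print(cov_definition, cov_shortcut)  # same value (here 0.1), up to rounding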

If <math>X</math> and <math>Y</math> are independent, then their covariance is zero. The converse, however, is not true: two random variables can have zero covariance and still be dependent. The covariance is sometimes called a measure of "linear dependence" between the two random variables. That phrase does not mean the same thing that it usually means in mathematics, as explained in the article on linear dependence, although the meanings are not unrelated. Correlation is a closely related concept used to measure the degree of linear dependence between two variables.
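The failure of the converse is easy to see with a concrete pair. In the sketch below (plain Python; the distribution is chosen only for illustration), <math>X</math> is uniform on <math>\{-1, 0, 1\}</math> and <math>Y = X^2</math>, so <math>Y</math> is completely determined by <math>X</math>, yet the covariance works out to zero.

 # X is uniform on {-1, 0, 1}; Y = X**2 is a deterministic function of X,
 # so X and Y are clearly dependent -- yet their covariance is zero.
 support = [-1, 0, 1]
 p = 1 / 3  # P(X = x) for each x in the support

 mu = sum(p * x for x in support)           # E(X)  = 0
 nu = sum(p * x**2 for x in support)        # E(Y)  = 2/3
 e_xy = sum(p * x * x**2 for x in support)  # E(XY) = E(X**3) = 0

 cov = e_xy - mu * nu
 print(cov)  # 0.0: zero covariance despite full dependence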


