Covariance

In probability theory and statistics, the covariance between two random variables <math>X</math> and <math>Y</math>, with respective expected values <math>\mu</math> and <math>\nu</math>, is:

<math>\operatorname{cov}(X, Y) = E((X - \mu) (Y - \nu))</math>
By expanding the product and using the linearity of expectation, this is equivalent to:
<math>\operatorname{cov}(X, Y) = E(X Y) - \mu \nu</math>
a formula which is commonly used for calculation.
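As an illustrative sketch, the two formulas above can be checked numerically on a small finite data set treated as a full population (the data values here are made up for the example):

```python
# Hypothetical data: a finite sample treated as the whole population.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 1.0, 4.0, 3.0]
n = len(xs)

mu = sum(xs) / n  # expected value of X
nu = sum(ys) / n  # expected value of Y

# Definition: E((X - mu)(Y - nu))
cov_def = sum((x - mu) * (y - nu) for x, y in zip(xs, ys)) / n

# Calculation formula: E(XY) - mu * nu
cov_calc = sum(x * y for x, y in zip(xs, ys)) / n - mu * nu

print(cov_def, cov_calc)  # the two values agree
```

Both expressions yield the same number, as the algebraic identity guarantees.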

If <math>X</math> and <math>Y</math> are independent, then their covariance is zero. The converse, however, is not true. The covariance is sometimes called a measure of "linear dependence" between the two random variables. That phrase does not mean the same thing that it usually means in mathematics, as explicated in the Wikipedia article titled linear dependence, although the meaning is not unrelated. The correlation is a closely related concept used to measure the degree of linear dependence between two variables.
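A standard counterexample for the converse can be sketched as follows: if <math>X</math> is uniform on {-1, 0, 1} and <math>Y = X^2</math>, then <math>Y</math> is completely determined by <math>X</math>, yet their covariance is zero.

```python
# Sketch of the classic counterexample: zero covariance without independence.
# X is uniform on {-1, 0, 1}; Y = X**2 is fully determined by X.
support = [-1, 0, 1]            # each value has probability 1/3
xs = support
ys = [x * x for x in support]

mu = sum(xs) / 3                # E(X) = 0
nu = sum(ys) / 3                # E(Y) = 2/3
cov = sum(x * y for x, y in zip(xs, ys)) / 3 - mu * nu

print(cov)  # zero, even though Y depends entirely on X
```

The covariance vanishes because <math>E(XY) = E(X^3) = 0</math> and <math>E(X) = 0</math>, even though the variables are clearly dependent.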



All Wikipedia text is available under the terms of the GNU Free Documentation License