Expected value

In everyday usage, an expectation is what is considered most likely to happen; a less advantageous result gives rise to the emotion of disappointment.

In probability theory (and especially in gambling), the expected value (or expectation) of a random variable is the sum, over all possible outcomes of the experiment, of each outcome's probability multiplied by its payoff ("value"). It thus represents the average amount one "expects" to win per bet if bets with identical odds are repeated many times. Note that the value itself may not be "expected" in the everyday sense; it may be unlikely or even impossible.

For example, an American roulette wheel has 38 equally likely outcomes. A bet placed on a single number pays 35-to-1, meaning that the bettor receives 35 times the stake in winnings and also has the stake returned, so 36 times the stake comes back in total. The expected value of the profit from a $1 bet on a single number, taking all 38 possible outcomes into account, is ( −1 × 37/38 ) + ( 35 × 1/38 ) = −2/38, which is about −0.0526. Therefore one expects, on average, to lose just over 5 cents for every dollar bet.
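The same figure can be checked by evaluating the sum directly; a minimal sketch in Python, using just the outcome/probability pairs listed above:

outcomes = [(-1, 37 / 38), (35, 1 / 38)]  # (profit, probability) for a $1 single-number bet
expected_profit = sum(value * probability for value, probability in outcomes)
print(expected_profit)  # about -0.0526, i.e. a loss of roughly 5.3 cents per dollar bet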

In general, if X is a random variable defined on a probability space (Ω, P), then the expected value EX of X is defined as

<math>\operatorname{E}X = \int_\Omega X dP</math>

where the Lebesgue integral is employed. Note that not all random variables have an expected value, since the integral may not exist. Two random variables with the same probability distribution have the same expected value, whenever it is defined.

If X is a discrete random variable with values x1, x2, ... and corresponding probabilities p1, p2, ... which add up to 1, then EX can be computed as the sum or series

<math>\operatorname{E}X = \sum_i p_i x_i</math>

as in the gambling example mentioned above.
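For instance, for a roll of a fair six-sided die, each of the values 1 through 6 has probability 1/6, so

<math>\operatorname{E}X = \sum_{i=1}^{6} \frac{1}{6}\, i = \frac{1+2+3+4+5+6}{6} = 3.5</math>

which, as noted above, is not itself a possible outcome of the roll.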

If the probability distribution of X admits a probability density function f(x), then the expected value can be computed as

<math>\operatorname{E}X = \int_{-\infty}^\infty x f(x) dx</math>
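For instance, if X is uniformly distributed on the interval [0, 1], so that f(x) = 1 for 0 ≤ x ≤ 1 and f(x) = 0 otherwise, then

<math>\operatorname{E}X = \int_0^1 x \, dx = \frac{1}{2}</math>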

The expected value operator (or expectation operator) E is linear in the sense that

<math>\operatorname{E}(aX + bY) = a\,\operatorname{E}X + b\,\operatorname{E}Y</math>
for any two random variables X and Y (which need to be defined on the same probability space) and any two real numbers a and b.
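For example, if X and Y are the values shown by two fair dice (whether or not they are rolled independently), linearity alone gives

<math>\operatorname{E}(X + Y) = \operatorname{E}X + \operatorname{E}Y = 3.5 + 3.5 = 7</math>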

The expected values of the powers of X are called the moments of X; the moments about the mean of X (its central moments) are the expected values of the powers of X − EX.
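In particular, the second moment about the mean is the variance of X:

<math>\operatorname{Var}(X) = \operatorname{E}\left( (X - \operatorname{E}X)^2 \right)</math>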

In general, the expected value operator is not multiplicative, i.e. E(XY) is not necessarily equal to EX EY, except when X and Y are independent. The difference E(XY) − EX EY in the general case gives rise to the covariance and correlation.
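Indeed, the covariance is exactly this difference:

<math>\operatorname{Cov}(X, Y) = \operatorname{E}(XY) - \operatorname{E}X\, \operatorname{E}Y</math>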

To empirically determine the expected value of a random variable, one repeatedly measures observations of the variable and computes the arithmetic mean of the results; by the law of large numbers, this mean converges to the expected value as the number of observations grows.
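As a minimal sketch of this procedure in Python, one can simulate many $1 single-number roulette bets from the example above and compare the average profit with the theoretical value of about −0.0526:

import random

def single_number_bet():
    # One spin of an American roulette wheel: 38 pockets, exactly one of which pays 35-to-1.
    return 35 if random.randrange(38) == 0 else -1

n = 1_000_000
total = sum(single_number_bet() for _ in range(n))
print(total / n)  # close to -0.0526 for large n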

Similarly, in computer science the same discrete formula appears, for instance, in the analysis of randomized algorithms: if X denotes the cost of a run (such as its running time) of an algorithm whose execution path is chosen at random, and i ranges over the possible values of that cost, then the expected cost is

<math>\operatorname{E}[X] = \sum_i i\, P(X = i)</math>
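As an illustrative sketch in Python, with entirely made-up path costs and probabilities:

# Hypothetical execution paths of a randomized algorithm: cost -> probability.
paths = {1: 0.5, 10: 0.3, 100: 0.2}
expected_cost = sum(cost * probability for cost, probability in paths.items())
print(expected_cost)  # 0.5*1 + 0.3*10 + 0.2*100 = 23.5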


