The expectation or expected value of a random variable $ X$ is denoted by $ E[X]$. It can be thought of as a weighted average of the values of $ X$, in which the weights are obtained from the probability distribution. If $ S$ is discrete, then

$\displaystyle E[X] = \sum_{s \in S} X(s) P(s) .$ (9.10)

If $ S$ is continuous, then

$\displaystyle E[X] = \int_{S} X(s) p(s) ds.$ (9.11)
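The discrete case (9.10) can be sketched directly as a probability-weighted sum. The three-outcome sample space, distribution $ P$, and random variable $ X$ below are hypothetical values chosen only for illustration.

```python
# Sketch of (9.10): E[X] as a sum of X(s) weighted by P(s).
# S, P, and X are illustrative assumptions, not from the text.
S = ['a', 'b', 'c']
P = {'a': 0.5, 'b': 0.3, 'c': 0.2}   # probability distribution over S
X = {'a': 1.0, 'b': 2.0, 'c': 4.0}   # random variable as a map S -> R

E = sum(X[s] * P[s] for s in S)      # 0.5*1 + 0.3*2 + 0.2*4 = 1.9
print(E)
```

The continuous case (9.11) replaces the sum with an integral against the density $ p(s)$; numerically it would be approximated by quadrature rather than a finite sum.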

One can then define conditional expectation, which applies a given condition to the probability distribution. For example, if $ S$ is discrete and an event $ F$ is given, then

$\displaystyle E[X\vert F] = \sum_{s \in S} X(s) P(s\vert F) .$ (9.12)

Example 9.5 (Tossing Dice)   Returning to Example 9.4, the elements of $ S$ are already real numbers. Hence, a random variable $ X$ can be defined by simply letting $ X(s) = s$. Using (9.10), the expected value, $ E[X]$, is $ 3.5$. Note that the expected value is not necessarily a value that is ``expected'' in practice; it is impossible to actually obtain $ 3.5$ because it is not contained in $ S$. Suppose that the expected value of $ X$ is desired only over trials that result in numbers greater than $ 3$. This can be described by the event $ F = \{4,5,6\}$. Using conditional expectation, (9.12), the expected value is $ E[X\vert F] = 5$.
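Both computations from the example can be checked with a short sketch of (9.10) and (9.12), using exact rational arithmetic so the results are not obscured by rounding:

```python
from fractions import Fraction

S = [1, 2, 3, 4, 5, 6]
P = {s: Fraction(1, 6) for s in S}   # fair die: uniform distribution
X = lambda s: s                      # identity random variable

# Expectation via (9.10): sum of X(s) P(s) over S.
E = sum(X(s) * P[s] for s in S)      # 7/2 = 3.5

# Conditional expectation via (9.12) with F = {4, 5, 6}.
# P(s|F) = P(s) / P(F) for s in F, and 0 otherwise.
F = {4, 5, 6}
PF = sum(P[s] for s in F)            # P(F) = 1/2
E_cond = sum(X(s) * P[s] / PF for s in F)   # 5

print(E, E_cond)
```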

Now consider tossing two dice in succession. Each element $ s \in S$ is expressed as $ s = (i,j)$ in which $ i,j \in \{1,2,3,4,5,6\}$. Since $ S \not \subset {\mathbb{R}}$, the random variable needs to be slightly more interesting. One common approach is to count the sum of the dice, which yields $ X(s) = i+j$ for any $ s \in S$. In this case, $ E[X] = 7$. $ \blacksquare$
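The two-dice case illustrates why $ X$ must be defined explicitly when $ S$ is not a set of real numbers. A sketch, enumerating all $ 36$ equally likely outcomes:

```python
from fractions import Fraction
from itertools import product

# Sample space of ordered pairs (i, j): 36 equally likely outcomes.
S = list(product(range(1, 7), repeat=2))
P = Fraction(1, 36)                  # uniform probability of each pair

X = lambda s: s[0] + s[1]            # random variable: sum of the dice

# Expectation via (9.10).
E = sum(X(s) * P for s in S)         # 7
print(E)
```

By linearity, this also follows from the single-die result: $ E[i+j] = E[i] + E[j] = 3.5 + 3.5 = 7$.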

Steven M LaValle 2012-04-20