Mathematics Prelims

December 19, 2008

Expected Value of a Random Variable

Filed under: Measure Theory,Probability Theory — cjohnson @ 8:27 pm

If X : \Omega \to \mathbb{R} is a random variable on the probability space (\Omega, \mathcal{F}, P), then the expected value (also called the expectation) of X is, in essence, a weighted average of the values X may take on.  In the case of simple random variables this weighted average is precisely the expectation.  For a general random variable we actually take a limit of the expectations of simple random variables that converge to the random variable of interest.

Suppose X is a non-negative simple random variable of the form

\displaystyle X(\omega) = \sum_{i=1}^n \alpha_i 1_{A_i}(\omega)

where the A_i \in \mathcal{F} are pairwise disjoint and, for the sake of simplicity, \alpha_1 < \alpha_2 < \cdots < \alpha_n.  The expected value of X, denoted \mathbb{E}[X], is

\displaystyle \mathbb{E}[X] = \sum_{i=1}^n \alpha_i P(A_i).
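This formula is easy to compute directly.  Here is a minimal sketch on a made-up finite probability space (a fair die; the space, the events, and the function name `expectation_simple` are my own illustration, not from the post):

```python
from fractions import Fraction

# Toy probability space: Omega = {1,...,6}, a fair die, with P
# assigning mass 1/6 to each outcome.
P = {omega: Fraction(1, 6) for omega in range(1, 7)}

# Simple random variable X = 1*1_{A_1} + 5*1_{A_2}, with disjoint
# events A_1 = {1,2,3} and A_2 = {4,5,6}.
alphas = [1, 5]
events = [{1, 2, 3}, {4, 5, 6}]

def expectation_simple(alphas, events, P):
    """E[X] = sum_i alpha_i * P(A_i) for a simple random variable."""
    return sum(a * sum(P[w] for w in A) for a, A in zip(alphas, events))

print(expectation_simple(alphas, events, P))  # 1*(1/2) + 5*(1/2) = 3
```

Using exact `Fraction` arithmetic keeps the probabilities from accumulating floating-point error, which matters when the \alpha_i and P(A_i) are rationals as here.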

When X is not a simple random variable, we take \mathbb{E}[X] to be the supremum of the expectations of simple random variables less than X.  That is,

\displaystyle \mathbb{E}[X] = \sup \left\{ \mathbb{E}[Z] : 0 \leq Z \leq X \right\}

where each Z is a simple random variable.  When X is not non-negative, we break X into its positive and negative parts, X^{+} and X^{-}, and provided \mathbb{E}[X^{+}] and \mathbb{E}[X^{-}] are not both infinite, we take the expectation to be

\displaystyle \mathbb{E}[X] = \mathbb{E}[X^{+}] - \mathbb{E}[X^{-}].
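The supremum construction can be seen numerically with the standard dyadic approximation.  The example below is my own choice, not from the post: take X(\omega) = \omega on \Omega = [0,1] with the uniform (Lebesgue) probability measure, and let Z_n round X down to the nearest multiple of 2^{-n}, so each Z_n is simple with 0 \leq Z_n \leq X:

```python
# Z_n = floor(2^n X) / 2^n is a simple random variable; its level set
# {Z_n = k/2^n} is the interval [k/2^n, (k+1)/2^n), which has measure
# 2^-n under the uniform measure on [0, 1].
def expectation_dyadic(n):
    """E[Z_n] = sum_k (k/2^n) * P(Z_n = k/2^n)."""
    return sum((k / 2**n) * (1 / 2**n) for k in range(2**n))

for n in (1, 3, 6, 10):
    print(n, expectation_dyadic(n))
# The values increase toward E[X] = 1/2, as the monotone
# convergence of E[Z_n] up to sup{E[Z] : 0 <= Z <= X} predicts.
```

In closed form E[Z_n] = (2^n - 1)/2^{n+1}, so the sequence increases to 1/2 from below, matching the supremum in the definition.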

Notice that our definition of \mathbb{E}[X] is simply the Lebesgue integral of X with respect to the probability measure P over all of \Omega.

\displaystyle \mathbb{E}[X] = \int_{\Omega} X \, dP

As such, useful properties of Lebesgue integrals, such as linearity, and powerful results like the monotone and dominated convergence theorems, hold for expectations.  One slightly less obvious nice property is that if F is the distribution function of X, then we have

\displaystyle \mathbb{E}[X] = \int_{\Omega} X \, dP = \int_{-\infty}^\infty x \, dF(x)

where the integral on the right refers to the Lebesgue-Stieltjes integral using the probability measure obtained from F, or simply the usual Riemann-Stieltjes integral in those cases where it exists.  To see this, first show that the result holds for non-negative simple random variables; then note that any non-negative random variable is the limit of an increasing sequence of non-negative simple random variables, and apply the monotone convergence theorem.  For general random variables, break the function into its positive and negative parts and apply the previous result.
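The identity \mathbb{E}[X] = \int x \, dF(x) can be checked numerically on a concrete distribution.  The choice of Exp(1), with F(x) = 1 - e^{-x} and known mean 1, is my own example; the sum below is a plain left-endpoint Riemann-Stieltjes sum on a fine grid:

```python
import math

def stieltjes_mean(F, a, b, n):
    """Left Riemann-Stieltjes sum sum_i x_i (F(x_{i+1}) - F(x_i))
    approximating int_a^b x dF(x) on n subintervals."""
    h = (b - a) / n
    return sum((a + i * h) * (F(a + (i + 1) * h) - F(a + i * h))
               for i in range(n))

# Distribution function of an exponential(1) random variable;
# truncating at b = 40 loses only the negligible tail mass.
F = lambda x: 1 - math.exp(-x)

print(stieltjes_mean(F, 0.0, 40.0, 200_000))  # close to 1, the mean of Exp(1)
```

Because F is differentiable here, the sum is also approximating \int_0^\infty x e^{-x} \, dx = 1, which is the usual change to a density when one exists.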

