Concepts in Probability Theory: Mathematical Expectation (2007) – Article by G. Stolyarov II

The New Renaissance Hat
G. Stolyarov II
July 17, 2014
Note from the Author: This article was originally published on Associated Content (subsequently, Yahoo! Voices) in 2007.  The article earned over 10,000 page views on Associated Content/Yahoo! Voices, and I seek to preserve it as a valuable resource for readers, subsequent to the imminent closure of Yahoo! Voices. Therefore, this essay is being published directly on The Rational Argumentator for the first time.  ***
~ G. Stolyarov II, July 17, 2014

The idea of expectation is crucial to probability theory and its applications. As one who has successfully passed actuarial Exam P on Probability, I would like to educate the general public about this interesting and useful mathematical concept.

The idea of expectation relies on some set of possible outcomes, each of which has a known probability and a known, quantifiable payoff — which can be positive or negative. Let us presume that we are playing a game called X with possible outcomes A, B, and C on a given turn. Each of these outcomes has a known probability P(A), P(B), and P(C) respectively. Each of the outcomes is associated with set payoffs a, b, and c, respectively. How much can one expect to win on an average turn of playing this game?

This is where the concept of expectation comes in. There is a P(A) probability of getting payoff a, a P(B) probability of getting payoff b, and a P(C) probability of getting payoff c. The expectation for a given turn of game X, E(X), is equal to the sum of the products of the probabilities for each given event and the payoffs for that event. So, in this case,

E(X) = a*P(A) + b*P(B) + c*P(C).

Now let us substitute some numbers to see how this concept could be applied. Let us say that event A has a probability of 0.45 of occurring, and if A occurs, you win $50. B has probability of 0.15 of occurring, and if B occurs, you lose $5. C has a probability of 0.4 of occurring, and if C occurs, you lose $60. Should you play this game? Let us find out.

E(X) = a*P(A) + b*P(B) + c*P(C). Substituting the values given above, we find that E(X) = 50*(0.45) + (-5)*(0.15) + (-60)*(0.40) = -2.25. So, on an average turn of the game, you can expect to lose $2.25.
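The arithmetic above can be checked with a few lines of code. This is a minimal sketch; the variable names payoffs and probs are my own, not part of the article.

```python
# Expected value of one turn of game X, using the payoffs and
# probabilities given in the article.
payoffs = [50, -5, -60]       # payoffs a, b, c
probs = [0.45, 0.15, 0.40]    # P(A), P(B), P(C)

# E(X) is the sum of payoff * probability over all outcomes.
expectation = sum(x * p for x, p in zip(payoffs, probs))
print(expectation)  # about -2.25 (up to floating-point rounding)
```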

Note that this corresponds to none of the three possible outcomes A, B, and C. But it does inform you of the kinds of results that you will approach if you play this game for a large number of turns. The Law of Large Numbers implies that the more times you play such a game, the more likely your average payoff per turn is to approach the expected value E(X). So if you play the game for 5 turns, you can expect to lose 5*2.25 = $11.25, but you will likely experience some deviation from this in the real world. Yet if you play the game for 100 turns, you can expect to lose 100*2.25 = $225, and your real-world outcome will most likely be quite close to this expected value.
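The Law of Large Numbers behavior described above can be seen directly by simulating the game. The simulation below is my own sketch, assuming the probabilities and payoffs stated earlier; the function name play_turn is hypothetical.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def play_turn():
    """One turn of game X: draw an outcome with the stated probabilities."""
    r = random.random()
    if r < 0.45:          # outcome A, probability 0.45
        return 50
    elif r < 0.60:        # outcome B, the next 0.15 of the interval
        return -5
    else:                 # outcome C, the remaining 0.40
        return -60

# As the number of turns grows, the average payoff per turn
# should settle near the expected value of -2.25.
for n in (5, 100, 100_000):
    avg = sum(play_turn() for _ in range(n)) / n
    print(f"{n} turns: average payoff per turn = {avg:.2f}")
```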

In its more general form for some random variable X, the expectation of X, or E(X), is the sum of the products of all the possible outcomes x and their probabilities p(x). In mathematical notation, E(X) = Σ x*p(x), where the sum is taken over all values of x. You can apply this formula to any discrete random variable, i.e., a random variable which assumes only a finite or countably infinite set of particular values.
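As a quick illustration of the general discrete formula, here is the expectation of a fair six-sided die roll; the die example is mine, not the article's.

```python
# E(X) = sum of x * p(x) over all values x, for a fair six-sided die.
# Each face 1..6 has probability 1/6, so E(X) = (1+2+3+4+5+6)/6 = 3.5.
die = {x: 1 / 6 for x in range(1, 7)}
expectation = sum(x * p for x, p in die.items())
print(expectation)  # about 3.5
```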

For a continuous random variable Y, the mathematical expectation is equal to the integral of y*f(y) over the region on which the variable is defined. The function f(y) is called the probability density function of Y; its height over a given domain on a graph indicates the likelihood of the random variable assuming values in that domain.
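The continuous case can be sketched numerically. The example below approximates the integral of y*f(y) for an exponential density f(y) = λ*e^(−λy), whose expectation is known to be 1/λ. The midpoint-rule integrator and the truncation of the integral at y = 50 are my own simplifications for illustration.

```python
import math

lam = 2.0  # rate parameter of the exponential density

def f(y):
    """Probability density of an exponential random variable with rate lam."""
    return lam * math.exp(-lam * y)

def expectation(density, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of y * density(y) over [a, b]."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        y = a + (i + 0.5) * h
        total += y * density(y)
    return total * h

# The exact expectation is 1/lam = 0.5; the tail beyond y = 50 is negligible.
print(expectation(f, 0.0, 50.0))  # close to 0.5
```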
