Preliminary Course
Applied Mathematics
Probabilities. Expected value and variance
The mathematical expectation, also called the expected value, of a discrete random variable X taking values x_1, x_2, x_3, ... with probabilities p_1, p_2, p_3, ... is defined as the weighted average of all its possible values, the weights being the respective probabilities, i.e. the infinite sum

E[X] = ∑_{j=1}^{∞} p_j x_j,

provided that this series converges absolutely (a necessary condition for the existence of the expected value of X).
Intuitively, it is the value that one would expect to get on average if the random process could be repeated an infinite number of times.
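As an illustrative aside (not part of the original course text), the weighted-average definition translates directly into a few lines of Python; the function name expected_value and the restriction to a finite list of outcomes are choices made only for this sketch:

def expected_value(values, probs):
    # Weighted average of the outcomes: sum of p_j * x_j.
    # An infinite support would have to be truncated here, which is only
    # meaningful when the series converges absolutely.
    return sum(p * x for p, x in zip(probs, values))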
For the dice-roll example, this gives:

E[X] = ∑_{j=1}^{6} (1/6) · j = 3.5
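Sticking with this illustrative Python sketch, the value 3.5 can be checked by translating the sum above literally (the expected_value function defined earlier gives the same result):

# Expected value of a fair six-sided die: sum of (1/6) * j for j = 1..6.
print(sum((1 / 6) * j for j in range(1, 7)))  # 3.5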
Based on that definition, one can introduce the notion of variance of a random variable:
Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2
For a discrete random variable, with the same notation as above, this becomes:

Var(X) = ∑_{j=1}^{∞} p_j (x_j - m)^2
       = ∑_{j=1}^{∞} p_j x_j^2 - m^2,

where m is the expected value of X.
Variance measures how far the values taken by X are spread around its expected value.
The standard deviation is the square root of the variance.
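As a further illustrative sketch (again an addition to the course text, with hypothetical function names), both expressions for the variance can be written side by side in Python and checked against each other:

def variance(values, probs):
    # m = E[X], the weighted average of the outcomes.
    m = sum(p * x for p, x in zip(probs, values))
    # Definition: Var(X) = sum of p_j * (x_j - m)^2.
    return sum(p * (x - m) ** 2 for p, x in zip(probs, values))

def variance_shortcut(values, probs):
    # Equivalent form: Var(X) = E[X^2] - (E[X])^2.
    m = sum(p * x for p, x in zip(probs, values))
    return sum(p * x ** 2 for p, x in zip(probs, values)) - m ** 2

Both functions return the same number for any finite list of outcomes, which is exactly the identity stated above.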
For the dice example, we have:

Var(X) = ∑_{j=1}^{6} (1/6) (j - 3.5)^2 = 35/12 ≈ 2.92
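Translating this sum literally into the same illustrative Python style (the variance functions sketched above return the same number):

import math

# Var(X) = sum of (1/6) * (j - 3.5)^2 for j = 1..6, for a fair die.
var = sum((1 / 6) * (j - 3.5) ** 2 for j in range(1, 7))
print(var)             # 2.9166..., i.e. 35/12
print(math.sqrt(var))  # standard deviation, about 1.71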
Exercise
A variance of zero, Var(X) = 0, indicates that:
Select the right answers and validate your choices