This post covers the mathematical concepts of expectation and linearity of expectation, one of the prerequisite topics for understanding Randomized Algorithms.
Let us consider the following simple problem.
Given a fair die with 6 faces that is thrown n times, find the expected value of the sum of all results.
For example, if n = 2, there are a total of 36 possible outcomes:
(1, 1), (1, 2), ...., (1, 6)
(2, 1), (2, 2), ...., (2, 6)
................
(6, 1), (6, 2), ...., (6, 6)
The expected value of a discrete random variable R is defined as follows. Suppose R can take value r1 with probability p1, value r2 with probability p2, and so on, up to value rk with probability pk. Then the expectation of this random variable R is defined as
E[R] = r1*p1 + r2*p2 + ... + rk*pk
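As a quick illustration, here is a minimal Python sketch (the function name expectation is an illustrative choice, not from the original post) that computes E[R] directly from a list of (value, probability) pairs, using a single fair die throw as the example.

```python
# Expected value of a discrete random variable given its
# (value, probability) pairs: E[R] = r1*p1 + r2*p2 + ... + rk*pk
def expectation(distribution):
    return sum(r * p for r, p in distribution)

# A single fair die: each face 1..6 has probability 1/6
fair_die = [(face, 1/6) for face in range((1), 7)]
print(expectation(fair_die))  # 3.5
```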
Let us calculate expected value for the above example.
Expected value of sum of two dice throws
  = 2*(1/36) + 3*(1/36) + .... + 7*(1/36)
  + 3*(1/36) + 4*(1/36) + .... + 8*(1/36)
  + .........................
  + 7*(1/36) + 8*(1/36) + .... + 12*(1/36)
  = 7
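The same brute-force summation can be verified with a short enumeration of all 36 equally likely outcomes (a minimal sketch, not part of the original post):

```python
# Enumerate all 36 equally likely outcomes of two fair dice and
# average the sums; each outcome has probability 1/36.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))
expected_sum = sum(a + b for a, b in outcomes) / len(outcomes)
print(expected_sum)  # 7.0
```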
The above way of solving the problem becomes difficult as the number of dice throws grows.
If we know about linearity of expectation, then we can quickly solve the above problem for any number of throws.
Linearity of Expectation: Let R1 and R2 be two discrete random variables on some probability space, then
E[R1 + R2] = E[R1] + E[R2]
Using the above formula, we can quickly solve the dice problem.
Expected value of sum of 2 dice throws
  = 2 * (Expected value of one dice throw)
  = 2 * (1/6 + 2/6 + .... + 6/6)
  = 2 * 7/2
  = 7

Expected value of sum for n dice throws = n * 7/2 = 3.5 * n
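A small check (assuming n is kept small enough to enumerate all 6^n outcomes) confirms that the brute-force answer matches the 3.5 * n formula given by linearity of expectation; the function name below is only illustrative.

```python
from itertools import product

def expected_sum_bruteforce(n):
    # Enumerate all 6^n equally likely outcomes of n fair dice
    # and average the sums.
    total, count = 0, 0
    for outcome in product(range(1, 7), repeat=n):
        total += sum(outcome)
        count += 1
    return total / count

for n in range(1, 6):
    # The two printed values agree for every n.
    print(n, expected_sum_bruteforce(n), 3.5 * n)
```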
Some interesting facts about Linearity of Expectation:
1) Linearity of expectation holds for both dependent and independent random variables. On the other hand, the product rule E[R1*R2] = E[R1]*E[R2] is guaranteed to hold only when R1 and R2 are independent (see the sketch after this list).
2) Linearity of expectation holds for any number of random variables on some probability space. Let R1, R2, R3, … Rk be k random variables, then
E[R1 + R2 + R3 + … + Rk] = E[R1] + E[R2] + E[R3] + … + E[Rk]
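To see fact 1 concretely, take R1 to be a fair die throw and R2 = R1, so the two variables are completely dependent. The sketch below (illustrative only, not from the original post) verifies that E[R1 + R2] = E[R1] + E[R2] still holds, while E[R1*R2] differs from E[R1]*E[R2].

```python
from fractions import Fraction

# R1 is a fair die throw; R2 = R1, so R1 and R2 are completely dependent.
faces = range(1, 7)
p = Fraction(1, 6)

E_R1      = sum(p * r for r in faces)        # E[R1] = 7/2
E_sum     = sum(p * (r + r) for r in faces)  # E[R1 + R2] = 7
E_product = sum(p * (r * r) for r in faces)  # E[R1 * R2] = 91/6

print(E_sum == E_R1 + E_R1)      # True: linearity holds even for dependent variables
print(E_product == E_R1 * E_R1)  # False: 91/6 != 49/4, the product rule needs independence
```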
Another example that can be easily solved with linearity of expectation:
Hat-Check Problem: Let there be a group of n men, each of whom has one hat. The hats are redistributed and every man gets a random hat back. What is the expected number of men who get their original hat back?
Solution: Let Ri be a random variable whose value is 1 if the i'th man gets his own hat back and 0 otherwise.
So the expected number of men to get the right hat back is
  = E[R1] + E[R2] + .... + E[Rn]
  = P(R1 = 1) + P(R2 = 1) + .... + P(Rn = 1)   [since each Ri is either 0 or 1, E[Ri] = P(Ri = 1)]
  = 1/n + 1/n + .... + 1/n                     [every man receives a uniformly random hat, so P(Ri = 1) = 1/n]
  = 1
So on average 1 person gets the right hat back.
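A quick Monte Carlo check of the hat-check answer (a simulation sketch; the values n = 10 and 100,000 trials are arbitrary choices) shuffles the hats and counts how many men receive their own:

```python
import random

def fixed_points(n):
    # Redistribute n hats uniformly at random and count the men
    # who get their own hat back (fixed points of the permutation).
    hats = list(range(n))
    random.shuffle(hats)
    return sum(1 for man, hat in enumerate(hats) if man == hat)

n, trials = 10, 100_000
average = sum(fixed_points(n) for _ in range(trials)) / trials
print(average)  # close to 1, matching E[R1] + .... + E[Rn] = n * (1/n) = 1
```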
Exercise:
1) Given a fair coin, what is the expected number of heads when the coin is tossed n times?
2) Balls and Bins: Suppose we have m balls, labeled i = 1, ..., m, and n bins, labeled j = 1, ..., n. Each ball is thrown into one of the bins independently and uniformly at random.
a) What is the expected number of balls in every bin?
b) What is the expected number of empty bins?
3) Coupon Collector: Suppose there are n types of coupons in a lottery and each lot contains one coupon, with each type equally likely (probability 1/n). How many lots have to be bought (in expectation) until we have at least one coupon of each type?
See the following post for a solution to the Coupon Collector problem:
Expected Number of Trials until Success
Linearity of expectation is useful in the analysis of algorithms. For example, the expected time complexity of randomized algorithms like Randomized Quick Sort is evaluated using linearity of expectation (see this for reference).