Markov's Inequality
In this section, we will cover Markov’s Inequality, Chebyshev’s Inequality, and the Chernoff Bound.
Markov’s Inequality:
It gives an upper bound on the probability that a non-negative random variable exceeds a given positive value, in terms of its expected value.
Note: It gives a very loose upper bound.
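In symbols, for a non-negative random variable \(X\) and any \(a > 0\):

\[
P(X \ge a) \;\le\; \frac{E[X]}{a}
\]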
Read more about Expectation
What is the probability that the restaurant will serve more than 200 customers in the next hour?
Hence, by Markov’s Inequality, there is at most a 25% chance of serving more than 200 customers.
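The 25% figure is consistent with the restaurant serving 50 customers per hour on average, since \(50/200 = 0.25\); the mean is not stated above, so this is an inferred assumption. A quick simulation sketch, additionally assuming Poisson-distributed arrivals, shows how loose the bound is in practice:

```python
import numpy as np

rng = np.random.default_rng(0)
mean_customers = 50  # assumed hourly mean, implied by the stated 25% bound
samples = rng.poisson(mean_customers, size=100_000)  # assumed Poisson arrivals

markov_bound = mean_customers / 200       # E[X] / a = 50 / 200 = 0.25
empirical = (samples > 200).mean()        # observed frequency of X > 200

print(f"Markov upper bound: {markov_bound:.2f}")
print(f"Empirical frequency: {empirical:.5f}")
```

For a Poisson(50) variable, exceeding 200 is astronomically unlikely, so the empirical frequency is effectively zero, while Markov’s Inequality only guarantees it is below 25%.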
What is the probability that a randomly selected student gets a score of 90 marks or more?
Hence, there is at most a 78% (approx.) chance that a randomly selected student gets a score of 90 marks or more; Markov’s Inequality gives only an upper bound, not the exact probability.
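The 78% figure is consistent with an average test score of 70 marks (the mean is not stated above, so this is an inferred assumption):

\[
P(X \ge 90) \;\le\; \frac{E[X]}{90} \;=\; \frac{70}{90} \;\approx\; 0.78
\]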
Chebyshev’s Inequality:
It states that a random variable is unlikely to deviate far from its mean if its variance is small: the probability of a deviation of more than \(k\) standard deviations is at most \(1/k^2\).
- It is a more powerful version of Markov’s Inequality, obtained by applying it to the squared deviation \((X - \mu)^2\).
- It uses more information about the distribution: the variance in addition to the expected value (mean).
- It does NOT assume the random variable to be non-negative.
Note: It gives a tighter upper bound than Markov’s Inequality.
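Formally, for a random variable \(X\) with mean \(\mu\) and variance \(\sigma^2\), and any \(k > 0\) (or deviation \(a > 0\)):

\[
P(|X - \mu| \ge k\sigma) \;\le\; \frac{1}{k^2}
\qquad \text{or equivalently} \qquad
P(|X - \mu| \ge a) \;\le\; \frac{\sigma^2}{a^2}
\]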
What is the probability that a randomly selected student gets a score of 90 marks or more?
Given, the standard deviation of the test score \(\sigma\) = 10 marks.
=> Variance = \(\sigma^2\) = 100
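Assuming an average score of \(\mu = 70\) marks (this mean is not stated above, so it is an inferred assumption), a score of 90 or more means deviating from the mean by at least 20 marks, so:

\[
P(X \ge 90) \;\le\; P(|X - 70| \ge 20) \;\le\; \frac{\sigma^2}{20^2} \;=\; \frac{100}{400} \;=\; 0.25
\]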
Hence, for this example, Chebyshev’s Inequality gives a far tighter upper bound (25%) than Markov’s Inequality (approx. 78%).
Chernoff Bound:
It is an upper bound on the probability that a random variable deviates from its expected value.
It’s an exponentially decreasing bound on the “tail” of a random variable’s distribution,
which can be calculated using its moment generating function.
- It is used for sum or average of independent random variables (not necessarily identically distributed).
- It provides exponentially decaying bounds, which are much tighter than Chebyshev’s Inequality’s quadratic (\(1/k^2\)) decay.
- It uses all moments of the distribution, via the moment generating function (MGF), to capture its full shape.
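In its generic form, for a random variable \(X\) with moment generating function \(M_X(t) = E[e^{tX}]\) and any \(a\):

\[
P(X \ge a) \;\le\; \min_{t > 0} \, e^{-ta} \, M_X(t)
\]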
Proof:
Consider the sum \(S_n = X_1 + X_2 + \cdots + X_n\) of \(n\) independent random variables.
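A standard sketch of the argument applies Markov’s Inequality to the non-negative variable \(e^{t S_n}\), where \(S_n = X_1 + \cdots + X_n\) and \(t > 0\):

\[
P(S_n \ge a) \;=\; P\!\left(e^{t S_n} \ge e^{ta}\right) \;\le\; e^{-ta}\, E\!\left[e^{t S_n}\right] \;=\; e^{-ta} \prod_{i=1}^{n} M_{X_i}(t)
\]

The last equality uses independence, which lets the MGF of the sum factor into the product of the individual MGFs; minimizing the right-hand side over \(t > 0\) gives the tightest bound.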
Note: It is used to compute how far the sum of independent random variables deviates from its expected value.
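As a concrete sketch, here is the bound evaluated numerically for the sum of \(n\) independent Bernoulli(\(p\)) trials, whose individual MGF is \(1 - p + p e^t\) (this coin-flip example is an illustration, not taken from the text above):

```python
import math

def chernoff_bound(n, p, a):
    """Upper-bound P(S_n >= a) for S_n = sum of n iid Bernoulli(p) trials,
    by minimizing e^{-ta} * M(t)^n over a grid of t > 0."""
    best = 1.0
    for i in range(1, 5001):
        t = i / 1000.0                        # t ranges over (0, 5]
        mgf = (1 - p + p * math.exp(t)) ** n  # MGF of the sum (independence)
        best = min(best, math.exp(-t * a) * mgf)
    return best

# Example: probability of at least 75 heads in 100 fair coin flips.
bound = chernoff_bound(100, 0.5, 75)
print(bound)  # on the order of 1e-6
```

For comparison, Chebyshev’s Inequality gives \(P(S_n \ge 75) \le \sigma^2 / 25^2 = 25/625 = 0.04\) here, which illustrates the exponential-versus-quadratic gap.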
Read more about Moment Generating Function
End of Section