Convergence of Random Variables
In this section, we cover the convergence of random variables.
We will discuss only the following two types of convergence:
- Convergence in Probability
- Almost Sure Convergence
📘
Convergence in Probability:
A sequence of random variables \(X_1, X_2, \dots, X_n\) is said to converge in probability
to a known random variable \(X\)
if, for every number \(\epsilon > 0\), the following holds:
\[
\lim_{n\rightarrow\infty} P(|X_n - X| > \epsilon) = 0, \quad \forall ~ \epsilon > 0
\]
where,
\(X_n\): the estimator or sample-based random variable.
\(X\): the known, limiting, or target random variable.
\(\epsilon\): the tolerance level or margin of error.
For example:
- Toss a fair coin:
Estimator:
\[
X_n = \begin{cases}
\frac{n}{n+1} & \text{, if Head } \\
\frac{1}{n} & \text{, if Tail} \\
\end{cases}
\]
Known random variable (Bernoulli):
\[
X = \begin{cases}
1 & \text{, if Head } \\
0 & \text{, if Tail} \\
\end{cases}
\]
\[
X_n - X = \begin{cases}
\frac{n}{n+1} - 1 = \frac{-1}{n+1} & \text{, if Head } \\
\frac{1}{n} - 0 = \frac{1}{n} & \text{, if Tail} \\
\end{cases}
\]
\[
|X_n - X| = \begin{cases}
\frac{1}{n+1} & \text{, if Head } \\
\frac{1}{n} & \text{, if Tail} \\
\end{cases}
\]
Say the tolerance level is \(\epsilon = 0.1\).
Then,
\[
\lim_{n\rightarrow\infty} P(|X_n - X| > \epsilon) = ?
\]
If \(n = 5\):
\[
|X_n - X| = \begin{cases}
\frac{1}{n+1} = \frac{1}{6} \approx 0.16 & \text{, if Head } \\
\frac{1}{n} = \frac{1}{5} = 0.2 & \text{, if Tail} \\
\end{cases}
\]
So, if \(n = 5\), then \(|X_n - X| > (\epsilon = 0.1)\) for both outcomes.
=> \(P(|X_n - X| > (\epsilon = 0.1)) = 1\).
If \(n = 20\):
\[
|X_n - X| = \begin{cases}
\frac{1}{n+1} = \frac{1}{21} \approx 0.04 & \text{, if Head } \\
\frac{1}{n} = \frac{1}{20} = 0.05 & \text{, if Tail} \\
\end{cases}
\]
So, if \(n = 20\), then \(|X_n - X| < (\epsilon = 0.1)\) for both outcomes.
=> \(P(|X_n - X| > (\epsilon = 0.1)) = 0\).
=> \(P(|X_n - X| > (\epsilon = 0.1)) = 0 ~\forall ~ n \ge 10\), since \(\frac{1}{n} \le 0.1\) whenever \(n \ge 10\).
Therefore,
\[
\lim_{n\rightarrow\infty} P(|X_n - X| > (\epsilon = 0.1)) = 0
\]
Similarly, we can show that if \(\epsilon = 0.01\), then the probability equals 0 for all \(n \ge 100\):
\[
\lim_{n\rightarrow\infty} P(|X_n - X| > (\epsilon = 0.01)) = 0
\]
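As a sanity check on the arithmetic above, here is a minimal Python sketch (the helper `abs_diff` is ours, purely for illustration) that reproduces the \(n = 5\) and \(n = 20\) values and finds the smallest \(n\) beyond which \(|X_n - X|\) can no longer exceed a given \(\epsilon\):
```python
def abs_diff(n: int) -> dict:
    """|X_n - X| for both coin outcomes: 1/(n+1) on Head, 1/n on Tail."""
    return {"Head": 1 / (n + 1), "Tail": 1 / n}

print(abs_diff(5))   # {'Head': 0.1666..., 'Tail': 0.2}  -> both exceed 0.1
print(abs_diff(20))  # {'Head': 0.0476..., 'Tail': 0.05} -> both below 0.1

for eps in (0.1, 0.01):
    n_star = 1
    while max(abs_diff(n_star).values()) > eps:  # worst case is 1/n
        n_star += 1
    print(f"eps={eps}: |X_n - X| <= eps for all n >= {n_star}")
    # prints n >= 10 for eps=0.1 and n >= 100 for eps=0.01
```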
Note: the task is to check whether the sequence of random variables \(X_1, X_2, \dots, X_n\) converges in probability
to a known random variable \(X\) as \(n \rightarrow \infty\).
So we can conclude that if \(n > \frac{1}{\epsilon}\), then \(|X_n - X| \le \frac{1}{n} < \epsilon\), and hence:
\[
\lim_{n\rightarrow\infty} P(|X_n - X| > \epsilon) = 0, \quad \forall ~ \epsilon > 0 \\[10pt]
\text{Converges in Probability} \\[10pt]
=> X_n \xrightarrow{\text{Probability}} X
\]
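The same conclusion can be checked by simulation. A sketch, assuming NumPy is available (all variable names here are ours): for each \(n\), toss the coin many times, form \(X_n\) and \(X\), and estimate \(P(|X_n - X| > \epsilon)\) as the fraction of tosses that exceed the tolerance:
```python
import numpy as np

rng = np.random.default_rng(0)
eps, trials = 0.1, 10_000

for n in (5, 9, 10, 50):
    heads = rng.random(trials) < 0.5           # fair-coin tosses
    x_n = np.where(heads, n / (n + 1), 1 / n)  # estimator X_n
    x = np.where(heads, 1.0, 0.0)              # target X (Bernoulli)
    p = np.mean(np.abs(x_n - x) > eps)         # estimate P(|X_n - X| > eps)
    print(f"n = {n:2d}: P(|X_n - X| > {eps}) ~= {p}")

# Expected: 1.0 at n = 5 (both branches exceed eps), about 0.5 at
# n = 9 (only the Tail branch, 1/9 > 0.1), and 0.0 for n >= 10.
```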
📘
Almost Sure Convergence:
A sequence of random variables \(X_1, X_2, \dots, X_n\) is said to converge almost surely to a known random variable \(X\)
if the following is true:
\[
P(\lim_{n\rightarrow\infty} X_n = X) = 1 \\[10pt]
\text{Almost Sure or With Probability 1} \\[10pt]
=> X_n \xrightarrow{\text{Almost Sure}} X
\]
where,
\(X_n\): the estimator or sample-based random variable.
\(X\): the known, limiting, or target random variable.
If \(X_n \xrightarrow{\text{Almost Sure}} X\), then \(X_n \xrightarrow{\text{Probability}} X\).
But the converse is NOT true.
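A standard counterexample (not worked in this section; sketched here for illustration) takes independent \(X_n\) with \(P(X_n = 1) = \frac{1}{n}\) and \(P(X_n = 0) = 1 - \frac{1}{n}\). Then \(P(|X_n - 0| > \epsilon) = \frac{1}{n} \rightarrow 0\), so \(X_n \xrightarrow{\text{Probability}} 0\); but by the second Borel-Cantelli lemma, \(X_n = 1\) infinitely often with probability 1, so \(X_n\) does not converge to 0 almost surely. A quick simulation of one sample path:
```python
import numpy as np

# One sample path of independent X_n with P(X_n = 1) = 1/n.
rng = np.random.default_rng(1)
N = 100_000
path = rng.random(N) < 1 / np.arange(1, N + 1)

ones = np.flatnonzero(path) + 1  # 1-based indices n where X_n = 1
print(f"number of 1s: {ones.size}, latest: n = {ones[-1]}")
# Marginally, P(X_n = 1) = 1/n -> 0 (convergence in probability),
# yet on a typical path 1s keep appearing at ever larger n, so the
# path never settles at 0 (no almost sure convergence).
```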
Note: Almost sure convergence is the hardest to satisfy among all types of convergence, such as convergence in probability,
convergence in distribution, etc.
For example:
- Let \(X\) be a constant random variable, \(X = \frac{1}{2}\), i.e. \(X_1 = X_2 = \dots = X_n = \frac{1}{2}\).
\(Y_1, Y_2, \dots, Y_n\) is another sequence of random variables, defined as the running averages:
\[
Y_1 = X_1 \\[10pt]
Y_2 = \frac{X_1 + X_2}{2} \\[10pt]
Y_3 = \frac{X_1 + X_2 + X_3}{3} \\[10pt]
\dots \\
Y_n = \frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{\text{Almost Sure}} \frac{1}{2}
\]
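In the example above each \(Y_n\) equals \(\frac{1}{2}\) exactly, so the convergence is immediate. A more interesting sketch (our variation, not from the text) takes the \(X_i\) as i.i.d. Bernoulli\((\frac{1}{2})\) fair-coin tosses, where the strong law of large numbers gives the same almost sure limit:
```python
import numpy as np

# Running averages Y_n = (X_1 + ... + X_n) / n along one sample
# path of i.i.d. Bernoulli(1/2) coin tosses.
rng = np.random.default_rng(42)
N = 100_000
x = rng.integers(0, 2, size=N)          # X_i in {0, 1}
y = np.cumsum(x) / np.arange(1, N + 1)  # Y_n for n = 1..N

for n in (10, 100, 10_000, 100_000):
    print(f"Y_{n} = {y[n - 1]:.4f}")    # drifts toward 0.5

# By the strong law of large numbers, Y_n -> 1/2 on almost every
# sample path, i.e. Y_n converges to 1/2 almost surely.
```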
End of Section