7.2.4 Convergence in Distribution

Convergence in distribution is in some sense the weakest type of convergence. All it says is that the CDFs of the $X_n$'s converge to the CDF of $X$ as $n$ goes to infinity. It does not require any dependence between the $X_n$'s and $X$. We saw this type of convergence before when we discussed the central limit theorem. To say that $X_n$ converges in distribution to $X$, we write

\begin{align} X_n \ \xrightarrow{d}\ X. \end{align}

Here is a formal definition of convergence in distribution:

Convergence in Distribution

A sequence of random variables $X_1$, $X_2$, $X_3$, $\cdots$ converges in distribution to a random variable $X$, shown by $X_n \ \xrightarrow{d}\ X$, if \begin{align} \lim_{n \rightarrow \infty} F_{X_n}(x)=F_X(x), \end{align} for all $x$ at which $F_X(x)$ is continuous.

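To see why the definition only asks for convergence at continuity points of $F_X(x)$, consider a simple illustration (not from the text above): let $X_n=\frac{1}{n}$ and $X=0$ be constant random variables. Then $F_{X_n}(0)=0$ for every $n$ while $F_X(0)=1$, so the CDFs do not converge at $x=0$; but $x=0$ is a discontinuity point of $F_X(x)$, and convergence holds at every other $x$, so $X_n \ \xrightarrow{d}\ X$. The short Python sketch below (illustrative only; the function names are ours) evaluates the two CDFs on a small grid.

```python
import numpy as np

# Illustrative sketch (example and names are ours, not from the text): X_n is the constant 1/n
# and X is the constant 0, so F_{X_n}(x) = 1{x >= 1/n} and F_X(x) = 1{x >= 0}.
def F_Xn(x, n):
    return (x >= 1.0 / n).astype(float)

def F_X(x):
    return (x >= 0.0).astype(float)

xs = np.array([-1.0, -0.1, 0.0, 0.1, 1.0])
for n in [1, 10, 100, 10000]:
    print(f"n = {n:6d}   F_Xn(xs) = {F_Xn(xs, n)}")
print(f"target     F_X(xs)  = {F_X(xs)}")
# F_{X_n}(0) = 0 for every n while F_X(0) = 1, but x = 0 is a discontinuity point of F_X;
# at every other x the values converge, so X_n -> X in distribution.
```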

Example
If $X_1$, $X_2$, $X_3$, $\cdots$ is a sequence of i.i.d. random variables with common CDF $F_X(x)$, and $X$ is any random variable with that same CDF, then $X_n \ \xrightarrow{d}\ X$. This is because \begin{align} F_{X_n}(x)=F_X(x), \qquad \textrm{ for all }x. \end{align} Therefore, \begin{align} \lim_{n \rightarrow \infty} F_{X_n}(x)=F_X(x), \qquad \textrm{ for all }x. \end{align}

Example
Let $X_2$, $X_3$, $X_4$, $\cdots$ be a sequence of random variables such that \begin{equation} \nonumber F_{X_n}(x) = \left\{ \begin{array}{l l} 1-\left(1-\frac{1}{n}\right)^{nx} & \quad x > 0\\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} Show that $X_n$ converges in distribution to $Exponential(1)$.
  • Solution
    • Let $X \sim Exponential(1)$. For $x \leq 0$, we have \begin{align} F_{X_n}(x)=F_X(x)=0, \qquad \textrm{ for }n=2,3,4,\cdots. \end{align} For $x > 0$, we have \begin{align} \lim_{n \rightarrow \infty} F_{X_n}(x) &= \lim_{n \rightarrow \infty} \left( 1-\left(1-\frac{1}{n}\right)^{nx} \right)\\ &=1-\lim_{n \rightarrow \infty} \left(1-\frac{1}{n}\right)^{nx}\\ &=1-e^{-x} \hspace{30pt} \left(\textrm{since } \lim_{n \rightarrow \infty} \left(1-\frac{1}{n}\right)^{n}=e^{-1}\right)\\ &=F_X(x). \end{align} Thus, $\lim_{n \rightarrow \infty} F_{X_n}(x)=F_X(x)$ for all $x$, and we conclude that $X_n \ \xrightarrow{d}\ X$.

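As a quick numerical check of this example, the sketch below (illustrative only; the function names are ours) evaluates $F_{X_n}(x)=1-\left(1-\frac{1}{n}\right)^{nx}$ on a grid of positive $x$ values and measures its largest gap from $F_X(x)=1-e^{-x}$ as $n$ grows.

```python
import numpy as np

# Numerical check of the limit in the example above (a sketch, not part of the text):
# F_{X_n}(x) = 1 - (1 - 1/n)^{n x} for x > 0 should approach F_X(x) = 1 - e^{-x}.
def F_Xn(x, n):
    return 1.0 - (1.0 - 1.0 / n) ** (n * x)

def F_X(x):
    return 1.0 - np.exp(-x)

xs = np.linspace(0.1, 5.0, 50)
for n in [2, 10, 100, 10000]:
    gap = np.max(np.abs(F_Xn(xs, n) - F_X(xs)))
    print(f"n = {n:6d}   max |F_Xn - F_X| on grid = {gap:.5f}")
# The maximum gap on the grid shrinks toward 0 as n grows, consistent with
# X_n converging in distribution to Exponential(1).
```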

When working with integer-valued random variables, the following theorem is often useful.


Theorem 7.1 Consider the sequence $X_1$, $X_2$, $X_3$, $\cdots$ and the random variable $X$. Assume that $X$ and $X_n$ (for all $n$) are non-negative and integer-valued, i.e., \begin{align} &R_X \subset \{0,1,2,\cdots\}, \\ &R_{X_n} \subset \{0,1,2,\cdots\}, \qquad \textrm{ for }n=1,2,3,\cdots. \end{align} Then $X_n \ \xrightarrow{d}\ X$ if and only if \begin{align} \lim_{n \rightarrow \infty} P_{X_n}(k)=P_X(k), \qquad \textrm{ for }k=0,1,2,\cdots. \end{align}
  • Proof
    • Since $X$ is integer-valued, its CDF, $F_X(x)$, is continuous at all $x \in \mathbb{R}-\{0,1,2,\cdots\}$. If $X_n \ \xrightarrow{d}\ X$, then \begin{align} \lim_{n \rightarrow \infty} F_{X_n}(x)=F_X(x), \qquad \textrm{ for all }x \in \mathbb{R}-\{0,1,2,\cdots\}. \end{align} Thus, for $k=0,1,2,\cdots$, we have \begin{align} \lim_{n \rightarrow \infty} P_{X_n}(k)&=\lim_{n \rightarrow \infty} \left[F_{X_n}\left(k+\frac{1}{2}\right)- F_{X_n}\left(k-\frac{1}{2}\right)\right] \hspace{25pt} \textrm{ ($X_n$'s are integer-valued)}\\ &= \lim_{n \rightarrow \infty} F_{X_n}\left(k+\frac{1}{2}\right)- \lim_{n \rightarrow \infty} F_{X_n}\left(k-\frac{1}{2}\right)\\ &=F_{X}\left(k+\frac{1}{2}\right)- F_{X}\left(k-\frac{1}{2}\right) \hspace{30pt} \textrm{ (since $X_n \ \xrightarrow{d}\ X$)} \\ &=P_X(k) \hspace{30pt} \textrm{ (since $X$ is integer-valued).} \end{align} To prove the converse, assume that we know \begin{align} \lim_{n \rightarrow \infty} P_{X_n}(k)=P_X(k), \qquad \textrm{ for }k=0,1,2,\cdots. \end{align} Since $X_n$ and $X$ are non-negative, both $F_{X_n}(x)$ and $F_X(x)$ are $0$ for $x<0$, so it suffices to consider $x \geq 0$. For such $x$, we have \begin{align} \lim_{n \rightarrow \infty} F_{X_n}(x)&= \lim_{n \rightarrow \infty} P(X_n \leq x)\\ &=\lim_{n \rightarrow \infty} \sum_{k=0}^{\lfloor x\rfloor} P_{X_n}(k), \end{align} where $\lfloor x\rfloor$ denotes the largest integer less than or equal to $x$. Since for any fixed $x$, the set $\{0,1, \cdots, \lfloor x\rfloor\}$ is a finite set, we can change the order of the limit and the sum, so we obtain \begin{align} \lim_{n \rightarrow \infty} F_{X_n}(x)&= \sum_{k=0}^{\lfloor x\rfloor} \lim_{n \rightarrow \infty} P_{X_n}(k)\\ &= \sum_{k=0}^{\lfloor x\rfloor} P_{X}(k) \qquad \textrm{ (by assumption)}\\ &=P(X \leq x)=F_X(x). \end{align} In particular, $\lim_{n \rightarrow \infty} F_{X_n}(x)=F_X(x)$ at every continuity point of $F_X(x)$, so $X_n \ \xrightarrow{d}\ X$.


Example
Let $X_1$, $X_2$, $X_3$, $\cdots$ be a sequence of random variables such that \begin{align} X_n \sim Binomial\left(n, \frac{\lambda}{n}\right), \qquad \textrm{ for }n \in \mathbb{N}, \ n> \lambda, \end{align} where $\lambda>0$ is a constant. Show that $X_n$ converges in distribution to $Poisson(\lambda)$.
  • Solution
    • By Theorem 7.1, it suffices to show that \begin{align} \lim_{n \rightarrow \infty} P_{X_n}(k)=P_X(k), \qquad \textrm{ for all }k=0,1,2,\cdots, \end{align} where $X \sim Poisson(\lambda)$. We have \begin{align} \lim_{n \rightarrow \infty} P_{X_n}(k) &= \lim_{n \rightarrow \infty} {n \choose k} \left(\frac{\lambda}{n}\right)^k \left(1-\frac{\lambda}{n}\right)^{n-k} \\ &= \lambda^k \lim_{n \rightarrow \infty} \frac{n!}{k! (n-k)!} \left(\frac{1}{n^k}\right) \left(1-\frac{\lambda}{n}\right)^{n-k}\\ &= \frac{\lambda^k}{k!} \cdot \lim_{n \rightarrow \infty} \left(\left[ \frac{n(n-1)(n-2)\cdots(n-k+1)}{n^k}\right] \left[ \left(1-\frac{\lambda}{n}\right)^{n}\right] \left[\left(1-\frac{\lambda}{n}\right)^{-k}\right]\right). \end{align} Note that for a fixed $k$, we have \begin{align} &\lim_{n \rightarrow \infty} \frac{n(n-1)(n-2)\cdots(n-k+1)}{n^k} =1,\\ &\lim_{n \rightarrow \infty} \left(1-\frac{\lambda}{n}\right)^{-k}=1,\\ &\lim_{n \rightarrow \infty}\left(1-\frac{\lambda}{n}\right)^{n}=e^{-\lambda}. \end{align} Thus, we conclude \begin{equation} \nonumber \lim_{n \rightarrow \infty} P_{X_n}(k)=\frac{e^{-\lambda} \lambda^k}{k!}=P_X(k), \end{equation} which, by Theorem 7.1, shows that $X_n \ \xrightarrow{d}\ Poisson(\lambda)$.

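The criterion of Theorem 7.1 can also be checked numerically for this example. The minimal Python sketch below (illustrative only; names and the choice $\lambda=3$ are ours) compares the $Binomial\left(n, \frac{\lambda}{n}\right)$ PMF with the $Poisson(\lambda)$ PMF at a few values of $k$ as $n$ grows.

```python
from math import comb, exp, factorial

# Numerical check of the example above, using the criterion from Theorem 7.1:
# compare the Binomial(n, lambda/n) PMF with the Poisson(lambda) PMF for k = 0, 1, 2, ...
lam = 3.0   # arbitrary choice of lambda for this sketch

def binom_pmf(k, n):
    p = lam / n
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k):
    return exp(-lam) * lam**k / factorial(k)

for n in [10, 100, 10000]:
    gap = max(abs(binom_pmf(k, n) - poisson_pmf(k)) for k in range(20))
    print(f"n = {n:6d}   max PMF gap over k = 0..19: {gap:.6f}")
# The gap shrinks as n grows, in line with X_n converging in distribution to Poisson(lambda).
```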

We end this section by reminding you that the most famous example of convergence in distribution is the central limit theorem (CLT). The CLT states that the normalized average of i.i.d. random variables $X_1$, $X_2$, $X_3$, $\cdots$ (with finite mean and finite, nonzero variance) converges in distribution to a standard normal random variable.
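As a rough illustration (not part of the original text), the sketch below simulates the normalized average of i.i.d. $Uniform(0,1)$ random variables and compares its empirical CDF with the standard normal CDF $\Phi$ at a few points; the agreement improves as $n$ grows.

```python
import numpy as np
from math import erf, sqrt

# Simulation sketch of the CLT statement above (illustrative only): take i.i.d. Uniform(0,1)
# samples, form the normalized average, and compare its empirical CDF to Phi at a few points.
rng = np.random.default_rng(0)
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)   # mean and standard deviation of Uniform(0,1)

def Phi(x):
    # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

reps = 20000
for n in [5, 50, 500]:
    samples = rng.random((reps, n))
    Z = (samples.mean(axis=1) - mu) / (sigma / np.sqrt(n))   # normalized average
    for x in [-1.0, 0.0, 1.0]:
        emp = np.mean(Z <= x)
        print(f"n = {n:4d}  x = {x:+.1f}   empirical CDF = {emp:.3f}   Phi(x) = {Phi(x):.3f}")
```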



