7.2.5 Convergence in Probability

Convergence in probability is stronger than convergence in distribution. In particular, for a sequence $X_1$, $X_2$, $X_3$, $\cdots$ to converge in probability to a random variable $X$, we must have $P(|X_n-X| \geq \epsilon)$ go to $0$ as $n\rightarrow \infty$, for every $\epsilon > 0$. To say that $X_n$ converges in probability to $X$, we write

\begin{align} X_n \ \xrightarrow{p}\ X . \end{align}

Here is the formal definition of convergence in probability:

Convergence in Probability

A sequence of random variables $X_1$, $X_2$, $X_3$, $\cdots$ converges in probability to a random variable $X$, shown by $X_n \ \xrightarrow{p}\ X$, if \begin{align} \lim_{n \rightarrow \infty} P\big(|X_n-X| \geq \epsilon \big)=0, \qquad \textrm{ for all } \epsilon>0. \end{align}


Example

Let $X_n \sim Exponential(n)$. Show that $X_n \ \xrightarrow{p}\ 0$; that is, the sequence $X_1$, $X_2$, $X_3$, $\cdots$ converges in probability to the zero random variable $X$.

  • Solution
    • We have \begin{align} \lim_{n \rightarrow \infty} P\big(|X_n-0| \geq \epsilon \big) &=\lim_{n \rightarrow \infty} P\big(X_n \geq \epsilon \big) & (\textrm{since } X_n\geq 0)\\ &=\lim_{n \rightarrow \infty} e^{-n\epsilon} & (\textrm{since } X_n \sim Exponential(n))\\ &=0, \qquad \textrm{ for all }\epsilon>0. \end{align}
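
To see this limit concretely, here is a minimal Monte Carlo sketch (not part of the text) that draws from $Exponential(n)$ for several values of $n$ and compares the empirical tail probability with the exact value $e^{-n\epsilon}$; the value $\epsilon=0.1$ is an arbitrary choice:

import numpy as np

# Monte Carlo check: for X_n ~ Exponential(n), estimate P(X_n >= eps)
# and compare with the exact tail probability e^{-n*eps}.
rng = np.random.default_rng(0)
eps, num_samples = 0.1, 200_000   # eps is an arbitrary choice

for n in [1, 10, 50, 100]:
    # NumPy parameterizes the exponential by scale = 1/rate = 1/n
    samples = rng.exponential(scale=1.0 / n, size=num_samples)
    estimate = np.mean(samples >= eps)
    exact = np.exp(-n * eps)
    print(f"n={n:4d}  estimated {estimate:.5f}  exact {exact:.5f}")

The estimates track $e^{-n\epsilon}$ closely and shrink toward $0$ as $n$ grows, matching the limit computed above.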

Example

Let $X$ be a random variable, and $X_n=X+Y_n$, where \begin{align} EY_n=\frac{1}{n}, \qquad \mathrm{Var}(Y_n)=\frac{\sigma^2}{n}, \end{align} where $\sigma>0$ is a constant. Show that $X_n \ \xrightarrow{p}\ X$.

  • Solution
    • First note that by the triangle inequality, for all $a,b \in \mathbb{R}$, we have $|a+b| \leq |a|+|b|$. Choosing $a=Y_n-EY_n$ and $b=EY_n$, we obtain \begin{align} |Y_n| \leq \left|Y_n-EY_n\right|+\frac{1}{n}. \end{align} Now, for any $\epsilon>0$ and any $n$ large enough that $\epsilon-\frac{1}{n}>0$, we have \begin{align} P\big(|X_n-X| \geq \epsilon \big)&=P\big(|Y_n| \geq \epsilon \big)\\ & \leq P\left(\left|Y_n-EY_n\right|+\frac{1}{n} \geq \epsilon \right)\\ & = P\left(\left|Y_n-EY_n\right|\geq \epsilon-\frac{1}{n} \right)\\ & \leq \frac{\mathrm{Var}(Y_n)}{\left(\epsilon-\frac{1}{n} \right)^2} &\textrm{(by Chebyshev's inequality)}\\ &= \frac{\sigma^2}{n \left(\epsilon-\frac{1}{n} \right)^2}\rightarrow 0 \qquad \textrm{ as } n\rightarrow \infty. \end{align} Therefore, we conclude $X_n \ \xrightarrow{p}\ X$.
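
The example pins down only the mean and variance of $Y_n$, not its distribution. As an illustration, the following sketch (not from the text) assumes $Y_n \sim N\left(\frac{1}{n}, \frac{\sigma^2}{n}\right)$, one convenient choice with the stated moments, and compares the empirical value of $P(|Y_n| \geq \epsilon)$ with the Chebyshev bound from the solution:

import numpy as np

# Assumed model: Y_n ~ N(1/n, sigma^2/n) has the stated mean and variance.
# Estimate P(|X_n - X| >= eps) = P(|Y_n| >= eps) and compare with the
# Chebyshev bound sigma^2 / (n * (eps - 1/n)^2) used in the solution.
rng = np.random.default_rng(1)
sigma, eps, num_samples = 2.0, 0.5, 200_000

for n in [10, 100, 1000]:
    y = rng.normal(loc=1.0 / n, scale=sigma / np.sqrt(n), size=num_samples)
    estimate = np.mean(np.abs(y) >= eps)
    bound = sigma**2 / (n * (eps - 1.0 / n) ** 2)  # requires eps > 1/n
    print(f"n={n:5d}  estimated {estimate:.5f}  Chebyshev bound {bound:.5f}")

Both the empirical probability and the bound go to $0$; as usual, Chebyshev's inequality is loose, but looseness does not matter for proving the limit.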

As we mentioned previously, convergence in probability is stronger than convergence in distribution. That is, if $X_n \ \xrightarrow{p}\ X$, then $X_n \ \xrightarrow{d}\ X$. The converse is not necessarily true. For example, let $X_1$, $X_2$, $X_3$, $\cdots$ be a sequence of i.i.d. $Bernoulli\left(\frac{1}{2}\right)$ random variables. Let also $X \sim Bernoulli\left(\frac{1}{2}\right)$ be independent of the $X_i$'s. Then $X_n \ \xrightarrow{d}\ X$, since every $X_n$ has the same distribution as $X$. However, $X_n$ does not converge in probability to $X$, since $|X_n-X|$ is in fact also a $Bernoulli\left(\frac{1}{2}\right)$ random variable and

\begin{align} P\big(|X_n-X| \geq \epsilon \big)=\frac{1}{2}, \qquad \textrm{ for } 0<\epsilon<1. \end{align}
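
A short simulation (not from the text) makes the failure visible: however large $n$ is, the estimated probability stays near $\frac{1}{2}$ instead of shrinking to $0$:

import numpy as np

# X and every X_n are independent Bernoulli(1/2), so X_n -> X in distribution,
# yet P(|X_n - X| >= eps) = 1/2 for every n: no convergence in probability.
rng = np.random.default_rng(2)
num_samples, eps = 200_000, 0.5   # any 0 < eps < 1 behaves the same way

x = rng.integers(0, 2, size=num_samples)          # the limiting variable X
for n in [1, 10, 100, 1000]:
    x_n = rng.integers(0, 2, size=num_samples)    # fresh draw, independent of x
    print(f"n={n:4d}  estimated {np.mean(np.abs(x_n - x) >= eps):.5f}")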

A special case in which the converse is true is when $X_n \ \xrightarrow{d}\ c$, where $c$ is a constant. In this case, convergence in distribution implies convergence in probability. We can state the following theorem:

Theorem If $X_n \ \xrightarrow{d}\ c$, where $c$ is a constant, then $X_n \ \xrightarrow{p}\ c$.

  • Proof
    • Since $X_n \ \xrightarrow{d}\ c$, and since the CDF of the constant random variable $c$ equals $0$ for $x<c$ and $1$ for $x \geq c$ (so $c-\epsilon$ and $c+\frac{\epsilon}{2}$ are continuity points), we conclude that for any $\epsilon>0$, we have \begin{align} \lim_{n \rightarrow \infty} F_{X_n}(c-\epsilon)=0,\\ \lim_{n \rightarrow \infty} F_{X_n}\left(c+\frac{\epsilon}{2}\right)=1. \end{align} We can write, for any $\epsilon>0$, \begin{align} \lim_{n \rightarrow \infty} P\big(|X_n-c| \geq \epsilon \big) &= \lim_{n \rightarrow \infty} \bigg[P\big(X_n \leq c-\epsilon \big) + P\big(X_n \geq c+\epsilon \big)\bigg]\\ &=\lim_{n \rightarrow \infty} P\big(X_n \leq c-\epsilon \big) + \lim_{n \rightarrow \infty} P\big(X_n \geq c+\epsilon \big)\\ &=\lim_{n \rightarrow \infty} F_{X_n}(c-\epsilon) + \lim_{n \rightarrow \infty} P\big(X_n \geq c+\epsilon \big)\\ &= 0 + \lim_{n \rightarrow \infty} P\big(X_n \geq c+\epsilon \big) & (\textrm{since } \lim_{n \rightarrow \infty} F_{X_n}(c-\epsilon)=0)\\ &\leq \lim_{n \rightarrow \infty} P\left(X_n > c+\frac{\epsilon}{2} \right)\\ &= 1-\lim_{n \rightarrow \infty} F_{X_n}\left(c+\frac{\epsilon}{2}\right)\\ &=0 & \left(\textrm{since } \lim_{n \rightarrow \infty} F_{X_n}\left(c+\frac{\epsilon}{2}\right)=1\right). \end{align} Since probabilities are nonnegative, we also have $\lim \limits_{n \rightarrow \infty} P\big(|X_n-c| \geq \epsilon \big) \geq 0$, so we conclude that \begin{align} \lim_{n \rightarrow \infty} P\big(|X_n-c| \geq \epsilon \big)= 0, \qquad \textrm{ for all }\epsilon>0, \end{align} which means $X_n \ \xrightarrow{p}\ c$.
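
As a numerical sanity check of the theorem (the choice of distribution below is an assumption, not from the text), take $X_n \sim N\left(c, \frac{1}{n}\right)$, which converges in distribution to the constant $c$; by the theorem the probabilities $P(|X_n - c| \geq \epsilon)$ must vanish, and the estimates do:

import numpy as np

# Assumed sequence: X_n ~ N(c, 1/n) converges in distribution to the
# constant c, so by the theorem it must also converge in probability.
rng = np.random.default_rng(3)
c, eps, num_samples = 2.0, 0.2, 200_000

for n in [1, 10, 100, 1000]:
    x_n = rng.normal(loc=c, scale=1.0 / np.sqrt(n), size=num_samples)
    print(f"n={n:4d}  P(|X_n - c| >= eps) ~ {np.mean(np.abs(x_n - c) >= eps):.5f}")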

The most famous example of convergence in probability is the weak law of large numbers (WLLN). We proved WLLN in Section 7.1.1. The WLLN states that if $X_1$, $X_2$, $X_3$, $\cdots$ are i.i.d. random variables with mean $EX_i=\mu<\infty$, then the average sequence defined by

\begin{align} \overline{X}_n=\frac{X_1+X_2+\cdots+X_n}{n} \end{align}

converges in probability to $\mu$. It is called the "weak" law because it refers to convergence in probability. There is another version of the law of large numbers that is called the strong law of large numbers (SLLN). We will discuss SLLN in Section 7.2.7.
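
For instance (the $Exponential(1)$ terms below are an arbitrary choice, not from the text), the following sketch estimates $P(|\overline{X}_n - \mu| \geq \epsilon)$ over many independent trials and shows it shrinking as $n$ grows, exactly as the WLLN predicts:

import numpy as np

# WLLN illustration with i.i.d. Exponential(1) terms, so mu = 1:
# estimate P(|Xbar_n - mu| >= eps) over many independent trials.
rng = np.random.default_rng(4)
mu, eps, num_trials = 1.0, 0.1, 10_000

for n in [10, 100, 1000]:
    sample_means = rng.exponential(scale=1.0, size=(num_trials, n)).mean(axis=1)
    print(f"n={n:5d}  estimated {np.mean(np.abs(sample_means - mu) >= eps):.5f}")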


Source: https://www.probabilitycourse.com/chapter7/7_2_5_convergence_in_probability.php
