Probability with Martingales by David Williams

\[ \newcommand{\Q}{\mathbb Q} \newcommand{\R}{\mathbb R} \newcommand{\C}{\mathbb C} \newcommand{\Z}{\mathbb Z} \newcommand{\N}{\mathbb N} \newcommand{\abs}[1]{\lvert #1 \rvert} \newcommand{\norm}[1]{\lVert #1 \rVert} \newcommand{\Norm}[1]{\left \lVert #1 \right \rVert} \newcommand{\Abs}[1]{\left \lvert #1 \right \rvert} \newcommand{\pind}{\bot \!\!\! \bot} \newcommand{\probto}{\buildrel P\over \to} \newcommand{\vect}[1]{\boldsymbol #1} \DeclareMathOperator{\EE}{\mathbb E} \DeclareMathOperator{\PP}{\mathbb P} \DeclareMathOperator{\E}{E} \DeclareMathOperator{\dnorm}{\mathcal N} \DeclareMathOperator{\sgn}{sgn} \DeclareMathOperator{\Var}{Var} \DeclareMathOperator{\Cov}{Cov} \DeclareMathOperator{\Leb}{Leb} \DeclareMathOperator{\Bin}{Bin} \newcommand{\wto}{\buildrel w\over \to} \]

Modes of Convergence

Problem A13.1

Modes of convergence.

  • Show that \((X_n\to X, \text{ a.s.}) \Rightarrow (X_n\to X \text{ in prob})\)

  • Prove that \((X_n\to X \text{ in prob}) \not \Rightarrow (X_n \to X, \text{ a.s.})\)

  • Prove that if \(\sum \Pr( \abs{X_n-X}>\epsilon)<\infty, \forall \epsilon>0\) then \(X_n\to X\), a.s.

  • Suppose that \(X_n\to X\) in probability. Prove that a subsequence \((X_{n_k})\) of \((X_n)\) converges \(X_{n_k}\to X\), a.s.

  • Deduce from (a) and (d) that \(X_n\to X\) in probability if and only if every subsequence of \((X_n)\) contains a further subsequence which converges a.s. to \(X\)

  • Suppose there is an event \(E\subset \Omega\) with \(\Pr(E)>0\) such that for some \(\epsilon>0\), \(\abs{X_n-X}>\epsilon\) infinitely often for every \(\omega \in E\). Then for every \(\omega \in E\) it can't be that \(X_n\to X\), since either \(X_n > X+\epsilon\) infinitely often or \(X_n < X-\epsilon\) infinitely often, and hence either the \(\limsup\) or the \(\liminf\) of \(X_n\) differs from \(X\). Thus \(\Pr( X_n\to X)\leq 1-\Pr(E) < 1\). Contrapositively, if \(X_n\to X\) a.s., then for every \(\epsilon>0\) the event \(\{\abs{X_n-X}>\epsilon \text{ i.o.}\}\) is null, and by the reverse Fatou lemma \(\limsup_n \Pr(\abs{X_n-X}>\epsilon) \leq \Pr(\abs{X_n-X}>\epsilon \text{ i.o.}) = 0\), which is exactly convergence in probability.

  • Let \(\Omega = [0,1]\), taking \(\Pr\) as Lebesgue measure on the Borel sets. For \(n\in\N\) and \(0\leq k < n\), let

\[ \begin{equation} X_{n,k}(x) = \begin{cases} 1& \text{if } \frac k n \leq x < \frac{k+1} n\\ 0& \text{otherwise} \end{cases} \end{equation} \]

Then take the \(X_{n,k}\) as a single sequence, ordering the indices lexicographically. That is, take the \(n\)'s in order, and for a given \(n\) take the \(k\)'s in order. Now for \(n\geq N\) and \(\epsilon \in (0,1)\),

\[ \begin{equation} \Pr( \abs{X_{n,k}-0}> \epsilon ) = \Pr( X_{n,k}=1) = \frac 1 n \leq \frac 1 N \end{equation} \]

Thus \(X_{n,k} \to 0\) in probability. On the other hand, \(\limsup X_{n,k}(x) = 1\) for any \(x\in [0,1)\). That's because for any \(n\), letting \(k_n(x) = \lfloor nx \rfloor\), we have \(X_{n,k_n(x)}(x) = 1\), so \(X_{n,k}(x)=1\) infinitely often along the sequence. Therefore it can't be that \(X_{n,k}\to 0\) almost surely (in fact, since for \(n\geq 2\) each block also contains indices \(k\) with \(X_{n,k}(x)=0\), almost surely the sequence doesn't converge at all).
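This construction (often called the "typewriter sequence") can be checked numerically. The sketch below, with made-up helper names, lists the values \(X_{n,k}(x)\) at a fixed point \(x\) and confirms that each block of length \(n\) contains exactly one \(1\), so the sequence keeps returning to \(1\) even though \(\Pr(X_{n,k}=1)=1/n\to 0\):

```python
from fractions import Fraction

def typewriter_sequence(x, n_max):
    """Values X_{n,k}(x) for the indicator of [k/n, (k+1)/n),
    taken in lexicographic order of (n, k)."""
    values = []
    for n in range(1, n_max + 1):
        for k in range(n):
            values.append(1 if Fraction(k, n) <= x < Fraction(k + 1, n) else 0)
    return values

x = Fraction(1, 3)
vals = typewriter_sequence(x, 50)

# Each block n contributes exactly one 1, supported on an interval of
# width 1/n -> 0, so Pr(X_{n,k} = 1) -> 0 ...
ones_per_block = [sum(1 for k in range(n)
                      if Fraction(k, n) <= x < Fraction(k + 1, n))
                  for n in range(1, 51)]
print(all(c == 1 for c in ones_per_block))

# ... yet at the fixed point x the value 1 recurs in every block,
# so X_{n,k}(x) does not converge.
print(vals[-50:].count(1))  # the last block (n = 50) still hits x once
```

Exact rational arithmetic (`Fraction`) avoids any floating-point edge cases at the interval endpoints.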

A second solution is given by considering \(I_{E_n}\) for independent events \(E_n\) with \(\Pr(E_n) = p_n\). Note that for \(0<\epsilon<1\),

\[ \begin{equation} \Pr(\abs{I_{E_m}-I_{E_n}}>\epsilon) = \Pr( E_n\Delta E_m) = (1-p_n)p_m + p_n(1-p_m) \end{equation} \]

by independence. Take \(p_n = 1/n\). Then \(\Pr(\abs{I_{E_n}-0}>\epsilon) = p_n \to 0\), so \(I_{E_n}\to 0\) in probability. But \(\sum p_n = \infty\), so by the second Borel–Cantelli lemma \(I_{E_n}=1\) infinitely often almost surely, and hence \(I_{E_n}\) does not converge to \(0\) almost surely.
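As a sanity check on the displayed symmetric-difference probability, one can enumerate the four joint outcomes of two independent indicators. This is a minimal sketch; the function name is made up:

```python
from itertools import product

def prob_sym_diff(p_n, p_m):
    """Pr(E_n Δ E_m) for independent events with Pr(E_n)=p_n, Pr(E_m)=p_m,
    computed by enumerating the four joint outcomes of the indicators."""
    total = 0.0
    for i_n, i_m in product([0, 1], repeat=2):
        weight = (p_n if i_n else 1 - p_n) * (p_m if i_m else 1 - p_m)
        if i_n != i_m:  # exactly one of the two events occurs
            total += weight
    return total

p_n, p_m = 0.25, 0.1
lhs = prob_sym_diff(p_n, p_m)
rhs = (1 - p_n) * p_m + p_n * (1 - p_m)
print(abs(lhs - rhs) < 1e-12)  # the enumeration matches the formula
```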

  • By Borel–Cantelli, since \(\sum_n \Pr(\abs{X_n-X}>\epsilon)<\infty\), only finitely many of the events \(\{\abs{X_n-X}>\epsilon\}\) occur, almost surely; that is, \(\abs{X_n - X}\leq \epsilon\) eventually, almost surely. By hypothesis this holds for each \(\epsilon= 1/k\) with \(k\in \N\), so take the intersection of these countably many almost sure sets \(E_k\) to get an almost sure set \(E=\bigcap_k E_k\). For samples \(\omega \in E\) and each \(k\in \N\), there exists an \(N_k(\omega) \in \N\) such that if \(n>N_k(\omega)\) then \(\abs{X_n(\omega)-X(\omega)}\leq 1/k\). For arbitrary \(\epsilon>0\), choose \(k\) with \(1/k<\epsilon\) and set \(N_\epsilon(\omega)=N_k(\omega)\) to get the corresponding statement for \(\epsilon\). This is the definition of almost sure convergence: almost surely, for each \(\epsilon>0\) there exists an \(N\) such that \(\abs{X_n-X}<\epsilon\) for \(n>N\).

  • Start with \(m=1\). Since \(X_n\to X\) in probability, we can choose a subsequence \(n_{1,k}\) such that \(\Pr(\abs{X_{n_{1,k}}-X}>1)<2^{-k}\) for \(k\in \N\). This subsequence evidently satisfies \(\sum_k \Pr( \abs{X_{n_{1,k}} - X} > 1) \leq \sum_k 2^{-k} = 1 <\infty\).

Now recursively choose a subsequence \(n_{m,k}\) of \(n_{m-1,k}\) which satisfies \(\Pr(\abs{X_{n_{m,k}}-X}>1/m) < 2^{-k}\). This subsequence satisfies \(\sum_k \Pr( \abs{X_{n_{m,k}} - X} > 1/m) \leq 1 <\infty\). Now consider the diagonal subsequence \(n_m = n_{m,m}\). For any \(m'\in \N\), the diagonal subsequence lies entirely within the subsequence \(n_{m',k}\) except for at most finitely many terms (those with \(m<m'\)). Thus \(\sum_m \Pr( \abs{X_{n_m}-X} > 1/m') < \infty\) for all \(m'\). This shows that the diagonal subsequence satisfies the hypothesis of (c) for each \(\epsilon = 1/m'\), and hence for every \(\epsilon > 0\), so \(X_{n_m} \to X\) almost surely.
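The extraction step can be sketched deterministically: given any sequence \(p_n \to 0\) standing in for \(\Pr(\abs{X_n - X} > \epsilon)\), greedily pick indices with \(p_{n_k} < 2^{-k}\), so the selected probabilities are summable even when \(\sum_n p_n\) diverges. The helper name is hypothetical:

```python
def rapid_subsequence(p, k_max):
    """Given p(n) -> 0, pick indices n_1 < n_2 < ... with p(n_k) < 2^{-k},
    so that sum_k p(n_k) < sum_k 2^{-k} = 1 (the hypothesis of part (c))."""
    indices, n = [], 0
    for k in range(1, k_max + 1):
        n += 1
        while p(n) >= 2.0 ** (-k):
            n += 1
        indices.append(n)
    return indices

p = lambda n: 1.0 / n   # tends to 0, but sum_n p(n) diverges (harmonic series)
idx = rapid_subsequence(p, 15)
print(idx[:3])                       # the first few selected indices
print(sum(p(n) for n in idx) < 1.0)  # selected probabilities are summable
```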

  • If \(X_n\to X\) in probability, then for any \(\epsilon>0\), \(p_n=\Pr(\abs{X_n-X}>\epsilon)\to 0\). Since the sequence \(p_n\) has a limit, any subsequence \(p_{n_k}\) converges to the same limit, \(0\). Thus any subsequence \(X_{n_k}\) also converges in probability to \(X\), and by (d) it has a further subsequence which converges almost surely to \(X\). Conversely, assume the subsequence property, fix \(\epsilon>0\), and let \(p_n=\Pr(\abs{X_n-X}>\epsilon)\); we'll show \(p_n\to 0\). Choose \(\delta >0\) and \(n_0=0\), and recursively choose \(n_k > n_{k-1}\) such that \(p_{n_k} +\delta > \sup_{m>n_{k-1}} p_m\). By the assumed property, \((X_{n_k})\) has a further subsequence \((X_{n_l})\) which converges almost surely to \(X\). By (a), \(X_{n_l}\) converges in probability, so \(p_{n_l} \to 0\). Since \(n_l\) is a subsequence of \(n_k\), it has the property that \(p_{n_l}+\delta > p_m\) for any \(m>n_{l-1}\). Taking the limit \(l\to \infty\) shows \(\limsup_m p_m \leq \delta\). Since \(\delta>0\) is arbitrary, this shows \(p_m\to 0\); and since \(\epsilon\) is arbitrary, \(X_n\to X\) in probability.

Problem A13.2

If \(\xi \sim \dnorm(0,1)\), show \(\E e^{\lambda \xi}= \exp( \frac 1 2 \lambda^2)\). Suppose \(\xi_1,\xi_2,\dots\) are i.i.d. RV's each with a \(\dnorm(0,1)\) distribution. Let \(S_n = \sum_{i=1}^n \xi_i\), let \(a,b\in \R\), and define

\[ \begin{equation} X_n = \exp(a S_n - bn) \end{equation} \]

Show

\[ \begin{equation} (X_n \to 0 \text{ a.s.}) \Leftrightarrow (b>0) \end{equation} \]

but that for \(r\geq 1\)

\[ \begin{equation} (X_n\to 0 \text{ in } \mathcal L^r) \Leftrightarrow (r<2b/a^2) \end{equation} \]

Calculating

\[ \begin{equation} \begin{split} \E e^{\lambda \xi} &= \frac 1 {\sqrt{2\pi}}\int_{\R} e^{\lambda \xi} e^{-\frac 1 2 \xi^2} \, d\xi =\frac 1 {\sqrt{2\pi}} \int_{\R} e^{-\frac 1 2 (\xi - \lambda)^2 + \frac 1 2 \lambda^2} \, d\xi = e^{\frac 1 2 \lambda^2} \frac 1 {\sqrt{2\pi}} \int_{\R} e^{-\frac 1 2 \zeta^2} \, d\zeta \\ &= e^{\frac 1 2 \lambda^2} \end{split} \end{equation} \]

where we performed the substitution \(\zeta = \xi-\lambda\), and the last integral equals \(1\) since it is the total mass of the \(\dnorm(0,1)\) density.
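The identity \(\E e^{\lambda\xi} = e^{\lambda^2/2}\) can be sanity-checked by numerical quadrature. The sketch below uses the trapezoid rule; the function name and integration window are arbitrary choices:

```python
import math

def mgf_standard_normal(lam, lo=-12.0, hi=12.0, steps=100_000):
    """Approximate E[e^{lam * xi}] for xi ~ N(0,1) by the trapezoid rule.
    The integrand decays like exp(-xi^2/2), so [-12, 12] captures
    essentially all of the mass for moderate lam."""
    h = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        x = lo + i * h
        f = math.exp(lam * x - 0.5 * x * x) / math.sqrt(2 * math.pi)
        total += f if 0 < i < steps else 0.5 * f
    return total * h

for lam in (0.0, 0.5, 1.0, 2.0):
    approx = mgf_standard_normal(lam)
    exact = math.exp(0.5 * lam * lam)
    print(lam, abs(approx - exact) < 1e-8)
```

Because the integrand and all its derivatives are negligible at the endpoints, the trapezoid rule is extremely accurate here.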

Suppose \(b>0\). By the SLLN, \(S_n/n \to 0\) as \(n\to \infty\), almost surely. Thus for almost every element \(\omega\) of the sample space there is an \(N(\omega)\) such that \(a\frac {S_n} n-b < -\frac 1 2 b\) for all \(n>N(\omega)\), in which case \(X_n = \exp(n(a\tfrac {S_n} n - b)) < \exp(-\frac 1 2 bn)\to 0\). Now suppose \(b < 0\). Similarly there is an \(N(\omega)\) such that \(a\frac {S_n} n - b > -\frac 1 2 b > 0\) for \(n>N(\omega)\), and therefore \(X_n > \exp(-\frac 1 2 bn) \to \infty\). For \(b=0\) and \(a\neq 0\), the law of the iterated logarithm says that, almost surely, \(\limsup_n S_n/\sqrt{2n\log\log n} = 1\) and \(\liminf_n S_n/\sqrt{2n\log\log n} = -1\) (the latter since \(-S_n\) is also the sum of i.i.d. \(\dnorm(0,1)\) RV's). Therefore, whatever the sign of \(a\), we have \(aS_n > \frac{\abs a}2\sqrt{n\log\log n}\) infinitely often and \(aS_n < -\frac{\abs a}2\sqrt{n\log\log n}\) infinitely often, using the \(\limsup\) for one sign and the \(\liminf\) for the other. Thus the limit of \(X_n = e^{a S_n}\) doesn't exist: the sequence oscillates, almost surely reaching arbitrarily high and arbitrarily low positive values. (If \(a=0\) and \(b=0\) then \(X_n = 1\) for all \(n\).) Hence \(X_n\to 0\) a.s. if and only if \(b>0\).
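A small simulation illustrates the decaying case \(b>0\). The parameters are chosen arbitrarily, and this is a sketch rather than part of the proof:

```python
import math
import random

random.seed(0)

def simulate_X(a, b, n_steps):
    """One sample path of X_n = exp(a*S_n - b*n),
    where S_n is a running sum of i.i.d. N(0,1) draws."""
    s, path = 0.0, []
    for n in range(1, n_steps + 1):
        s += random.gauss(0.0, 1.0)
        path.append(math.exp(a * s - b * n))
    return path

# With b > 0 the drift term -b*n dominates a*S_n, which is only
# O(sqrt(n log log n)) a.s., so the path decays to 0.
decaying = simulate_X(a=0.1, b=0.5, n_steps=2000)
print(decaying[-1] < 1e-100)
```

(The case \(b<0\) is not simulated, since \(X_n\) then overflows floating point almost immediately.)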

On the other hand

\[ \begin{equation} \E X_n^r = \E e^{r(aS_n-bn)} = e^{-rbn} (\E e^{ra \xi})^n = e^{(\frac 1 2 (ra)^2-br)n} \end{equation} \]

If \(r<2b/a^2\) then the exponent \(\frac 1 2 (ra)^2-br\) is negative, so \(\norm{X_n}_r \to 0\); but if \(r\geq 2b/a^2\) this fails. Specifically, if \(r>2b/a^2\) then \(\norm{X_n}_r \to \infty\), and if \(r=2b/a^2\) then \(\norm{X_n}_r = 1\) for all \(n\). Hence \(X_n\to 0\) in \(\mathcal L^r\) if and only if \(r<2b/a^2\).
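The threshold \(r = 2b/a^2\) can be read off directly from the sign of the exponent \(\frac 1 2 (ra)^2 - br\) in the formula above. A small check, with arbitrarily chosen parameters:

```python
def lr_exponent(r, a, b):
    """Exponent c in E[X_n^r] = exp(c * n), from the computation above:
    c = (1/2)(r*a)^2 - b*r. Its sign decides L^r convergence."""
    return 0.5 * (r * a) ** 2 - b * r

a, b = 1.0, 1.0                     # threshold at r = 2b/a^2 = 2
print(lr_exponent(1.0, a, b) < 0)   # r < 2: E[X_n^r] -> 0
print(lr_exponent(2.0, a, b) == 0)  # r = 2: E[X_n^r] = 1 for all n
print(lr_exponent(3.0, a, b) > 0)   # r > 2: E[X_n^r] -> infinity
```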

Contact

For comments or corrections please contact Ryan McCorvie at ryan@martingale.group