Probability with Martingales by David Williams

\[ \newcommand{\Q}{\mathbb Q} \newcommand{\R}{\mathbb R} \newcommand{\C}{\mathbb C} \newcommand{\Z}{\mathbb Z} \newcommand{\N}{\mathbb N} \newcommand{\abs}[1]{\lvert #1 \rvert} \newcommand{\norm}[1]{\lVert #1 \rVert} \newcommand{\Norm}[1]{\left \lVert #1 \right \rVert} \newcommand{\Abs}[1]{\left \lvert #1 \right \rvert} \newcommand{\pind}{\bot \!\!\! \bot} \newcommand{\probto}{\buildrel P\over \to} \newcommand{\vect}[1]{\boldsymbol #1} \DeclareMathOperator{\EE}{\mathbb E} \DeclareMathOperator{\PP}{\mathbb P} \DeclareMathOperator{\E}{E} \DeclareMathOperator{\dnorm}{\mathcal N} \DeclareMathOperator{\sgn}{sgn} \DeclareMathOperator{\Var}{Var} \DeclareMathOperator{\Cov}{Cov} \DeclareMathOperator{\Leb}{Leb} \DeclareMathOperator{\Bin}{Bin} \newcommand{\wto}{\buildrel w\over \to} \]

Modes of Convergence

Problem A13.1
Modes of convergence.
Suppose there is an event \(E\subset \Omega\) with \(\Pr(E)>0\) such that for some \(\epsilon>0\), \(\abs{X_n-X}>\epsilon\) infinitely often for every \(\omega \in E\). Then for \(\omega \in E\) we cannot have \(X_n(\omega)\to X(\omega)\): either \(X_n > X+\epsilon\) infinitely often or \(X_n < X-\epsilon\) infinitely often, so either \(\limsup_n X_n \geq X+\epsilon\) or \(\liminf_n X_n \leq X-\epsilon\), and in either case the limit, if it exists, differs from \(X\). Thus \(\Pr( X_n\to X)\leq 1-\Pr(E) < 1\).
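The estimate above is one direction of the standard characterization of almost sure convergence in terms of "infinitely often" events, which is worth displaying for reference (a restatement, not part of the original problem):

```latex
% a.s. convergence characterized via "infinitely often" events:
% the deviations of size \epsilon occur only finitely often, a.s.
X_n \to X \;\text{a.s.}
\quad\Longleftrightarrow\quad
\Pr\bigl( \abs{X_n - X} > \epsilon \ \text{i.o.} \bigr) = 0
\quad\text{for every } \epsilon > 0
```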
\[ \begin{equation} X_{n,k}(x) = \begin{cases} 1& \text{if } \frac k n \leq x < \frac{k+1} n\\ 0& \text{otherwise} \end{cases} \end{equation} \] for \(n \geq 1\) and \(0 \leq k \leq n-1\), on \(\Omega = [0,1)\) with Lebesgue measure. Take the \(X_{n,k}\) as a single sequence by ordering the indices lexicographically: take the \(n\)'s in order, and for a given \(n\) take the \(k\)'s in order. For any \(\epsilon \in (0,1)\) \[ \begin{equation} \Pr( \abs{X_{n,k}-0}> \epsilon ) = \Pr( X_{n,k}=1) = \frac 1 n \to 0 \end{equation} \] as \(n\to\infty\), so \(X_{n,k} \to 0\) in probability. On the other hand, \(\limsup X_{n,k}(x) = 1\) for every \(x\in [0,1)\). That's because for each \(n\), letting \(k_n(x) = \lfloor n x \rfloor\), we have \(X_{n,k_n(x)}(x) = 1\), so \(X_{n,k}(x)=1\) infinitely often along the sequence. Since also \(X_{n,k}(x)=0\) infinitely often, it can't be that \(X_{n,k}\to 0\) almost surely (in fact, almost surely, the sequence doesn't converge at all). A second solution is given by considering \(I_{E_n}\) for independent events \(E_n\) with \(\Pr(E_n) = p_n\), where \(p_n \to 0\) but \(\sum_n p_n = \infty\) (for example \(p_n = 1/n\)). Then \(\Pr(\abs{I_{E_n}-0}>\epsilon) = p_n \to 0\), so \(I_{E_n} \to 0\) in probability. But by the second Borel–Cantelli lemma, almost surely \(E_n\) occurs infinitely often, so \(I_{E_n}=1\) infinitely often and \(I_{E_n} \not\to 0\) almost surely.
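The typewriter construction above is easy to simulate; the following sketch (with the arbitrary illustrative choice \(x = 0.7\) and truncation at 50 blocks, both mine) checks the two halves of the argument:

```python
# Typewriter sequence on [0, 1): converges to 0 in probability but not a.s.
# Illustrative sketch; the index pairs (n, k) are flattened lexicographically.

def typewriter(n, k, x):
    """Indicator of the interval [k/n, (k+1)/n) evaluated at x."""
    return 1 if k / n <= x < (k + 1) / n else 0

def flat_sequence(N):
    """All (n, k) index pairs with n <= N, in lexicographic order."""
    return [(n, k) for n in range(1, N + 1) for k in range(n)]

x = 0.7                      # arbitrary fixed point of [0, 1)
pairs = flat_sequence(50)
values = [typewriter(n, k, x) for n, k in pairs]

# P(X_{n,k} = 1) = 1/n -> 0, so the sequence -> 0 in probability...
probs = [1 / n for n, _ in pairs]
assert probs[-1] == 1 / 50

# ...yet each block n contains exactly one index k with X_{n,k}(x) = 1,
# so the sequence takes the value 1 infinitely often at the fixed x.
hits_per_block = [sum(typewriter(n, k, x) for k in range(n)) for n in range(1, 51)]
assert all(h == 1 for h in hits_per_block)
```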
Now recursively choose a subsequence \(n_{m,k}\) of \(n_{m-1,k}\) which satisfies \(\Pr(\abs{X_{n_{m,k}}-X}>1/m) < 2^{-k}\). This subsequence satisfies \(\sum_k \Pr( \abs{X_{n_{m,k}} - X} > 1/m) < \sum_k 2^{-k} = 1 <\infty\). Now for any \(m'\in \N\), the diagonal subsequence \(n_m = n_{m,m}\) lies entirely within the subsequence \(n_{m',k}\) except for the at most finitely many terms with \(m<m'\). Thus \(\sum_m \Pr( \abs{X_{n_m}-X} > 1/m') < \infty\) for every \(m'\). This shows that the diagonal subsequence satisfies property (c), and therefore \(X_{n_m} \to X\) almost surely.
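The selection step can be sketched numerically on the typewriter example, whose flattened tail probabilities are \(1/n_j\). The \(2^{-k}\) cutoffs come from the argument above; the truncation at 500 blocks and 8 levels is an arbitrary illustrative choice of mine:

```python
# Sketch: extracting a subsequence with summable tail probabilities
# (the Borel-Cantelli condition) from the typewriter sequence.
# Truncation at 500 blocks / 8 levels is an arbitrary illustrative choice.

def tail_probs(N):
    """Tail probabilities 1/n for the flattened typewriter sequence, n = 1..N."""
    return [1 / n for n in range(1, N + 1) for _ in range(n)]

probs = tail_probs(500)

subseq = []
j = 0
for k in range(1, 9):
    # advance to the first index whose tail probability drops below 2^-k
    while probs[j] >= 2 ** -k:
        j += 1
    subseq.append(j)

chosen = [probs[j] for j in subseq]
total = sum(chosen)
# the selected tail probabilities are dominated by a geometric series,
# so their sum stays bounded no matter how many levels we take
assert total < sum(2 ** -k for k in range(1, 9))
```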
Problem A13.2
If \(\xi \sim \dnorm(0,1)\), show \(\E e^{\lambda \xi}= \exp( \frac 1 2 \lambda^2)\). Suppose \(\xi_1,\xi_2,\dots\) are i.i.d. RV's, each with a \(\dnorm(0,1)\) distribution. Let \(S_n = \sum_{i=1}^n \xi_i\), let \(a,b\in \R\), and define \[ \begin{equation} X_n = \exp(a S_n - bn) \end{equation} \] Show \[ \begin{equation} (X_n \to 0 \text{ a.s.}) \Leftrightarrow (b>0) \end{equation} \] but that for \(r\geq 1\) \[ \begin{equation} (X_n\to 0 \text{ in } \mathcal L^r) \Leftrightarrow (r<2b/a^2) \end{equation} \] Calculating, \[ \begin{equation} \begin{split} \E e^{\lambda \xi} &= \frac 1 {\sqrt{2\pi}}\int_{\R} e^{\lambda \xi} e^{-\frac 1 2 \xi^2} \, d\xi =\frac 1 {\sqrt{2\pi}} \int_{\R} e^{-\frac 1 2 (\xi - \lambda)^2 + \frac 1 2 \lambda^2} \, d\xi = e^{\frac 1 2 \lambda^2} \frac 1 {\sqrt{2\pi}} \int_{\R} e^{-\frac 1 2 \zeta^2} \, d\zeta \\ &= e^{\frac 1 2 \lambda^2} \end{split} \end{equation} \] where we performed the substitution \(\zeta = \xi-\lambda\). Suppose \(b>0\). By the SLLN, \(S_n/n \to 0\) almost surely as \(n\to \infty\). Thus for almost every element \(\omega\) of the sample space there is an \(N(\omega)\) such that \(a\frac {S_n} n - b < -\frac 1 2 b\) for all \(n>N\), in which case \(X_n = \exp\bigl(n(a \tfrac{S_n} n - b)\bigr) < \exp(-\frac 1 2 bn)\to 0\). Now suppose \(b < 0\). Then similarly there is an \(N(\omega)\) such that \(a\frac {S_n} n - b > -\frac 1 2 b > 0\) for all \(n>N\), and therefore \(X_n > \exp(-\frac 1 2 bn) \to \infty\). Finally suppose \(b=0\), so that \(X_n = e^{a S_n}\). If \(a=0\) then \(X_n = 1\) for all \(n\), so assume \(a \neq 0\); by symmetry we may take \(a>0\). The law of the iterated logarithm says that \(\limsup_n S_n/\sqrt{2n\log\log n} = 1\) almost surely, so \(S_n > \sqrt{n \log \log n}\) infinitely often, and hence \(e^{a S_n}\) exceeds any fixed bound infinitely often. Since \(-S_n\) is also the sum of i.i.d. \(\dnorm(0,1)\) RV's, \(e^{a S_n}\) also falls below any fixed positive bound infinitely often. Thus the limit of \(e^{a S_n}\) doesn't exist: the sequence oscillates, reaching arbitrarily high and arbitrarily low positive values, and in particular \(X_n \not\to 0\).
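As a quick numerical sanity check (a sketch only; the parameter choices \(\lambda = 1\), \(a = b = 1\), the sample size, and the seed are arbitrary illustrative choices of mine), one can verify the Gaussian moment generating function by simulation, and inspect the sign of the per-step exponent \(\frac 1 2 r^2 a^2 - br\) that governs the \(\mathcal L^r\) behavior:

```python
# Two illustrative checks for this problem (parameters below are arbitrary):
#  1. E[exp(lambda*xi)] = exp(lambda^2/2) for xi ~ N(0,1), via Monte Carlo
#  2. the sign of the exponent in E[X_n^r] = exp((r^2 a^2/2 - b r) n),
#     which changes at the threshold r = 2b/a^2
import math
import random

random.seed(0)

# 1. Monte Carlo estimate of the moment generating function at lambda = 1
lam = 1.0
n_samples = 200_000
est = sum(math.exp(lam * random.gauss(0.0, 1.0)) for _ in range(n_samples)) / n_samples
exact = math.exp(0.5 * lam ** 2)
assert abs(est - exact) < 0.05   # agrees up to Monte Carlo error

# 2. Per-step exponent of E[X_n^r] for X_n = exp(a*S_n - b*n)
def exponent(r, a, b):
    return 0.5 * (r * a) ** 2 - b * r

a, b = 1.0, 1.0                   # threshold is 2b/a^2 = 2
assert exponent(1.0, a, b) < 0    # r < 2b/a^2: E[X_n^r] -> 0
assert exponent(2.0, a, b) == 0   # r = 2b/a^2: E[X_n^r] = 1 for all n
assert exponent(3.0, a, b) > 0    # r > 2b/a^2: E[X_n^r] -> infinity
```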
On the other hand \[ \begin{equation} \E X_n^r = \E e^{r(aS_n-bn)} = e^{-rbn} (\E e^{ra \xi})^n = e^{(\frac 1 2 (ra)^2-br)n} \end{equation} \] If \(r<2b/a^2\) then the exponent \(\frac 1 2 r^2 a^2 - br\) is negative and \(\norm{X_n}_r \to 0\). If \(r > 2b/a^2\) the exponent is positive and \(\norm{X_n}_r \to \infty\), while if \(r=2b/a^2\) the exponent is zero and \(\norm{X_n}_r = 1\) for all \(n\). Hence \(X_n \to 0\) in \(\mathcal L^r\) if and only if \(r < 2b/a^2\).

Contact

For comments or corrections please contact Ryan McCorvie at ryan@martingale.group