Probability with Martingales by David Williams

\[ \newcommand{\Q}{\mathbb Q} \newcommand{\R}{\mathbb R} \newcommand{\C}{\mathbb C} \newcommand{\Z}{\mathbb Z} \newcommand{\N}{\mathbb N} \newcommand{\abs}[1]{\lvert #1 \rvert} \newcommand{\norm}[1]{\lVert #1 \rVert} \newcommand{\Norm}[1]{\left \lVert #1 \right \rVert} \newcommand{\Abs}[1]{\left \lvert #1 \right \rvert} \newcommand{\pind}{\bot \!\!\! \bot} \newcommand{\probto}{\buildrel P\over \to} \newcommand{\vect}[1]{\boldsymbol #1} \DeclareMathOperator{\EE}{\mathbb E} \DeclareMathOperator{\PP}{\mathbb P} \DeclareMathOperator{\E}{E} \DeclareMathOperator{\dnorm}{\mathcal N} \DeclareMathOperator{\sgn}{sgn} \DeclareMathOperator{\Var}{Var} \DeclareMathOperator{\Cov}{Cov} \DeclareMathOperator{\Leb}{Leb} \DeclareMathOperator{\Bin}{Bin} \newcommand{\wto}{\buildrel w\over \to} \]

Problem 12.1 Branching process

A branching process \(Z=\{Z_n : n\geq 0\}\) is constructed in the usual way. Thus, a family \(\{ X_k^{(n)} : n,k\geq 1\}\) of i.i.d. \(\Z_+\)-valued random variables is given. We define \(Z_0 = 1\) and recursively

\[ \begin{equation} Z_{n+1} = X_1^{(n+1)} + \dots + X_{Z_n}^{(n+1)} \end{equation} \]

Assume that if \(X\) denotes any of the \(X_k^{(n)}\) then

\[ \begin{equation} \mu = \E X <\infty \qquad \text{and} \qquad 0<\sigma^2 = \Var(X) < \infty \end{equation} \]

Prove that \(M_n = Z_n /\mu^n\) defines a martingale \(M\) relative to the filtration \(\mathcal F_n = \sigma(Z_0, Z_1, \dots, Z_n)\). Show that

\[ \begin{equation} \E(Z^2_{n+1} \mid \mathcal F_n ) = \mu^2 Z_n^2 + \sigma^2 Z_n \end{equation} \]

and deduce that \(M\) is bounded in \(L^2\) iff \(\mu>1\). Show that when \(\mu>1\)

\[ \begin{equation} \Var(M_\infty) = \frac{\sigma^2} {\mu (\mu-1)} \end{equation} \]

Note that since \(\E(X_k^{(n+1)} \mid \mathcal F_n ) = \E X_k^{(n+1)} = \mu\) and since \(Z_n\) is \(\mathcal F_n\)-measurable

\[ \begin{equation} \E(Z_{n+1} \mid \mathcal F_n ) = \sum_{k=1}^{Z_n} \E(X_k^{(n+1)}\mid \mathcal F_n) = Z_n \mu \end{equation} \]

From this it immediately follows that \(M\) is a martingale:

\[ \begin{equation} \E( M_{n+1} \mid \mathcal F_n ) = \E( Z_{n+1}/\mu^{n+1} \mid \mathcal F_n ) = \frac{Z_n \mu}{\mu^{n+1}} = Z_n / \mu^n = M_n \end{equation} \]

Similarly, since the \(X_k^{(n+1)}\) are independent of \(\mathcal F_n\) and of one another, with \(\Var( X_k^{(n+1)} \mid \mathcal F_n ) = \sigma^2\),

\[ \begin{equation} \Var( Z_{n+1} \mid \mathcal F_n ) = \sum_{k=1}^{Z_n} \Var(X_k^{(n+1)}\mid \mathcal F_n) = Z_n \sigma^2 \end{equation} \]

But by definition, \(\Var(Z_{n+1} \mid \mathcal F_n ) = \E( Z^2_{n+1} \mid \mathcal F_n ) - (\E( Z_{n+1} \mid \mathcal F_n) )^2\), so

\[ \begin{equation} \E( Z^2_{n+1} \mid \mathcal F_n ) = (\E( Z_{n+1} \mid \mathcal F_n) )^2 + Z_n \sigma^2 = Z_n^2 \mu^2 + Z_n \sigma^2 \end{equation} \]
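As a quick numerical sanity check of this formula (illustrative only, not part of the proof), the sketch below conditions on a fixed value of \(Z_n\) and estimates the first two conditional moments of \(Z_{n+1}\) by Monte Carlo; the Poisson offspring law and the specific numbers are assumptions made just for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 1.5        # Poisson offspring (assumed for illustration): mu = lam, sigma^2 = lam
mu, sigma2 = lam, lam
z_n = 7          # condition on Z_n = 7
trials = 200_000

# Z_{n+1} is the sum of Z_n i.i.d. offspring counts
z_next = rng.poisson(lam, size=(trials, z_n)).sum(axis=1)

print(z_next.mean(), mu * z_n)                                   # ~ mu * Z_n
print((z_next ** 2).mean(), mu ** 2 * z_n ** 2 + sigma2 * z_n)   # ~ mu^2 Z_n^2 + sigma^2 Z_n
```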

Dividing the formula for \(\E( Z^2_{n+1} \mid \mathcal F_n )\) by \(\mu^{2(n+1)}\) gives

\[ \begin{equation} \E( M^2_{n+1} \mid \mathcal F_n ) = \frac{\mu^2 Z_n^2}{\mu^{2n+2}} + \frac{Z_n \sigma^2}{\mu^{2n+2}} = M^2_n + M_n \frac{\sigma^2} {\mu^{n+2}} \end{equation} \]

Thus, taking expectations and using \(\E M_n = \E M_0 = 1\),

\[ \begin{equation} b_{n+1} = \E(M_{n+1}^2 - M_n^2) = \frac {\sigma^2}{\mu^{n+2}} \E M_n = \frac {\sigma^2}{\mu^{n+2}} \end{equation} \]

Note that the \(b_n\) form a geometric sequence, so \(\sum_n b_n < \infty\) iff \(\mu > 1\). By the theorem in section 12.1, \(M\) is bounded in \(L^2\) iff \( \sum_n b_n < \infty\), and in that case it converges almost surely and in \(L^2\) to some random variable \(M_\infty\) with \(\E M_\infty^2 = M^2_0 + \sum_n b_n\). When \(\mu>1\) we can sum the series

\[ \begin{equation} \Var(M_\infty)= \E M_\infty^2 - (\E M_\infty)^2 = \frac {\sigma^2/\mu^2} {1-\mu^{-1}} = \frac{\sigma^2}{\mu(\mu-1)} \end{equation} \]

since \(M\) is bounded in \(L^2\), hence uniformly integrable, so \(\E M_\infty = \E M_0 = M_0 = 1\).
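To see the limit numerically, here is a minimal simulation sketch. It assumes Poisson(\(\lambda\)) offspring with \(\lambda = 2\), so \(\mu = \sigma^2 = 2\) and the predicted variance is \(\sigma^2/(\mu(\mu-1)) = 1\); it uses \(\Var(M_n)\) for a moderately large \(n\) as a proxy for \(\Var(M_\infty)\), which is justified since \(\Var(M_n) \to \Var(M_\infty)\).

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0                  # Poisson offspring (assumed): mu = 2, sigma^2 = 2
mu, sigma2 = lam, lam
n_gen, runs = 20, 50_000

m_n = np.empty(runs)
for i in range(runs):
    z = 1
    for _ in range(n_gen):
        # for Poisson offspring, the sum of z i.i.d. Poisson(lam) counts is Poisson(lam * z)
        z = rng.poisson(lam * z)
    m_n[i] = z / mu ** n_gen        # M_n = Z_n / mu^n

print(m_n.mean())                               # close to E M_infty = 1
print(m_n.var(), sigma2 / (mu * (mu - 1)))      # close to sigma^2 / (mu (mu - 1)) = 1
```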

Problem 12.2 Use of Kronecker's Lemma

Let \(E_1,E_2,\dots\) be independent events with \(\Pr(E_n) = 1/n\). Let \(Y_k = I_{E_k}\). Prove that \(\sum_{k\geq 2} (Y_k - \frac 1 k)/ \log k\) converges a.s. and use Kronecker's Lemma to deduce that

\[ \begin{equation} \frac{N_n}{\log n} \to 1 \qquad \text{a.s.} \end{equation} \]

where \(N_n = Y_1+\dots+Y_n\)

First note \(\E Y_k = \Pr E_k = \frac 1 k\) and

\[ \begin{equation} \Var Y_k \leq \E Y_k^2 = \E I_{E_k}^2 = \E I_{E_k} = \Pr(E_k) = \frac 1 k \end{equation} \]

Consider, for \(k \geq 2\), the random variable \(Z_k = \frac{Y_k - k^{-1}}{\log k}\). Note that \(\E Z_k = 0\) and

\[ \begin{equation} \sigma^2_k =\Var Z_k = \frac {\Var( Y_k)} {(\log k)^2} \leq \frac 1 {k(\log k)^2} \end{equation} \]

Now \(\int \frac {dx} {x (\log x)^2} = -\frac 1 {\log x}\) so by the integral test, \(\sum_k \sigma^2_k < \infty\). Also the \(Z_k\) are independent, so by theorem 12.2,

\[ \begin{equation} \sum_k Z_k = \sum_k \left(Y_k - \frac 1 k \right) / \log k \end{equation} \]

converges almost surely. By Kronecker's lemma, this means (the \(k=1\) term contributes nothing, since \(\Pr(E_1)=1\) gives \(Y_1 = 1\) a.s.)

\[ \begin{equation} \frac{\sum_{k=1}^n (Y_k - k^{-1})}{\log n} = \frac{ N_n - H_n } {\log n} \to 0 \qquad \text{a.s.} \end{equation} \]

where \(N_n = Y_1 + \dots + Y_n\) and \(H_n = 1 + \frac 1 2 + \dots + \frac 1 n\) is the \(n\)th harmonic number. Now \(H_n - \log n \to \gamma\), where \(\gamma\) is the Euler-Mascheroni constant; in particular \(H_n/\log n \to 1\). Therefore

\[ \begin{equation} \frac{N_n}{\log n} \to 1 \qquad \text{a.s.} \end{equation} \]
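A direct simulation of a single sample path (illustrative only) shows the ratio drifting toward 1, though the convergence is slow: the bias \((H_n - \log n)/\log n\) decays only like \(1/\log n\), and the random fluctuations of \(N_n/\log n\) are of order \(1/\sqrt{\log n}\).

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
k = np.arange(1, n + 1)

# Y_k = indicator of E_k, with the E_k independent and P(E_k) = 1/k
y = rng.random(n) < 1.0 / k
n_n = np.cumsum(y)                      # N_n = Y_1 + ... + Y_n

for m in (10**3, 10**4, 10**5, 10**6):
    print(m, n_n[m - 1] / np.log(m))    # slowly approaches 1
```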

Note that the record events of section 4.3 satisfy the hypotheses of this problem: for \(X_1,X_2,\dots\) drawn i.i.d. from a continuous distribution, let \(E_k\) be the event that there is a ‘‘record’’ at time \(k\). In 4.3 we show that these events are independent and that \(\Pr(E_k) = \frac 1 k\).
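As a quick empirical check of that claim (an illustration, not a proof), the sketch below estimates \(\Pr(E_j)\), \(\Pr(E_k)\) and \(\Pr(E_j \cap E_k)\) for two fixed times from i.i.d. uniform draws; the joint probability should be close to the product, consistent with independence.

```python
import numpy as np

rng = np.random.default_rng(3)
runs, n = 200_000, 20
x = rng.random((runs, n))               # i.i.d. continuous (uniform) draws, one row per run

# E_k = "record at time k": X_k exceeds every earlier value (ties have probability ~0)
running_max = np.maximum.accumulate(x, axis=1)
record = x == running_max               # record[:, k-1] indicates a record at time k

j, k = 4, 9                             # two fixed (1-indexed) times, chosen arbitrarily
p_j = record[:, j - 1].mean()
p_k = record[:, k - 1].mean()
p_jk = (record[:, j - 1] & record[:, k - 1]).mean()
print(p_j, 1 / j)                       # ~ 1/4
print(p_k, 1 / k)                       # ~ 1/9
print(p_jk, p_j * p_k)                  # ~ 1/36, consistent with independence
```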

Problem 12.3 Star Trek 3

Prove that if the strategy in 10.11 is (in the obvious sense) employed – and for ever – in \(\R^3\) rather than in \(\R^2\), then

\[ \begin{equation} \sum R^{-2}_n < \infty \qquad \text{a.s.} \end{equation} \]

where \(R_n\) is the distance from the Enterprise to the Sun at time \(n\).

Let \(E_n\) be the event that the Enterprise returns to the solar system for the first time at time \(n\). The area of a cap on a sphere of radius \(R\) with central angle \(\theta\) is \(2\pi R^2 (1-\cos \theta)\), and therefore the probability that a uniformly directed jump lands in that cap is \(\frac 1 2 (1-\cos \theta)\). The area formula can be seen by integrating the surface area element \(R^2 \sin \theta \, d\theta \, d\phi\), or by using Archimedes' observation that projecting a sphere onto an enclosing cylinder preserves surface area. For the jump strategy described in this problem, the radius of the jump sphere for jump \(n\) is \(R_{n-1}\), the current distance from the Sun. The probability that we enter the solar system corresponds to the cap whose boundary points are at distance \(r\) from the Sun, where \(r\) is the radius of the solar system. The current position, the Sun, and a boundary point of that cap form an isosceles triangle with equal sides \(R_{n-1}\) and base \(r\), the angle opposite the base being \(\theta\). By the law of cosines, \(r^2 = 2R_{n-1}^2 - 2R_{n-1}^2 \cos \theta\), so the probability of landing within distance \(r\) of the Sun is \(\frac 1 2 (1-\cos\theta) = \frac{r^2}{4R_{n-1}^2}\). Let \(T= \inf \{n : R_n \leq r\}\) be the first time the Enterprise returns to the solar system. This implies

\[ \begin{equation} \xi_{k} = \Pr( E_{k} \mid \mathcal F_{k-1}) = \begin{cases} \frac {r^2}{4R_{k-1}^2} & k \leq T \\ 0 & \text{otherwise} \end{cases} \end{equation} \]
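Before continuing, here is a small Monte Carlo check of the single-jump cap probability \(r^2/(4R^2)\); the values of \(r\) and \(R\) are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
R, r = 5.0, 1.0                  # jump radius / distance to the Sun, and solar-system radius
trials = 1_000_000

# uniform random directions on the unit sphere
v = rng.normal(size=(trials, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Enterprise at the origin, Sun at distance R along the x-axis; jump has length R
landing = R * v
dist_to_sun = np.linalg.norm(landing - np.array([R, 0.0, 0.0]), axis=1)

print((dist_to_sun <= r).mean(), r**2 / (4 * R**2))   # both should be close to 0.01
```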

Using the notation of theorem 12.15, \(Z_n = \sum_{k\leq n} I_{E_k} \leq 1\) for all \(n\). Let \(Y_\infty = \sum_{k =1}^\infty \xi_k = \sum_{k=1}^T \frac {r^2}{4R_{k-1}^2}\). If \(Y_\infty = \infty\) with positive probability, then 12.15(b) implies that \(Z_n / Y_n \to 1\) on that event, which forces \(Z_n \to \infty\); but this contradicts the bound \(Z_n \leq 1\). Therefore \(Y_\infty < \infty\) almost surely, i.e. \(\sum_{k\leq T} R_{k-1}^{-2} < \infty\) a.s.

TODO: to clinch the argument I want to take \(r \to 0\) and \(T\to \infty\), but the sum also gets bigger in this limit since it includes the terms where \(R_n\) is really close to 0.
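Whatever the cleanest way to finish the argument, the claim itself is easy to probe numerically. The sketch below (Sun at the origin, an arbitrary starting distance, and a fixed number of jumps, all illustrative assumptions) simulates the jump-your-current-distance strategy in \(\R^3\) and prints partial sums of \(\sum R_n^{-2}\); they stabilize quickly because \(R_n\) typically grows geometrically under this strategy.

```python
import numpy as np

rng = np.random.default_rng(5)
pos = np.array([10.0, 0.0, 0.0])        # start 10 units from the Sun, which sits at the origin
n_jumps = 1000                          # kept modest so R_n does not overflow a float
partial_sum = 0.0

for n in range(1, n_jumps + 1):
    R_prev = np.linalg.norm(pos)        # R_{n-1}, current distance to the Sun
    v = rng.normal(size=3)
    v /= np.linalg.norm(v)              # uniform random direction
    pos = pos + R_prev * v              # jump of length R_{n-1}
    partial_sum += 1.0 / np.linalg.norm(pos) ** 2   # add R_n^{-2}
    if n % 200 == 0:
        print(n, partial_sum)           # partial sums level off
```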

Contact

For comments or corrections please contact Ryan McCorvie at ryan@martingale.group