Probability with Martingales by David Williams

\[ \newcommand{\Q}{\mathbb Q} \newcommand{\R}{\mathbb R} \newcommand{\C}{\mathbb C} \newcommand{\Z}{\mathbb Z} \newcommand{\N}{\mathbb N} \newcommand{\abs}[1]{\lvert #1 \rvert} \newcommand{\norm}[1]{\lVert #1 \rVert} \newcommand{\Norm}[1]{\left \lVert #1 \right \rVert} \newcommand{\Abs}[1]{\left \lvert #1 \right \rvert} \newcommand{\pind}{\bot \!\!\! \bot} \newcommand{\probto}{\buildrel P\over \to} \newcommand{\vect}[1]{\boldsymbol #1} \DeclareMathOperator{\EE}{\mathbb E} \DeclareMathOperator{\PP}{\mathbb P} \DeclareMathOperator{\E}{E} \DeclareMathOperator{\dnorm}{\mathcal N} \DeclareMathOperator{\sgn}{sgn} \DeclareMathOperator{\Var}{Var} \DeclareMathOperator{\Cov}{Cov} \DeclareMathOperator{\Leb}{Leb} \DeclareMathOperator{\Bin}{Bin} \newcommand{\wto}{\buildrel w\over \to} \]

Problem 12.1 Branching process
A branching process \(Z=\{Z_n : n\geq 0\}\) is constructed in the usual way. Thus, a family \(\{ X_k^{(n)} : n,k\geq 0\}\) of i.i.d. \(\Z_+\)-valued random variables is supposed given. We define \(Z_0 = 1\) and recursively \[ \begin{equation} Z_{n+1} = X_1^{(n+1)} + \dots + X_{Z_n}^{(n+1)} \end{equation} \] Assume that if \(X\) denotes any of the \(X_k^{(n)}\) then \[ \begin{equation} \mu = \E X <\infty \qquad \text{and} \qquad 0<\sigma^2 = \Var(X) < \infty \end{equation} \] Prove that \(M_n = Z_n /\mu^n\) defines a martingale \(M\) relative to the filtration \(\mathcal F_n = \sigma(Z_0, Z_1, \dots, Z_n)\). Show that \[ \begin{equation} \E(Z^2_{n+1} \mid \mathcal F_n ) = \mu^2 Z_n^2 + \sigma^2 Z_n \end{equation} \] and deduce that \(M\) is bounded in \(L^2\) iff \(\mu>1\). Show that when \(\mu>1\), \[ \begin{equation} \Var(M_\infty) = \frac{\sigma^2} {\mu (\mu-1)} \end{equation} \]

Note that since \(\E(X_k^{(n+1)} \mid \mathcal F_n ) = \E X_k^{(n+1)} = \mu\) and since \(Z_n\) is \(\mathcal F_n\)-measurable, \[ \begin{equation} \E(Z_{n+1} \mid \mathcal F_n ) = \sum_{k=1}^{Z_n} \E(X_k^{(n+1)}\mid \mathcal F_n) = Z_n \mu \end{equation} \] From this it immediately follows that \(M\) is a martingale: \[ \begin{equation} \E( M_{n+1} \mid \mathcal F_n ) = \E( Z_{n+1}/\mu^{n+1} \mid \mathcal F_n ) = Z_n / \mu^n = M_n \end{equation} \] Similarly, since \(\Var( X_k^{(n+1)} \mid \mathcal F_n ) = \sigma^2\), \[ \begin{equation} \Var( Z_{n+1} \mid \mathcal F_n ) = \sum_{k=1}^{Z_n} \Var(X_k^{(n+1)}\mid \mathcal F_n) = Z_n \sigma^2 \end{equation} \] But by definition, \(\Var(Z_{n+1} \mid \mathcal F_n ) = \E( Z^2_{n+1} \mid \mathcal F_n ) - (\E( Z_{n+1} \mid \mathcal F_n) )^2\), so \[ \begin{equation} \E( Z^2_{n+1} \mid \mathcal F_n ) = (\E( Z_{n+1} \mid \mathcal F_n) )^2 + Z_n \sigma^2 = Z_n^2 \mu^2 + Z_n \sigma^2 \end{equation} \] Dividing by \(\mu^{2n+2}\) gives \[ \begin{equation} \E( M^2_{n+1} \mid \mathcal F_n ) = \frac{\mu^2 Z_n^2}{\mu^{2n+2}} + \frac{Z_n \sigma^2}{\mu^{2n+2}} = M^2_n + M_n \frac{\sigma^2} {\mu^{n+2}} \end{equation} \] Thus, since \(\E M_n = \E M_0 = 1\), \[ \begin{equation} b_{n+1} = \E(M_{n+1}^2 - M_n^2) = \frac {\sigma^2}{\mu^{n+2}} \E M_n = \frac {\sigma^2}{\mu^{n+2}} \end{equation} \] Note the \(b_n\) form a geometric sequence, so \(\sum_n b_n < \infty\) iff \(\mu > 1\). By the theorem in section 12.1, \(M\) is bounded in \(L^2\) iff \( \sum_n b_n < \infty\), and in that case it converges almost surely and in \(L^2\) to some random variable \(M_\infty\) with \(\E M_\infty^2 = \sum_n b_n + M^2_0\). When \(\mu>1\), since \(\E M_\infty = \E M_0 = 1\) by \(L^2\) convergence, we can sum the geometric series: \[ \begin{equation} \Var(M_\infty)= \E M_\infty^2 - (\E M_\infty)^2 = \sum_n b_n = \frac {\sigma^2/\mu^2} {1-\mu^{-1}} = \frac{\sigma^2}{\mu(\mu-1)} \end{equation} \]
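The limit variance is easy to check numerically. Here is a minimal Monte Carlo sketch (an illustration, not from the text) that simulates many independent copies of \(Z\) with Poisson(\(\mu\)) offspring, for which \(\sigma^2 = \mu\), and compares the sample variance of \(M_n\) for moderately large \(n\) against \(\sigma^2/(\mu(\mu-1))\); the offspring law and all parameter values are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: Poisson(mu) offspring, so sigma^2 = mu.
mu = 2.0
sigma2 = mu
n_gen = 20        # generations; Var(M_n) -> Var(M_infinity) as n grows
n_paths = 10_000  # independent copies of the branching process

M = np.empty(n_paths)
for i in range(n_paths):
    z = 1
    for _ in range(n_gen):
        # Z_{n+1} is a sum of Z_n i.i.d. Poisson(mu) counts, i.e. Poisson(mu * Z_n)
        z = rng.poisson(mu * z)
        if z == 0:  # extinction: Z stays at 0, hence M_n = 0
            break
    M[i] = z / mu**n_gen

print("sample Var(M_n):", M.var())
print("predicted      :", sigma2 / (mu * (mu - 1)))  # = 1 for mu = 2
```

With these parameters the two numbers should agree up to Monte Carlo error: summing the \(b_k\) only up to \(n\) gives \(\Var(M_n) = \sigma^2(1-\mu^{-n})/(\mu(\mu-1))\), which is already very close to the limit at \(n=20\).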
Problem 12.2 Use of Kronecker's Lemma

Let \(E_1,E_2,\dots\) be independent events with \(\Pr(E_n) = 1/n\). Let \(Y_k = I_{E_k}\). Prove that \(\sum (Y_k - \frac 1 k)/ \log k\) converges a.s. and use Kronecker's Lemma to deduce that \[ \begin{equation} \frac{N_n}{\log n} \to 1 \qquad \text{a.s.} \end{equation} \] where \(N_n = Y_1+\dots+Y_n\).

First note \(\E Y_k = \Pr(E_k) = \frac 1 k\) and \[ \begin{equation} \Var Y_k \leq \E Y_k^2 = \E I_{E_k}^2 = \E I_{E_k} = \Pr(E_k) = \frac 1 k \end{equation} \] For \(k \geq 2\) consider the random variable \(Z_k = \frac{Y_k - k^{-1}}{\log k}\) (we start at \(k=2\) to avoid dividing by \(\log 1 = 0\); a single term does not affect convergence). Note that \(\E Z_k = 0\) and \[ \begin{equation} \sigma^2_k =\Var Z_k = \frac {\Var( Y_k)} {(\log k)^2} \leq \frac 1 {k(\log k)^2} \end{equation} \] Now \(\int \frac {dx} {x (\log x)^2} = -\frac 1 {\log x} + C\), so by the integral test \(\sum_k \sigma^2_k < \infty\). Also the \(Z_k\) are independent, so by theorem 12.2, \[ \begin{equation} \sum_k Z_k = \sum_k \left(Y_k - \frac 1 k \right) / \log k \end{equation} \] converges almost surely. By Kronecker's lemma (applied with the increasing sequence \(b_n = \log n \uparrow \infty\)), this means \[ \begin{equation} \frac{\sum_{k=1}^n \left( Y_k - k^{-1} \right)}{\log n} = \frac{ N_n - H_n } {\log n} \to 0 \qquad \text{a.s.} \end{equation} \] where \(N_n = Y_1 + \dots + Y_n\) and \(H_n = 1 + \frac 1 2 + \dots + \frac 1 n\) is the \(n\)th harmonic number. Now \(H_n - \log n \to \gamma\), where \(\gamma\) is the Euler-Mascheroni constant; in particular \(H_n/\log n \to 1\). Therefore \[ \begin{equation} \frac{N_n}{\log n} \to 1 \qquad \text{a.s.} \end{equation} \]

Note that the events of section 4.3 satisfy the hypotheses of this problem: for \(X_1,X_2,\dots\) drawn i.i.d. from a continuous distribution, let \(E_k\) be the event that there is a ``record'' at time \(k\). In 4.3 we show these events are independent and that \(\Pr(E_k) = \frac 1 k\). So the number of records by time \(n\) grows like \(\log n\) almost surely.
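As a sanity check, here is a small simulation sketch (an illustration, not part of the text) that draws independent indicators with \(\Pr(E_k) = 1/k\) directly and reports \(N_n/\log n\); equivalently one could count records of i.i.d. uniforms. The convergence is very slow, since the fluctuations of \(N_n\) are of order \(\sqrt{\log n}\), so even for large \(n\) only rough agreement with 1 should be expected.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000_000

# Independent indicators Y_k with P(Y_k = 1) = 1/k for k = 1, ..., n
k = np.arange(1, n + 1)
Y = rng.random(n) < 1.0 / k

N_n = Y.sum()
print("N_n        :", N_n)               # roughly H_n ~ log(n) + 0.5772
print("N_n / log n:", N_n / np.log(n))   # should be near 1
```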
Problem 12.3 Star Trek 3

Prove that if the strategy in 10.11 is (in the obvious sense) employed – and for ever – in \(\R^3\) rather than in \(\R^2\), then \[ \begin{equation} \sum R^{-2}_n < \infty \qquad \text{a.s.} \end{equation} \] where \(R_n\) is the distance from the Enterprise to the Sun at time \(n\).

This solution is by Liang Zhao. Under the strategy, the position satisfies \(X_n = X_{n-1} + R_{n-1} U_n\), where \(U_n\) is the direction of the \(n\)th jump, uniformly distributed on the unit sphere of \(\R^3\) and independent of \(\mathcal F_{n-1} := \sigma(U_1, \dots, U_{n-1})\); that is, each jump has length equal to the current distance from the Sun. Let \(V_n = X_n / R_n\) be the unit direction vector of the position \(X_n\). Since \(V_{n-1}\) and \(U_n\) are unit vectors, we obtain \[ \begin{equation} R_n^2 = | X_{n-1} + R_{n-1} U_n |^2 = R_{n-1}^2 \, | V_{n-1} + U_n |^2 = R_{n-1}^2 \bigl( 2 + 2\, V_{n-1} \cdot U_n \bigr). \end{equation} \] Write \(Y_n = V_{n-1} \cdot U_n\). Let \(f:\mathbb{R}\to\mathbb{R}\) be any bounded measurable function. Since \(V_{n-1}\) is \(\mathcal F_{n-1}\)-measurable and \(U_n\) is independent of \(\mathcal F_{n-1}\), by the freezing lemma we have \[ \begin{equation} \mathbb{E}\!\left[ f\bigl( V_{n-1} \cdot U_n \bigr) \,\middle|\, \mathcal F_{n-1} \right] = g(V_{n-1}),\qquad g(v):=\mathbb E\bigl[ f( v \cdot U_n ) \bigr]. \end{equation} \] Because \(U_n\) is uniformly distributed on the unit sphere and \(v\) is a fixed unit vector, the distribution of \(v \cdot U_n\) is rotation invariant. Taking \(v=(0,0,1)\), \(v \cdot U_n\) is just the \(z\)-coordinate of \(U_n\), which is uniformly distributed on \([-1,1]\). Hence \[ \begin{equation} g(v) = \frac{1}{2} \int_{-1}^{1} f(z)\,dz . \end{equation} \] Therefore, for every bounded measurable \(f\), \[ \begin{equation} \mathbb{E}\!\left[ f(Y_n) \,\middle|\, \mathcal{F}_{n-1} \right] = \mathbb{E}\bigl[ f(Y_n) \bigr], \end{equation} \] so \(Y_n\) is independent of \(\mathcal{F}_{n-1}\). Consequently, \(\{Y_n\}\) is an i.i.d. sequence (each \(Y_n \sim \mathrm{Unif}[-1,1]\)). Taking logarithms and iterating, we get \[ \begin{equation} \log R_n^2 = \log R_0^2 + \sum_{k=1}^n \log\bigl( 2 + 2 Y_k \bigr). \end{equation} \] By the strong law of large numbers, \[ \begin{equation} \frac{1}{n}\sum_{k=1}^n \log\bigl( 2 + 2 Y_k \bigr) \;\xrightarrow{\ \text{a.s.}\ }\; \mathbb{E}\!\left[ \log\bigl( 2 + 2 Y_1 \bigr) \right] = \frac12 \int_{-1}^{1} \log(2+2z)\,dz = \log 4 - 1 \;>\; 0 . \end{equation} \] Fix any \(0<\alpha<\mu:=\log 4 - 1\). Then for almost every \(\omega\) there exists a (random) \(N(\omega)\) such that for all \(n \ge N(\omega)\), \[ \begin{equation} \sum_{k=1}^n \log\bigl( 2 + 2 Y_k \bigr) \ \ge\ \alpha n . \end{equation} \] Hence, for almost every \(\omega\) and all sufficiently large \(n\), \[ \begin{equation} R_n^2 = R_0^2 \exp\!\left( \sum_{k=1}^n \log\bigl( 2 + 2 Y_k \bigr) \right) \ \ge\ R_0^2 e^{\alpha n}. \end{equation} \] It follows that \[ \begin{equation} \frac{1}{R_n^2}\ \le\ \frac{1}{R_0^2}\,e^{-\alpha n} \quad\text{for all sufficiently large } n, \end{equation} \] and therefore \(\sum_{n=1}^\infty \frac{1}{R_n^2} < \infty\) almost surely.
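Here is a short numerical sketch (an illustration under the same setup, taking \(R_0 = 1\)) using the i.i.d. representation \(R_n^2 = R_0^2 \prod_{k\le n}(2 + 2Y_k)\) with \(Y_k \sim \mathrm{Unif}[-1,1]\). Working with \(\log R_n^2\) avoids overflow, and the partial sums of \(R_n^{-2}\) visibly stabilize.

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps = 500

# log R_n^2 = log R_0^2 + sum_{k<=n} log(2 + 2 Y_k), with Y_k ~ Unif[-1, 1]
Y = rng.uniform(-1.0, 1.0, size=n_steps)
log_R2 = np.cumsum(np.log(2.0 + 2.0 * Y))   # taking R_0 = 1

print("drift per step  :", log_R2[-1] / n_steps)   # ~ log(4) - 1 = 0.3863
print("sum_n 1 / R_n^2 :", np.exp(-log_R2).sum())  # partial sums converge
```

Because \(\log R_n^2\) grows linearly at rate \(\log 4 - 1\), the tail of \(\sum_n R_n^{-2}\) is dominated by a convergent geometric series, which is exactly what the proof shows.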