11.4.3 Solved Problems

Problem

Let $W(t)$ be a standard Brownian motion. Find $P(W(1)+W(2)>2)$.

  • Solution
    • Let $X=W(1)+W(2)$. Since $W(t)$ is a Gaussian process, $X$ is a normal random variable. \begin{align*} EX=E[W(1)]+E[W(2)]=0, \end{align*} \begin{align*} \textrm{Var}(X)&=\textrm{Var}\big(W(1)\big)+\textrm{Var}\big(W(2)\big)+2 \textrm{Cov} \big(W(1),W(2)\big)\\ &=1+2+2 \cdot 1, &\big(\textrm{since } \textrm{Cov}\big(W(1),W(2)\big)=\min(1,2)=1\big)\\ &=5. \end{align*} We conclude $$X \sim N(0,5).$$ Thus, \begin{align*} P(X>2)&=1-\Phi\left(\frac{2-0}{\sqrt{5}}\right)\\ &\approx 0.186. \end{align*}
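This value can be sanity-checked with a short Monte Carlo simulation (a sketch, assuming NumPy is available; the sample size and seed are arbitrary choices). It uses the fact that $W(1)$ and the increment $W(2)-W(1)$ are independent $N(0,1)$ random variables.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# W(1) ~ N(0, 1); W(2) = W(1) + an independent N(0, 1) increment
w1 = rng.standard_normal(n)
w2 = w1 + rng.standard_normal(n)

# Estimate P(W(1) + W(2) > 2); theory gives 1 - Phi(2/sqrt(5)) ~ 0.186
p_hat = np.mean(w1 + w2 > 2)
print(p_hat)
```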


Problem

Let $W(t)$ be a standard Brownian motion, and $0 \leq s \lt t$. Find the conditional PDF of $W(s)$ given $W(t)=a$.

  • Solution
    • It is useful to remember the following result from the previous chapters: Suppose $X$ and $Y$ are jointly normal random variables with parameters $\mu_X$, $\sigma^2_X$, $\mu_Y$, $\sigma^2_Y$, and $\rho$. Then, given $X=x$, $Y$ is normally distributed with \begin{align}%\label{} \nonumber &E[Y|X=x]=\mu_Y+ \rho \sigma_Y \frac{x-\mu_X}{\sigma_X},\\ \nonumber &\textrm{Var}(Y|X=x)=(1-\rho^2)\sigma^2_Y. \end{align} Now, if we let $X=W(t)$ and $Y=W(s)$, we have $X \sim N(0,t)$ and $Y \sim N(0,s)$ and \begin{align*} \rho &=\frac{\textrm{Cov}(X,Y)}{\sigma_X \sigma_Y}\\ &=\frac{\min(s,t)}{\sqrt{t} \sqrt{s}} \\ &=\frac{s}{\sqrt{t} \sqrt{s}}\\ &=\sqrt{\frac{s}{t}}. \end{align*} We conclude that \begin{align}%\label{} \nonumber &E[Y|X=a]= \frac{s}{t} a,\\ \nonumber &\textrm{Var}(Y|X=a)=s\left(1-\frac{s}{t}\right). \end{align} Therefore, \begin{align*} W(s) | W(t)=a \; \sim \; N\left(\frac{s}{t} a, s\left(1-\frac{s}{t}\right) \right). \end{align*}
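The conditional distribution can be checked numerically by rejection sampling: simulate many paths from independent increments and keep only those whose value at time $t$ lands in a small window around $a$ (a sketch, assuming NumPy; the choices of $s$, $t$, $a$, the window width, and the seed are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(1)
s, t, a = 1.0, 4.0, 2.0
n = 2_000_000

# Build W(s) and W(t) from independent Gaussian increments
ws = rng.normal(0.0, np.sqrt(s), n)
wt = ws + rng.normal(0.0, np.sqrt(t - s), n)

# Keep paths with W(t) near a; the retained W(s) values approximate
# the conditional law of W(s) given W(t) = a
keep = np.abs(wt - a) < 0.05
cond = ws[keep]

print(cond.mean())  # theory: s*a/t       = 0.5
print(cond.var())   # theory: s*(1 - s/t) = 0.75
```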


Problem

(Geometric Brownian Motion) Let $W(t)$ be a standard Brownian motion. Define \begin{align*} X(t)=\exp \{W(t)\}, \quad \textrm{for all } t \in [0,\infty). \end{align*}

  1. Find $E[X(t)]$, for all $t \in [0,\infty)$.
  2. Find $\textrm{Var}(X(t))$, for all $t \in [0,\infty)$.
  3. Let $0 \leq s \leq t$. Find $\textrm{Cov}(X(s),X(t))$.
  • Solution
    • It is useful to remember the MGF of the normal distribution. In particular, if $X \sim N(\mu, \sigma^2)$, then \begin{align} M_X(r)=E[e^{rX}]=\exp\left\{r \mu + \frac{\sigma^2 r^2}{2}\right\}, \quad \quad \textrm{for all} \quad r\in \mathbb{R}. \end{align}
      1. We have \begin{align*} E[X(t)]&=E[e^{W(t)}], &(\textrm{where }W(t) \sim N(0,t))\\ &=\exp \left\{\frac{t}{2}\right\}. \end{align*}
      2. We have \begin{align*} E[X^2(t)]&=E[e^{2W(t)}], &(\textrm{where }W(t) \sim N(0,t))\\ &=\exp \{2t\}. \end{align*} Thus, \begin{align*} \textrm{Var}(X(t))&=E[X^2(t)]-E[X(t)]^2\\ &=\exp \{2t\}-\exp \{ t\}. \end{align*}
      3. Let $0 \leq s \leq t$. Then, we have \begin{align*} \textrm{Cov}(X(s),X(t))&=E[X(s)X(t)]-E[X(s)]E[X(t)]\\ &=E[X(s)X(t)]-\exp \left\{\frac{s+t}{2}\right\}. \end{align*} To find $E[X(s)X(t)]$, we can write \begin{align*} E[X(s)X(t)]&=E\bigg[\exp \left\{W(s)\right\} \exp \left\{W(t)\right\} \bigg]\\ &=E\bigg[\exp \left\{W(s) \right\} \exp \left\{W(s)+W(t)-W(s)\right\} \bigg]\\ &=E\bigg[\exp \left\{2W(s) \right\} \exp \left\{W(t)-W(s)\right\} \bigg]\\ &=E\bigg[\exp \left\{2W(s) \right\} \bigg] E\bigg[\exp \left\{W(t)-W(s)\right\} \bigg]\\ &=\exp \left\{2s\right\} \exp \left\{\frac{t-s}{2}\right\}\\ &=\exp \left\{\frac{3s+t}{2}\right\}. \end{align*} We conclude, for $0 \leq s \leq t$, \begin{align*} \textrm{Cov}(X(s),X(t))&=\exp \left\{\frac{3s+t}{2}\right\}-\exp \left\{\frac{s+t}{2}\right\}. \end{align*}
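All three formulas can be verified with a short simulation of $X(t)=e^{W(t)}$ (a sketch, assuming NumPy; the values of $s$, $t$, the sample size, and the seed are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(2)
s, t = 0.5, 1.0
n = 1_000_000

# Build W(s) and W(t) from independent Gaussian increments
ws = rng.normal(0.0, np.sqrt(s), n)
wt = ws + rng.normal(0.0, np.sqrt(t - s), n)
xs, xt = np.exp(ws), np.exp(wt)

print(xt.mean())             # theory: exp(t/2) = exp(0.5)        ~ 1.649
print(xt.var())              # theory: exp(2t) - exp(t)           ~ 4.671
print(np.cov(xs, xt)[0, 1])  # theory: exp((3s+t)/2)-exp((s+t)/2) ~ 1.373
```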



