11.5.0 End of Chapter Problems

Problem

The number of orders arriving at a service facility can be modeled by a Poisson process with intensity $\lambda=10$ orders per hour.

  1. Find the probability that there are no orders between 10:30 and 11:00.
  2. Find the probability that there are $3$ orders between 10:30 and 11:00 and $7$ orders between 11:30 and 12:00.
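
A quick numerical check of both parts (a sketch in Python using scipy; the two half-hour intervals in the second part are disjoint, so by independent increments the corresponding counts are independent):

```python
from scipy.stats import poisson

lam = 10          # orders per hour
mu = 0.5 * lam    # mean number of orders in a half-hour interval

# Part 1: no orders between 10:30 and 11:00
p1 = poisson.pmf(0, mu)

# Part 2: disjoint intervals, so the two counts are independent
p2 = poisson.pmf(3, mu) * poisson.pmf(7, mu)

print(p1)   # e^{-5} ≈ 0.0067
print(p2)   # ≈ 0.0147
```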




Problem

Let $\{N(t), t \in [0, \infty) \}$ be a Poisson process with rate $\lambda$. Find the probability that there are two arrivals in $(0,2]$ or three arrivals in $(4,7]$.
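
Since $(0,2]$ and $(4,7]$ are disjoint, the two events are independent, and inclusion-exclusion gives the answer. A numerical sketch (the value $\lambda=1$ is an arbitrary illustrative choice, since the problem leaves $\lambda$ symbolic):

```python
from scipy.stats import poisson

lam = 1.0                      # illustrative rate

pA = poisson.pmf(2, 2 * lam)   # two arrivals in (0,2]
pB = poisson.pmf(3, 3 * lam)   # three arrivals in (4,7]

# disjoint intervals => independent events => inclusion-exclusion
print(pA + pB - pA * pB)
```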




Problem

Let $X \sim Poisson(\mu_1)$ and $Y \sim Poisson(\mu_2)$ be two independent random variables. Define $Z=X+Y$. Show that $$X|Z=n \sim Binomial\left(n,\frac{\mu_1}{\mu_1+\mu_2}\right).$$
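
The identity can be verified numerically by computing the conditional PMF directly from the Poisson PMFs (a sketch; the parameters $\mu_1=2$, $\mu_2=3$, $n=10$ are arbitrary illustrative choices):

```python
import numpy as np
from scipy.stats import poisson, binom

mu1, mu2, n = 2.0, 3.0, 10

k = np.arange(n + 1)
# P(X=k | Z=n) = P(X=k) P(Y=n-k) / P(Z=n), with Z ~ Poisson(mu1 + mu2)
cond = poisson.pmf(k, mu1) * poisson.pmf(n - k, mu2) / poisson.pmf(n, mu1 + mu2)

print(np.allclose(cond, binom.pmf(k, n, mu1 / (mu1 + mu2))))   # True
```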




Problem

Let $N(t)$ be a Poisson process with rate $\lambda$. Let $0 \lt s \lt t$. Show that given $N(t)=n$, $N(s)$ is a binomial random variable with parameters $n$ and $p=\frac{s}{t}$.
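
By independent increments, $N(s)$ and $N(t)-N(s)$ are independent $Poisson(\lambda s)$ and $Poisson(\lambda (t-s))$ random variables, which reduces this to the previous problem. A numerical sketch (illustrative values; note that $\lambda$ cancels, as the answer suggests):

```python
import numpy as np
from scipy.stats import poisson, binom

lam, s, t, n = 1.5, 2.0, 5.0, 8   # illustrative values

k = np.arange(n + 1)
# P(N(s)=k | N(t)=n) via the independent increments N(s) and N(t)-N(s)
cond = (poisson.pmf(k, lam * s) * poisson.pmf(n - k, lam * (t - s))
        / poisson.pmf(n, lam * t))

print(np.allclose(cond, binom.pmf(k, n, s / t)))   # True
```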




Problem

Let $N_1(t)$ and $N_2(t)$ be two independent Poisson processes with rates $\lambda_1$ and $\lambda_2$, respectively. Let $N(t)=N_1(t)+N_2(t)$ be the merged process. Show that given $N(t)=n$, $N_1(t) \sim Binomial\left(n,\frac{\lambda_1}{\lambda_1+\lambda_2}\right)$.

Note: We can interpret this result as follows: Any arrival in the merged process belongs to $N_1(t)$ with probability $\frac{\lambda_1}{\lambda_1+\lambda_2}$ and belongs to $N_2(t)$ with probability $\frac{\lambda_2}{\lambda_1+\lambda_2}$, independently of all other arrivals.


Problem

In this problem, our goal is to complete the proof of the equivalence of the first and the second definitions of the Poisson process. More specifically, suppose that the counting process $\{N(t), t \in [0, \infty)\}$ satisfies all the following conditions:

  1. $N(0)=0$.
  2. $N(t)$ has $\underline{independent}$ and $\underline{stationary}$ increments.
  3. We have \begin{align*} &P(N(\Delta)=0) =1-\lambda \Delta+ o(\Delta),\\ &P(N(\Delta)=1)=\lambda \Delta+o(\Delta),\\ &P(N(\Delta) \geq 2)=o(\Delta). \end{align*}
We would like to show that $N(t) \sim Poisson (\lambda t)$. To this end, for any $k \in \{0,1,2,\cdots\}$, define the function \begin{align*} g_k(t) =P(N(t)=k). \end{align*}
  1. Show that for any $\Delta>0$, we have \begin{align*} g_0(t+\Delta)=g_0(t) [1-\lambda \Delta +o(\Delta)]. \end{align*}
  2. Using the previous part, show that \begin{align*} \frac{g_0'(t)}{g_0(t)}=-\lambda. \end{align*}
  3. By solving the above differential equation and using the fact that $g_0(0)=1$, conclude that \begin{align*} g_0(t)=e^{-\lambda t}. \end{align*}
  4. For $k\geq 1$, show that \begin{align*} g_k(t+\Delta)=g_k(t)(1-\lambda \Delta) +g_{k-1}(t)\lambda \Delta+o(\Delta). \end{align*}
  5. Using the previous part, show that \begin{align*} g_k'(t)=-\lambda g_k(t)+\lambda g_{k-1}(t), \end{align*} which is equivalent to \begin{align*} \frac{d}{dt} \bigg[ e^{\lambda t} g_k(t)\bigg]=\lambda e^{\lambda t}g_{k-1}(t). \end{align*}
  6. Check that the function \begin{align*} g_k(t)=\frac{e^{-\lambda t} (\lambda t)^k}{k!} \end{align*} satisfies the above differential equation for any $k \geq 1$. In fact, this is the only solution that satisfies $g_0(t)=e^{-\lambda t}$, and $g_k(0)=0$ for $k\geq 1$.
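
Part 6 can also be checked symbolically; a minimal sympy sketch that verifies the differential equation of Part 5 for a few concrete values of $k$:

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)

def g(k):
    # the candidate solution g_k(t) = e^{-lam t} (lam t)^k / k!
    return sp.exp(-lam * t) * (lam * t)**k / sp.factorial(k)

# verify g_k'(t) = -lam g_k(t) + lam g_{k-1}(t) for k = 1, ..., 5
for k in range(1, 6):
    residual = sp.simplify(sp.diff(g(k), t) + lam * g(k) - lam * g(k - 1))
    print(k, residual)   # prints 0 for every k
```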



Problem

Let $\{N(t), t \in [0, \infty) \}$ be a Poisson process with rate $\lambda$. Let $T_1$, $T_2$, $\cdots$ be the arrival times for this process. Show that \begin{align*} f_{T_1,T_2,...,T_n}(t_1,t_2,\cdots,t_n)=\lambda^n e^{-\lambda t_n}, \quad \textrm{ for }0 \lt t_1 \lt t_2 \lt \cdots \lt t_n. \end{align*} Hint: One way to show the above result is to show that for sufficiently small $\Delta_i$, we have \begin{align*} &P\bigg(t_1 \leq T_1 \lt t_1+\Delta_1, t_2 \leq T_2 \lt t_2+\Delta_2,..., t_n \leq T_n \lt t_n+\Delta_n\bigg)\approx\\ &\hspace{50pt} \lambda^n e^{-\lambda t_n} \Delta_1 \Delta_2 \cdots \Delta_n , \quad \textrm{ for }0 \lt t_1 \lt t_2 \lt \cdots \lt t_n. \end{align*}




Problem

Let $\{N(t), t \in [0, \infty) \}$ be a Poisson process with rate $\lambda$. Show the following: given that $N(t)=n$, the $n$ arrival times have the same joint CDF as the order statistics of $n$ independent $Uniform(0,t)$ random variables. To show this, you can show that \begin{align*} f_{T_1,T_2,...,T_n|N(t)=n}(t_1,t_2,\cdots,t_n)=\frac{n!}{t^n}, \quad \textrm{ for }0 \lt t_1 \lt t_2 \lt \cdots \lt t_n \lt t. \end{align*}
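
A Monte Carlo sketch of this result: simulate the process through exponential interarrival times, keep only runs with $N(t)=n$, and compare the average arrival times with the means of the uniform order statistics, $E[U_{(k)}]=\frac{kt}{n+1}$ (all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t, n, runs = 1.0, 10.0, 8, 50_000

kept = []
for _ in range(runs):
    # arrival times as cumulative sums of Exponential(lam) interarrivals
    arrivals = np.cumsum(rng.exponential(1 / lam, size=4 * n))
    arrivals = arrivals[arrivals <= t]
    if len(arrivals) == n:               # condition on N(t) = n
        kept.append(arrivals)

kept = np.array(kept)
print(kept.mean(axis=0))                           # empirical E[T_k | N(t)=n]
print([k * t / (n + 1) for k in range(1, n + 1)])  # order-statistic means
```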




Problem

Let $\{N(t), t \in [0, \infty) \}$ be a Poisson process with rate $\lambda$. Let $T_1$, $T_2$, $\cdots$ be the arrival times for this process. Find \begin{align*} E[T_1+T_2+\cdots+T_{10} | N(4)=10]. \end{align*} Hint: Use the result of Problem 8.




Problem

Two teams $A$ and $B$ play a soccer match. The number of goals scored by Team $A$ is modeled by a Poisson process $N_1(t)$ with rate $\lambda_1=0.02$ goals per minute, and the number of goals scored by Team $B$ is modeled by a Poisson process $N_2(t)$ with rate $\lambda_2=0.03$ goals per minute. The two processes are assumed to be independent. Let $N(t)$ be the total number of goals in the game up to and including time $t$. The game lasts for $90$ minutes.

  1. Find the probability that no goals are scored, i.e., the game ends with a $0$-$0$ draw.
  2. Find the probability that at least two goals are scored in the game.
  3. Find the probability of the final score being $$\textrm{Team }A: 1, \quad \textrm{Team }B:2 $$
  4. Find the probability that the game ends in a draw.
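
One way to organize the computation numerically (a sketch using scipy; since $N_1$ and $N_2$ are independent, the probabilities in parts 3 and 4 factor):

```python
from scipy.stats import poisson

t = 90
m1, m2 = 0.02 * t, 0.03 * t     # expected goals: 1.8 for A, 2.7 for B
m = m1 + m2                     # expected total goals: 4.5

p1 = poisson.pmf(0, m)                              # 0-0 draw
p2 = 1 - poisson.pmf(0, m) - poisson.pmf(1, m)      # at least two goals
p3 = poisson.pmf(1, m1) * poisson.pmf(2, m2)        # A:1, B:2
p4 = sum(poisson.pmf(k, m1) * poisson.pmf(k, m2)    # k-k draw for any k
         for k in range(50))
print(p1, p2, p3, p4)
```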




Problem

In Problem 10, find the probability that Team $B$ scores the first goal. That is, find the probability that at least one goal is scored in the game and the first goal is scored by Team $B$.




Problem

Let $\{N(t), t \in [0, \infty) \}$ be a Poisson process with rate $\lambda$. Let $p:[0,\infty) \rightarrow [0,1]$ be a function. Here we divide $N(t)$ into two processes $N_1(t)$ and $N_2(t)$ in the following way. For each arrival occurring at time $t$, a coin with $P(H)=p(t)$ is tossed. If the coin lands heads up, the arrival is sent to the first process ($N_1(t)$); otherwise, it is sent to the second process. The coin tosses are independent of each other and are independent of $N(t)$. Show that $N_1(t)$ is a nonhomogeneous Poisson process with rate $\lambda(t)=\lambda p(t)$.
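
A Monte Carlo sketch of this thinning result, with the illustrative choice $p(s)=e^{-s}$: the thinned count $N_1(t)$ should then be Poisson with mean $\int_0^t \lambda p(s)\,ds=\lambda(1-e^{-t})$, so its empirical mean and variance should agree:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, runs = 2.0, 5.0, 100_000
p = lambda s: np.exp(-s)        # illustrative thinning probability p(s)

counts = np.zeros(runs, dtype=int)
for i in range(runs):
    arrivals = np.cumsum(rng.exponential(1 / lam, size=50))
    arrivals = arrivals[arrivals <= t]
    # keep each arrival at time s independently with probability p(s)
    counts[i] = (rng.random(len(arrivals)) < p(arrivals)).sum()

print(counts.mean(), counts.var())   # both ≈ lam * (1 - e^{-5}) ≈ 1.987
```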




Problem

Consider the Markov chain with state space $S=\{1, 2, 3 \}$ whose state transition diagram is shown in Figure 11.31.

Figure 11.31 - A state transition diagram.
Suppose $P(X_1=1)=\frac{1}{2}$ and $P(X_1=2)=\frac{1}{4}$.
  1. Find the state transition matrix for this chain.
  2. Find $P(X_1=3,X_2=2,X_3=1)$.
  3. Find $P(X_1=3,X_3=1)$.



Problem

Let $\alpha_0$, $\alpha_1$, $\cdots$ be a sequence of nonnegative numbers such that \begin{align*} \sum_{j=0}^{\infty} \alpha_j=1. \end{align*} Consider a Markov chain $X_0$, $X_1$, $X_2$, $\cdots$ with the state space $S=\{0,1,2,\cdots \}$ such that \begin{align*} p_{ij}=\alpha_j, \quad \textrm{ for all }i,j \in S. \end{align*} Show that $X_1$, $X_2$, $\cdots$ is a sequence of i.i.d. random variables.




Problem

Let $X_n$ be a discrete-time Markov chain. Remember that, by definition, $p^{(n)}_{ii}=P(X_n=i | X_0=i)$. Show that state $i$ is recurrent if and only if $$\sum_{n=1}^{\infty}p^{(n)}_{ii}=\infty.$$




Problem

Consider the Markov chain in Figure 11.32. There are two recurrent classes, $R_1=\{1,2\}$ and $R_2=\{5,6,7\}$. Assuming $X_0=4$, find the probability that the chain gets absorbed into $R_1$.

Figure 11.32 - A state transition diagram.



Problem

Consider the Markov chain of Problem 16. Again assume $X_0=4$. We would like to find the expected time (number of steps) until the chain gets absorbed in $R_1$ or $R_2$. More specifically, let $T$ be the absorption time, i.e., the first time the chain visits a state in $R_1$ or $R_2$. We would like to find $E[T|X_0=4]$.




Problem

Consider the Markov chain shown in Figure 11.33. Assume $X_0=2$, and let $N$ be the first time that the chain returns to state $2$, i.e., \begin{align*} N = \min \{n \geq 1: X_n=2 \}. \end{align*} Find $E[N|X_0=2]$.

Figure 11.33 - A state transition diagram.



Problem

Consider the Markov chain shown in Figure 11.34.

Figure 11.34 - A state transition diagram.
  1. Is this chain irreducible?
  2. Is this chain aperiodic?
  3. Find the stationary distribution for this chain.
  4. Is the stationary distribution a limiting distribution for the chain?



Problem

(Random Walk) Consider the Markov chain shown in Figure 11.35.

Figure 11.35 - Simple random walk.
This is known as the simple random walk. Show that \begin{align*} &p_{00}^{(2n)}={2n \choose n}p^n(1-p)^n,\\ &p_{00}^{(2n+1)}=0. \end{align*} Note: Using Stirling's formula, it can be shown that \begin{align*} \sum_{k=1}^{\infty}p^{(k)}_{00}=\sum_{n=1}^{\infty} {2n \choose n}p^n(1-p)^n \end{align*} is finite if and only if $p \neq \frac{1}{2}$. Thus, we conclude that the simple random walk is recurrent if $p=\frac{1}{2}$ and is transient if $p \neq \frac{1}{2}$ (see Problem 15).
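
The dichotomy in the Note is easy to see numerically. A sketch that accumulates the series term by term (using the ratio of consecutive terms to avoid huge binomial coefficients):

```python
def partial_sum(p, terms=2000):
    # p00^{(2n)} = C(2n,n) p^n (1-p)^n; consecutive terms differ by the
    # factor C(2n+2,n+1)/C(2n,n) * p(1-p) = 2(2n+1)/(n+1) * p(1-p)
    x = p * (1 - p)
    term = total = 2 * x            # the n = 1 term: C(2,1) x = 2x
    for n in range(1, terms):
        term *= 2 * (2 * n + 1) / (n + 1) * x
        total += term
    return total

print(partial_sum(0.6))            # ≈ 4.0: the series converges (transient)
print(partial_sum(0.5))            # ≈ 50: still growing ...
print(partial_sum(0.5, 200_000))   # ≈ 504: diverges slowly (recurrent)
```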


Problem

Consider the Markov chain shown in Figure 11.36. Assume that $0 \lt p \lt q$. Does this chain have a limiting distribution? For all $i,j \in \{0,1,2, \cdots \}$, find \begin{align*} \lim_{n \rightarrow \infty} P(X_n=j |X_0=i). \end{align*}

Figure 11.36 - A state transition diagram.



Problem

Consider the Markov chain shown in Figure 11.37. Assume that $p>q>0$. Does this chain have a limiting distribution? For all $i,j \in \{0,1,2, \cdots \}$, find \begin{align*} \lim_{n \rightarrow \infty} P(X_n=j |X_0=i). \end{align*}

Figure 11.37 - A state transition diagram.



Problem

(Gambler's Ruin Problem) Two gamblers, call them Gambler A and Gambler B, play repeatedly. In each round, A wins $1$ dollar with probability $p$ or loses $1$ dollar with probability $q=1-p$ (thus, equivalently, in each round B wins $1$ dollar with probability $q=1-p$ and loses $1$ dollar with probability $p$). We assume different rounds are independent. Suppose that initially A has $i$ dollars and B has $N-i$ dollars. The game ends when one of the gamblers runs out of money (in which case the other gambler will have $N$ dollars). Our goal is to find $a_i$, the probability that A wins the game given that A initially has $i$ dollars.

  1. Define a Markov chain as follows: the chain is in state $i$ if Gambler A has $i$ dollars. Here, the state space is $S=\{0,1,\cdots, N\}$. Draw the state transition diagram of this chain.
  2. Let $a_i$ be the probability of absorption to state $N$ (the probability that A wins) given that $X_0=i$. Show that \begin{align*} a_0&=0, \\ a_N&=1,\\ a_{i+1}-a_{i}&=\frac{q}{p} (a_{i}-a_{i-1}), \quad \textrm{ for }i=1,2,\cdots, N-1. \end{align*}
  3. Show that \begin{align*} a_{i}&=\left[1+\frac{q}{p}+ \left(\frac{q}{p}\right)^2+\cdots+ \left(\frac{q}{p}\right)^{i-1}\right] a_1, \textrm{ for }i=1,2,\cdots, N. \end{align*}
  4. Find $a_i$ for any $i \in \{0,1,2, \cdots, N\}$. Consider two cases: $p=\frac{1}{2}$ and $p\neq \frac{1}{2}$.
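
A simulation sketch for checking the final answer (the values $p=0.45$, $N=10$, $i=4$ are arbitrary illustrative choices; the estimate can be compared against the closed form obtained in Part 4):

```python
import numpy as np

rng = np.random.default_rng(2)
p, N, i, runs = 0.45, 10, 4, 100_000

wins = 0
for _ in range(runs):
    x = i                                    # Gambler A's current fortune
    while 0 < x < N:                         # play until absorption at 0 or N
        x += 1 if rng.random() < p else -1
    wins += (x == N)

print(wins / runs)   # ≈ 0.19 for these parameters; compare with Part 4
```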



Problem

Let $N=4$ and $i=2$ in the gambler's ruin problem (Problem 23). Find the expected number of rounds the gamblers play until one of them wins the game.




Problem

The Poisson process is a continuous-time Markov chain. Specifically, let $N(t)$ be a Poisson process with rate $\lambda$.

  1. Draw the state transition diagram of the corresponding jump chain.
  2. What are the rates $\lambda_i$ for this chain?




Problem

Consider a continuous-time Markov chain $X(t)$ that has the jump chain shown in Figure 11.38. Assume $\lambda_1= \lambda_2=\lambda_3$, and $\lambda_4=2 \lambda_1$.

Figure 11.38 - The jump chain for the Markov chain of Problem 26.
  1. Find the stationary distribution of the jump chain $\tilde{\pi}=\big[ \tilde{\pi}_1, \tilde{\pi}_2, \tilde{\pi}_3, \tilde{\pi}_4 \big]$.
  2. Using $\tilde{\pi}$, find the stationary distribution for $X(t)$.



Problem

Consider a continuous-time Markov chain $X(t)$ that has the jump chain shown in Figure 11.39. Assume $\lambda_1=1$, $\lambda_2=2$, and $\lambda_3=4$.

  1. Find the generator matrix for this chain.
  2. Find the limiting distribution for $X(t)$ by solving $\pi G=0$.

Figure 11.39 - The jump chain for the Markov chain of Problem 27.



Problem

Consider the queueing system of Problem 3 in the Solved Problems Section (Section 3.4). Specifically, in that problem we found the following generator matrix and transition rate diagram:

\begin{equation} \nonumber G = \begin{bmatrix} -\lambda & \lambda & 0 & 0 & \cdots \\[5pt] \mu & -(\mu+\lambda) & \lambda & 0 & \cdots \\[5pt] 0 & \mu & -(\mu+\lambda) & \lambda & \cdots \\[5pt] \vdots & \vdots & \vdots & \vdots & \ddots \end{bmatrix}. \end{equation} The transition rate diagram is shown in Figure 11.40.
Figure 11.40 - The transition rate diagram for the above queuing system.
Assume that $0 \lt \lambda \lt \mu$. Find the stationary distribution for this queueing system.
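
One way to sanity-check a candidate answer is to truncate the chain at a large number of states and solve $\pi G=0$ numerically (a sketch with the illustrative values $\lambda=1$, $\mu=2$):

```python
import numpy as np

lam, mu, K = 1.0, 2.0, 60    # truncate the state space to {0, 1, ..., K-1}

G = np.zeros((K, K))         # build the (truncated) generator matrix
for n in range(K):
    if n + 1 < K:
        G[n, n + 1] = lam    # arrival: n -> n+1 at rate lambda
    if n > 0:
        G[n, n - 1] = mu     # departure: n -> n-1 at rate mu
    G[n, n] = -G[n].sum()    # diagonal makes each row sum to zero

# solve pi G = 0 together with sum(pi) = 1 as a least-squares system
A = np.vstack([G.T, np.ones(K)])
b = np.zeros(K + 1); b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(pi[:5])                # compare with your closed-form stationary pmf
```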


Problem

Let $W(t)$ be a standard Brownian motion.

  1. Find $P(-1 \lt W(1) \lt 1)$.
  2. Find $P(1 \lt W(2)+W(3) \lt 2)$.
  3. Find $P(W(1)>2 | W(2)=1)$.
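
All three parts can be sanity-checked by simulating the increments directly (a Monte Carlo sketch; part 3 conditions crudely on $W(2)$ landing in a small window around $1$):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
w1 = rng.normal(0, 1, n)               # W(1)
w2 = w1 + rng.normal(0, 1, n)          # W(2) = W(1) + independent increment
w3 = w2 + rng.normal(0, 1, n)          # W(3)

print(np.mean((-1 < w1) & (w1 < 1)))              # part 1, ≈ 0.683
print(np.mean((1 < w2 + w3) & (w2 + w3 < 2)))     # part 2
sel = np.abs(w2 - 1) < 0.01                       # crude conditioning W(2) ≈ 1
print(np.mean(w1[sel] > 2))                       # part 3
```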




Problem

Let $W(t)$ be a standard Brownian motion. Find $$P\big(0 \lt W(1)+W(2) \lt 2,3W(1)-2W(2)>0\big).$$




Problem

(Brownian Bridge) Let $W(t)$ be a standard Brownian motion. Define \begin{align*} X(t)=W(t)-tW(1), \quad \textrm{ for all }t \in [0,\infty). \end{align*} Note that $X(0)=X(1)=0$. Find $\textrm{Cov}(X(s),X(t))$, for $0\leq s \leq t \leq 1$.
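
A Monte Carlo sketch for checking the covariance at a particular pair of times (the choices $s=0.3$, $t=0.7$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n, s, t = 1_000_000, 0.3, 0.7

w_s = rng.normal(0, np.sqrt(s), n)             # W(s)
w_t = w_s + rng.normal(0, np.sqrt(t - s), n)   # W(t)
w_1 = w_t + rng.normal(0, np.sqrt(1 - t), n)   # W(1)

x_s, x_t = w_s - s * w_1, w_t - t * w_1        # the Brownian bridge values
print(np.cov(x_s, x_t)[0, 1])   # compare with your formula at s=0.3, t=0.7
```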




Problem

(Correlated Brownian Motions) Let $W(t)$ and $U(t)$ be two independent standard Brownian motions. Let $-1 \leq \rho \leq 1$. Define the random process $X(t)$ as \begin{align*} X(t)=\rho W(t) + \sqrt{1-\rho^2} U(t), \quad \textrm{ for all }t \in [0,\infty). \end{align*}

  1. Show that $X(t)$ is a standard Brownian motion.
  2. Find the covariance and correlation coefficient of $X(t)$ and $W(t)$. That is, find $\textrm{Cov}(X(t),W(t))$ and $\rho(X(t),W(t))$.




Problem

(Hitting Times for Brownian Motion) Let $W(t)$ be a standard Brownian motion. Let $a>0$. Define $T_a$ as the first time that $W(t)=a$. That is, \begin{align*} T_a=\min \{ t: W(t)=a\}. \end{align*}

  1. Show that for any $t \geq 0$, we have $$P(W(t) \geq a )= P(W(t) \geq a | T_a \leq t)P(T_a \leq t).$$
  2. Using the previous part, show that $$P(T_a \leq t)=2\left[1-\Phi\left(\frac{a}{\sqrt{t}}\right)\right].$$
  3. Using the previous part, show that the PDF of $T_a$ is given by $$f_{T_a}(t)=\frac{a}{t\sqrt{2 \pi t}} \exp \left\{-\frac{a^2}{2t}\right\}.$$ Note: By symmetry of Brownian motion, we conclude that for any $a \neq 0$, we have $$f_{T_a}(t)=\frac{|a|}{t\sqrt{2 \pi t}} \exp \left\{-\frac{a^2}{2t}\right\}.$$
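
The formula in Part 2 can be checked against a discretized simulation of the Brownian path (a sketch; the grid slightly undercounts crossings between grid points, so the estimate sits a bit below the exact value):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
a, t, dt, runs = 1.0, 2.0, 1e-3, 10_000

hits = 0
for _ in range(runs):
    # Brownian path on [0, t]: cumulative sum of N(0, dt) increments
    path = np.cumsum(rng.normal(0, np.sqrt(dt), size=int(t / dt)))
    hits += path.max() >= a             # did W reach level a by time t?

print(hits / runs)                          # Monte Carlo estimate
print(2 * (1 - norm.cdf(a / np.sqrt(t))))   # Part 2's formula ≈ 0.4795
```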




