11.1.3 Merging and Splitting Poisson Processes

Merging Independent Poisson Processes:

Let $N_1(t)$ and $N_2(t)$ be two independent Poisson processes with rates $\lambda_1$ and $\lambda_2$ respectively. Let us define $N(t)=N_1(t)+N_2(t)$. That is, the random process $N(t)$ is obtained by combining the arrivals in $N_1(t)$ and $N_2(t)$ (Figure 11.5). We claim that $N(t)$ is a Poisson process with rate $\lambda=\lambda_1+\lambda_2$. To see this, first note that \begin{align*} N(0) &=N_1(0)+N_2(0) \\ &=0+0=0. \end{align*}
Figure 11.5 - Merging two Poisson processes $N_1(t)$ and $N_2(t)$.

Next, since $N_1(t)$ and $N_2(t)$ are independent and both have independent increments, we conclude that $N(t)$ also has independent increments. Finally, consider an interval of length $\tau$, i.e., $I=(t,t+\tau]$. Then the numbers of arrivals in $I$ associated with $N_1(t)$ and $N_2(t)$ are $Poisson(\lambda_1 \tau)$ and $Poisson(\lambda_2 \tau)$ random variables, and they are independent. Therefore, the number of arrivals in $I$ associated with $N(t)$ is $Poisson\big((\lambda_1+\lambda_2) \tau\big)$ (the sum of two independent Poisson random variables is Poisson).
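The parenthetical claim can be verified directly by a convolution together with the binomial theorem: \begin{align*} P\big(N(t+\tau)-N(t)=k\big) &= \sum_{j=0}^{k} e^{-\lambda_1 \tau}\frac{(\lambda_1 \tau)^j}{j!} \cdot e^{-\lambda_2 \tau}\frac{(\lambda_2 \tau)^{k-j}}{(k-j)!}\\ &=\frac{e^{-(\lambda_1+\lambda_2)\tau}}{k!} \sum_{j=0}^{k} {k \choose j} (\lambda_1 \tau)^j (\lambda_2 \tau)^{k-j}\\ &=\frac{e^{-(\lambda_1+\lambda_2)\tau}\big((\lambda_1+\lambda_2)\tau\big)^k}{k!}, \hspace{40pt} \textrm{ for }k=0,1,2,... \end{align*}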
Merging Independent Poisson Processes

Let $N_1(t)$, $N_2(t)$, $\cdots$, $N_m(t)$ be $m$ independent Poisson processes with rates $\lambda_1$, $\lambda_2$, $\cdots$, $\lambda_m$. Let also \begin{align*} N(t)=N_1(t)+N_2(t)+\cdots+N_m(t), \quad \textrm{for all } t \in [0, \infty). \end{align*} Then, $N(t)$ is a Poisson process with rate $\lambda_1+\lambda_2+\cdots+\lambda_m$.
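As a quick numerical illustration of this result, the short simulation below is a minimal sketch: the rates $\lambda_1=2$, $\lambda_2=3$, the time horizon, and the number of runs are arbitrary choices, not values from the text. It builds each process from exponential interarrival times, merges the arrivals, and checks that the merged count over $[0, t]$ has sample mean and variance close to $(\lambda_1+\lambda_2)t$, as a Poisson count should.

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2, horizon = 2.0, 3.0, 10.0   # hypothetical rates and time horizon
n_runs = 10_000

def poisson_arrivals(rate, horizon, rng):
    """Arrival times on [0, horizon], built from exponential interarrival times."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate)
        if t > horizon:
            return np.array(times)
        times.append(t)

merged_counts = np.empty(n_runs)
for i in range(n_runs):
    a1 = poisson_arrivals(lam1, horizon, rng)   # arrivals of N1(t)
    a2 = poisson_arrivals(lam2, horizon, rng)   # arrivals of N2(t)
    merged_counts[i] = len(a1) + len(a2)        # N(horizon) = N1(horizon) + N2(horizon)

# For a Poisson((lam1 + lam2) * horizon) count, the mean and variance are both 50.
print("sample mean:", merged_counts.mean())
print("sample variance:", merged_counts.var())
```

With the values above, both printed numbers should be close to $(\lambda_1+\lambda_2)t = 50$.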

Splitting (Thinning) of Poisson Processes:

Here, we consider splitting a Poisson process into two independent Poisson processes. The idea is best understood through a concrete example.

Example
Suppose that the number of customers visiting a fast food restaurant in a given time interval $I$ is $N \sim Poisson(\mu)$. Assume that each customer purchases a drink with probability $p$, independently of the other customers and independently of the value of $N$. Let $X$ be the number of customers who purchase drinks in that time interval, and let $Y$ be the number of customers who do not purchase drinks, so that $X+Y=N$.
  1. Find the marginal PMFs of $X$ and $Y$.
  2. Find the joint PMF of $X$ and $Y$.
  3. Are $X$ and $Y$ independent?
  • Solution
      1. First, note that $R_X=R_Y=\{0,1,2,...\}$. Also, given $N=n$, $X$ is a sum of $n$ independent $Bernoulli(p)$ random variables. Thus, given $N=n$, $X$ has a binomial distribution with parameters $n$ and $p$, so \begin{align}\label{} \nonumber &X|N=n \; \sim \; Binomial(n,p),\\ \nonumber &Y|N=n \; \sim \; Binomial(n,q=1-p). \end{align} We have \begin{align}\label{} \nonumber P_X(k)&=\sum_{n=0}^{\infty} P(X=k|N=n)P_N(n) & (\textrm{law of total probability})\\ \nonumber &=\sum_{n=k}^{\infty} {n \choose k} p^k q^{n-k} e^{-\mu} \frac{\mu^n}{n!}\\ \nonumber &=\sum_{n=k}^{\infty} \frac{p^k q^{n-k} e^{-\mu} \mu^n}{k! (n-k)!} \\ \nonumber &=\frac{e^{-\mu} (\mu p)^k}{k!} \sum_{n=k}^{\infty} \frac{(\mu q)^{n-k}}{(n-k)!} \\ \nonumber &=\frac{e^{-\mu} (\mu p)^k}{k!} e^{\mu q} & (\textrm{Taylor series for } e^x)\\ \nonumber &=\frac{e^{-\mu p} (\mu p)^k}{k!}, \hspace{40pt} \textrm{ for }k=0,1,2,... \end{align} Thus, we conclude that \begin{align}\label{} \nonumber &X \hspace{10pt} \sim \hspace{10pt} Poisson(\mu p). \end{align} Similarly, we obtain \begin{align}\label{} \nonumber &Y \hspace{10pt} \sim \hspace{10pt} Poisson(\mu q). \end{align}

      2. To find the joint PMF of $X$ and $Y$, we can again use the law of total probability: \begin{align}\label{} \nonumber P_{XY}(i,j)&=\sum_{n=0}^{\infty} P(X=i, Y=j|N=n)P_N(n) & (\textrm{law of total probability}). \end{align} However, note that $P(X=i, Y=j|N=n)=0$ if $n \neq i+j$, thus \begin{align}\label{} \nonumber P_{XY}(i,j)&=P(X=i, Y=j|N=i+j)P_N(i+j)\\ \nonumber &=P(X=i|N=i+j)P_N(i+j)\\ \nonumber &={i+j \choose i} p^i q^j e^{-\mu} \frac{\mu^{i+j}}{(i+j)!}\\ \nonumber &=\frac{e^{-\mu} (\mu p)^i (\mu q)^j}{i! j!}\\ \nonumber &=\frac{e^{-\mu p} (\mu p)^i}{i!} \cdot \frac{e^{-\mu q} (\mu q)^j}{j!}\\ \nonumber &=P_X(i)P_Y(j). \end{align}

      3. $X$ and $Y$ are independent since, as we saw above, \begin{align}\label{} \nonumber P_{XY}(i,j)=P_X(i)P_Y(j). \end{align}
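A short simulation makes these conclusions concrete. This is a minimal sketch, assuming arbitrary illustrative values $\mu=6$ and $p=0.3$ (not values from the example): it draws $N$, thins it with independent coin flips, and compares the sample means and the sample correlation of $X$ and $Y$ with the formulas above.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, p = 6.0, 0.3                  # hypothetical values for E[N] and the drink probability
n_intervals = 200_000

N = rng.poisson(mu, size=n_intervals)   # customers arriving in each interval
X = rng.binomial(N, p)                  # drink buyers: X | N = n  ~  Binomial(n, p)
Y = N - X                               # non-buyers

print("mean of X:", X.mean(), "  (mu * p =", mu * p, ")")
print("mean of Y:", Y.mean(), "  (mu * (1 - p) =", mu * (1 - p), ")")
print("correlation of X and Y:", np.corrcoef(X, Y)[0, 1])   # should be close to 0
```

The sample means should be close to $\mu p = 1.8$ and $\mu q = 4.2$, and the sample correlation close to $0$, consistent with the independence of $X$ and $Y$.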


The above example was given for a specific interval $I$, in which a Poisson random variable $N$ was split into two independent Poisson random variables $X$ and $Y$. However, the same argument can be used to show the analogous result for splitting a Poisson process into two independent Poisson processes. More specifically, we have the following result.
Splitting a Poisson Process

Let $N(t)$ be a Poisson process with rate $\lambda$. Here, we divide $N(t)$ into two processes $N_1(t)$ and $N_2(t)$ in the following way (Figure 11.6). For each arrival, a coin with $P(H)=p$ is tossed. If the coin lands heads up, the arrival is sent to the first process ($N_1(t)$); otherwise, it is sent to the second process ($N_2(t)$). The coin tosses are independent of each other and independent of $N(t)$. Then,
  1. $N_1(t)$ is a Poisson process with rate $\lambda p$;
  2. $N_2(t)$ is a Poisson process with rate $\lambda (1-p)$;
  3. $N_1(t)$ and $N_2(t)$ are independent.
Figure 11.6 - Splitting a Poisson process into two independent Poisson processes.
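The splitting mechanism is also easy to mirror in simulation. The sketch below assumes arbitrary illustrative values $\lambda=5$, $P(H)=p=0.4$, and a fixed horizon: each arrival of $N(t)$ is routed by an independent coin toss, and the resulting counts behave like Poisson counts with rates $\lambda p$ and $\lambda(1-p)$.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, p, horizon = 5.0, 0.4, 20.0   # hypothetical rate, P(H), and time horizon

# One sample path of N(t): generate more exponential gaps than needed, then truncate.
gaps = rng.exponential(1.0 / lam, size=int(3 * lam * horizon))
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals <= horizon]

# Independent coin toss per arrival: heads -> N1(t), tails -> N2(t).
heads = rng.random(arrivals.size) < p
n1_arrivals = arrivals[heads]
n2_arrivals = arrivals[~heads]

print("N1(horizon) =", n1_arrivals.size, "  (expected about", lam * p * horizon, ")")
print("N2(horizon) =", n2_arrivals.size, "  (expected about", lam * (1 - p) * horizon, ")")
```

Repeating this over many sample paths, the counts $N_1(t)$ and $N_2(t)$ have means close to $\lambda p t = 40$ and $\lambda(1-p)t = 60$, in line with the result above.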

