11.1.3 Merging and Splitting Poisson Processes
Merging Independent Poisson Processes:
Let $N_1(t)$ and $N_2(t)$ be two independent Poisson processes with rates $\lambda_1$ and $\lambda_2$ respectively. Let us define $N(t)=N_1(t)+N_2(t)$. That is, the random process $N(t)$ is obtained by combining the arrivals in $N_1(t)$ and $N_2(t)$ (Figure 11.5). We claim that $N(t)$ is a Poisson process with rate $\lambda=\lambda_1+\lambda_2$. To see this, first note that \begin{align*} N(0) &=N_1(0)+N_2(0) \\ &=0+0=0. \end{align*}Next, since $N_1(t)$ and $N_2(t)$ are independent and both have independent increments, we conclude that $N(t)$ also has independent increments. Finally, consider an interval of length $\tau$, i.e., $I=(t,t+\tau]$. Then the numbers of arrivals in $I$ associated with $N_1(t)$ and $N_2(t)$ are $Poisson(\lambda_1 \tau)$ and $Poisson(\lambda_2 \tau)$ respectively, and they are independent. Therefore, the number of arrivals in $I$ associated with $N(t)$ is $Poisson\big((\lambda_1+\lambda_2) \tau\big)$ (sum of two independent Poisson random variables).
Let $N_1(t)$, $N_2(t)$, $\cdots$, $N_m(t)$ be $m$ independent Poisson processes with rates $\lambda_1$, $\lambda_2$, $\cdots$, $\lambda_m$. Let also \begin{align*} N(t)=N_1(t)+N_2(t)+\cdots+N_m(t), \quad \textrm{for all } t \in [0, \infty). \end{align*} Then, $N(t)$ is a Poisson process with rate $\lambda_1+\lambda_2+\cdots+\lambda_m$.
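The merging result is easy to check by simulation. The sketch below (Python, with hypothetical rates $\lambda_1=2$ and $\lambda_2=3$ and a unit-length interval) generates each process from its exponential interarrival times, merges the arrivals, and verifies that the total count per run has mean and variance close to $\lambda_1+\lambda_2$, as a $Poisson(\lambda_1+\lambda_2)$ count should:

```python
import random

def poisson_arrivals(rate, horizon, rng):
    """Arrival times of a Poisson process on (0, horizon],
    built by summing independent Exponential(rate) interarrival times."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(0)
lam1, lam2, horizon, trials = 2.0, 3.0, 1.0, 20000

# Merge the two independent processes and count arrivals per run.
counts = []
for _ in range(trials):
    merged = sorted(poisson_arrivals(lam1, horizon, rng)
                    + poisson_arrivals(lam2, horizon, rng))
    counts.append(len(merged))

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
print(mean, var)  # both should be close to lam1 + lam2 = 5
```

For a Poisson random variable the mean and variance coincide, so seeing both estimates near $5$ supports the claim that the merged count is $Poisson\big((\lambda_1+\lambda_2)\tau\big)$ with $\tau = 1$.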
Splitting (Thinning) of Poisson Processes:
Here, we will talk about splitting a Poisson process into two independent Poisson processes. The idea will be better understood if we look at a concrete example.
Example
Suppose that the number of customers visiting a fast food restaurant in a given time interval $I$ is $N \sim Poisson(\mu)$. Assume that each customer purchases a drink with probability $p$, independently from other customers, and independently from the value of $N$. Let $X$ be the number of customers who purchase drinks in that time interval. Also, let $Y$ be the number of customers that do not purchase drinks; so $X+Y=N$.
 Find the marginal PMFs of $X$ and $Y$.
 Find the joint PMF of $X$ and $Y$.
 Are $X$ and $Y$ independent?
 Solution

 First, note that $R_X=R_Y=\{0,1,2,...\}$. Also, given $N=n$, $X$ is a sum of $n$ independent $Bernoulli(p)$ random variables. Thus, given $N=n$, $X$ has a binomial distribution with parameters $n$ and $p$, so \begin{align}\label{} \nonumber &X | N=n \; \sim \; Binomial(n,p),\\ \nonumber &Y | N=n \; \sim \; Binomial(n,q=1-p). \end{align} We have \begin{align}\label{} \nonumber P_X(k)&=\sum_{n=0}^{\infty} P(X=k | N=n)P_N(n) & (\textrm{law of total probability})\\ \nonumber &=\sum_{n=k}^{\infty} {n \choose k} p^k q^{n-k} e^{-\mu} \frac{\mu^n}{n!}\\ \nonumber &=\sum_{n=k}^{\infty} \frac{p^k q^{n-k} e^{-\mu} \mu^n}{k! (n-k)!} \\ \nonumber &=\frac{e^{-\mu} (\mu p)^k}{k!} \sum_{n=k}^{\infty} \frac{(\mu q)^{n-k}}{(n-k)!} \\ \nonumber &=\frac{e^{-\mu} (\mu p)^k}{k!} e^{\mu q} & (\textrm{Taylor series for } e^x)\\ \nonumber &=\frac{e^{-\mu p} (\mu p)^k}{k!}, \hspace{40pt} \textrm{ for }k=0,1,2,... \end{align} Thus, we conclude that \begin{align}\label{} \nonumber &X \hspace{10pt} \sim \hspace{10pt} Poisson(\mu p). \end{align} Similarly, we obtain \begin{align}\label{} \nonumber &Y \hspace{10pt} \sim \hspace{10pt} Poisson(\mu q). \end{align}
 To find the joint PMF of $X$ and $Y$, we can also use the law of total probability: \begin{align}\label{} \nonumber P_{XY}(i,j)&=\sum_{n=0}^{\infty} P(X=i, Y=j | N=n)P_N(n) & (\textrm{law of total probability}). \end{align} However, note that $P(X=i, Y=j | N=n)=0$ if $n \neq i+j$, thus \begin{align}\label{} \nonumber P_{XY}(i,j)&=P(X=i, Y=j | N=i+j)P_N(i+j)\\ \nonumber &=P(X=i | N=i+j)P_N(i+j)\\ \nonumber &={i+j \choose i} p^i q^j e^{-\mu} \frac{\mu^{i+j}}{(i+j)!}\\ \nonumber &=\frac{e^{-\mu} (\mu p)^i (\mu q)^j}{i! j!}\\ \nonumber &=\frac{e^{-\mu p} (\mu p)^i}{i!} \cdot \frac{e^{-\mu q} (\mu q)^j}{j!}\\ \nonumber &=P_X(i)P_Y(j). \end{align}
 $X$ and $Y$ are independent since, as we saw above, \begin{align}\label{} \nonumber P_{XY}(i,j)=P_X(i)P_Y(j). \end{align}
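A simulation supports these conclusions. The sketch below (Python, with hypothetical values $\mu=4$ and $p=0.3$) draws $N \sim Poisson(\mu)$ using Knuth's uniform-product sampler, thins it with independent coin tosses, and checks that $X$ and $Y$ have means near $\mu p$ and $\mu q$ and near-zero empirical covariance, consistent with independence:

```python
import math
import random

def poisson_sample(mu, rng):
    """Knuth's algorithm: count uniform draws until their product
    falls below e^{-mu}; the count is Poisson(mu)-distributed."""
    limit, k, prod = math.exp(-mu), 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(1)
mu, p, trials = 4.0, 0.3, 50000

xs, ys = [], []
for _ in range(trials):
    n = poisson_sample(mu, rng)
    # Each of the n customers buys a drink independently with probability p.
    x = sum(1 for _ in range(n) if rng.random() < p)
    xs.append(x)
    ys.append(n - x)

mx = sum(xs) / trials
my = sum(ys) / trials
cov = sum(x * y for x, y in zip(xs, ys)) / trials - mx * my
print(mx, my, cov)  # mx ~ mu*p = 1.2, my ~ mu*(1-p) = 2.8, cov ~ 0
```

Zero covariance alone does not prove independence, of course; here it simply serves as a quick numerical sanity check of the factorization $P_{XY}(i,j)=P_X(i)P_Y(j)$ derived above.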

The above example was given for a specific interval $I$, in which a Poisson random variable $N$ was split into two independent Poisson random variables $X$ and $Y$. However, the same argument can be used to show the analogous result for splitting a Poisson process into two independent Poisson processes. More specifically, we have the following result.
Let $N(t)$ be a Poisson process with rate $\lambda$. Here, we split $N(t)$ into two processes $N_1(t)$ and $N_2(t)$ in the following way (Figure 11.6). For each arrival, a coin with $P(H)=p$ is tossed. If the coin lands heads up, the arrival is sent to the first process ($N_1(t)$); otherwise, it is sent to the second process. The coin tosses are independent of each other and are independent of $N(t)$. Then,
 $N_1(t)$ is a Poisson process with rate $\lambda p$;
 $N_2(t)$ is a Poisson process with rate $\lambda (1-p)$;
 $N_1(t)$ and $N_2(t)$ are independent.
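The splitting result above can also be checked numerically. The sketch below (Python, with hypothetical values $\lambda=5$, $p=0.4$, and a unit-length interval) simulates the arrivals of $N(t)$, routes each one by an independent coin toss, and verifies that the per-run counts of $N_1(t)$ and $N_2(t)$ have means near $\lambda p$ and $\lambda(1-p)$ with near-zero empirical covariance:

```python
import random

def split_counts(lam, p, horizon, rng):
    """One run: generate Poisson(lam) arrivals on (0, horizon] and send
    each to process 1 with probability p (coin toss), else to process 2."""
    t, n1, n2 = 0.0, 0, 0
    while True:
        t += rng.expovariate(lam)   # Exponential(lam) interarrival times
        if t > horizon:
            return n1, n2
        if rng.random() < p:
            n1 += 1
        else:
            n2 += 1

rng = random.Random(2)
lam, p, horizon, trials = 5.0, 0.4, 1.0, 20000

runs = [split_counts(lam, p, horizon, rng) for _ in range(trials)]
m1 = sum(n1 for n1, _ in runs) / trials                   # ~ lam*p = 2.0
m2 = sum(n2 for _, n2 in runs) / trials                   # ~ lam*(1-p) = 3.0
cov = sum(n1 * n2 for n1, n2 in runs) / trials - m1 * m2  # ~ 0
print(m1, m2, cov)
```

The near-zero covariance between the two counts is consistent with the third claim, that $N_1(t)$ and $N_2(t)$ are independent processes.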