4.2.6 Solved Problems: Special Continuous Distributions

Problem

Suppose the number of customers arriving at a store obeys a Poisson distribution with an average of $\lambda$ customers per unit time. That is, if $Y$ is the number of customers arriving in an interval of length $t$, then $Y \sim Poisson(\lambda t)$. Suppose that the store opens at time $t=0$. Let $X$ be the arrival time of the first customer. Show that $X \sim Exponential(\lambda)$.

  • Solution
    • We first find $P(X > t)$:

      $P(X > t)$ $=P(\textrm{No arrival in }[0,t])$
      $=e^{-\lambda t}\frac{(\lambda t)^0}{0!}$
      $=e^{-\lambda t}$.

      Thus, the CDF of $X$ for $x>0$ is given by $$F_X(x)=1-P(X>x)=1-e^{-\lambda x},$$ which is the CDF of $Exponential(\lambda)$. Note that by the same argument, the time between the first and second customers also has an $Exponential(\lambda)$ distribution. In general, the time between the $k$th and $(k+1)$th customers is $Exponential(\lambda)$.
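The identity $P(X>t)=e^{-\lambda t}$ can be checked by simulation. The sketch below (with arbitrary illustrative values $\lambda = 2$ and $t = 1.5$, not taken from the text) draws Poisson counts for the interval $[0,t]$ and compares the fraction of runs with no arrivals against $e^{-\lambda t}$:

```python
import math
import random

random.seed(0)

# Illustrative values (not from the text): lambda = 2 arrivals per unit time,
# window length t = 1.5.
lam, t, n_trials = 2.0, 1.5, 200_000

def poisson_sample(mu):
    # Knuth's method: count how many uniforms it takes for their product
    # to drop below e^{-mu}.
    limit = math.exp(-mu)
    k, prod = 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

# P(X > t) = P(no arrival in [0, t]), where the count is Poisson(lam * t).
no_arrival = sum(poisson_sample(lam * t) == 0 for _ in range(n_trials)) / n_trials
print(no_arrival, math.exp(-lam * t))  # both close to e^{-3} ≈ 0.0498
```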



Problem (Exponential as the limit of Geometric)

Let $Y \sim Geometric(p)$, where $p=\lambda \Delta$. Define $X=Y \Delta$, where $\lambda, \Delta>0$. Prove that for any $x \in (0,\infty)$, we have $$\lim_{\Delta \rightarrow 0} F_X(x)=1-e^{-\lambda x}.$$

  • Solution
    • If $Y \sim Geometric(p)$ and $q=1-p$, then

      $P(Y \leq n)$ $=\sum_{k=1}^{n} pq^{k-1}$
      $=p \cdot \frac{1-q^{n}}{1-q}=1-(1-p)^n$.

      Then for any $y \in (0,\infty)$, we can write $$P(Y \leq y) =1-(1-p)^{\lfloor y\rfloor},$$ where $\lfloor y\rfloor$ is the largest integer less than or equal to $y$. Now, since $X=Y \Delta$, we have
      $F_X(x)$ $=P(X \leq x)$
      $=P\left(Y \leq \frac{x}{\Delta}\right)$
      $=1-(1-p)^{\lfloor \frac{x}{\Delta} \rfloor}=1-(1-\lambda \Delta)^{\lfloor \frac{x}{\Delta} \rfloor}$.

      Now, we have
      $\lim_{\Delta \rightarrow 0} F_X(x)$ $=\lim_{\Delta \rightarrow 0} 1-(1-\lambda \Delta)^{\lfloor \frac{x}{\Delta} \rfloor}$
      $=1-\lim_{\Delta \rightarrow 0} (1-\lambda \Delta)^{\lfloor \frac{x}{\Delta} \rfloor}$
      $=1-e^{-\lambda x}$.

      The last equality holds because $\frac{x}{\Delta}-1 \leq \lfloor \frac{x}{\Delta} \rfloor \leq \frac{x}{\Delta}$, and we know $$\lim_{\Delta \rightarrow 0^{+}} (1-\lambda \Delta)^{\frac{1}{\Delta}}=e^{-\lambda}.$$
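The limit can also be observed numerically: for fixed $\lambda$ and $x$, the CDF $1-(1-\lambda \Delta)^{\lfloor x/\Delta \rfloor}$ approaches $1-e^{-\lambda x}$ as $\Delta$ shrinks. A short sketch (the values $\lambda = 1.3$ and $x = 2$ are arbitrary choices for illustration):

```python
import math

# Illustrative values (not from the text).
lam, x = 1.3, 2.0

# F_X(x) = 1 - (1 - lam*Delta)^floor(x/Delta) should approach 1 - e^{-lam*x}.
for delta in [0.1, 0.01, 0.001, 0.0001]:
    f = 1 - (1 - lam * delta) ** math.floor(x / delta)
    print(delta, f)

print(1 - math.exp(-lam * x))  # the limiting exponential CDF value
```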



Problem

Let $U \sim Uniform(0,1)$ and $X=-\ln (1-U)$. Show that $X \sim Exponential(1)$.

  • Solution
    • First note that since $R_U=(0,1)$, $R_X=(0,\infty)$. We will find the CDF of $X$. For $x \in(0,\infty)$, we have

      $F_X(x)$ $=P(X \leq x)$
      $=P(-\ln (1-U) \leq x)$
      $=P\left(\frac{1}{1-U} \leq e^x\right)$
      $=P(U \leq 1-e^{-x})=1-e^{-x}$,

      which is the CDF of an $Exponential(1)$ random variable.
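This result is the basis of inverse-transform sampling: uniform draws pushed through $-\ln(1-u)$ yield $Exponential(1)$ samples. A minimal sketch checking the sample mean and the CDF at $x=1$:

```python
import math
import random

random.seed(1)

# Draw U ~ Uniform(0,1) and set X = -ln(1 - U); X should be Exponential(1).
n = 200_000
samples = [-math.log(1 - random.random()) for _ in range(n)]

mean = sum(samples) / n                      # Exponential(1) has mean 1
below = sum(s <= 1.0 for s in samples) / n   # CDF at 1 is 1 - e^{-1} ≈ 0.632
print(mean, below)
```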



Problem

Let $X \sim N(2,4)$ and $Y=3-2X$.

  1. Find $P(X > 1)$.
  2. Find $P(-2 < Y < 1)$.
  3. Find $P(X > 2 |Y < 1)$.

  • Solution
      1. Find $P(X > 1)$: We have $\mu_X=2$ and $\sigma_X=2$. Thus,
        $P(X > 1)$ $=1-\Phi\left(\frac{1-2}{2}\right)$
        $=1-\Phi(-0.5)=\Phi(0.5) \approx 0.6915$.

      2. Find $P(-2 < Y < 1)$: Since $Y=3-2X$, using Theorem 4.3, we have $Y \sim N(-1,16)$. Therefore,
        $P(-2 < Y < 1)$ $=\Phi\left(\frac{1-(-1)}{4}\right)-\Phi\left(\frac{(-2)-(-1)}{4}\right)$
        $=\Phi(0.5)-\Phi(-0.25) \approx 0.29$.

      3. Find $P(X > 2 |Y < 1)$:
        $P(X > 2 |Y < 1)$ $=P(X > 2 |3-2X < 1)$
        $=P(X > 2 |X > 1)$
        $=\frac{P(X > 2,X > 1)}{P(X > 1)}$
        $=\frac{P(X > 2)}{P(X > 1)}$
        $=\frac{1-\Phi(\frac{2-2}{2})}{1-\Phi(\frac{1-2}{2})}$
        $=\frac{1-\Phi(0)}{1-\Phi(-0.5)}$
        $\approx 0.72$
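The three numerical answers can be reproduced using $\Phi(x)=\frac{1}{2}\left(1+\operatorname{erf}(x/\sqrt{2})\right)$; a short sketch with only the standard library:

```python
import math

def Phi(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Part 1: P(X > 1), X ~ N(2, 4), so sigma = 2.
p1 = 1 - Phi((1 - 2) / 2)
# Part 2: P(-2 < Y < 1), Y ~ N(-1, 16), so sigma = 4.
p2 = Phi((1 - (-1)) / 4) - Phi((-2 - (-1)) / 4)
# Part 3: P(X > 2 | Y < 1) = P(X > 2) / P(X > 1).
p3 = (1 - Phi(0)) / (1 - Phi(-0.5))
print(round(p1, 4), round(p2, 4), round(p3, 4))  # → 0.6915 0.2902 0.7231
```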



Problem

Let $X \sim N(0,\sigma^2)$. Find $E|X|$.

  • Solution
    • We can write $X=\sigma Z$, where $Z \sim N(0,1)$. Thus, $E|X|=\sigma E|Z|$. We have

      $E|Z|$ $=\frac{1}{\sqrt{2\pi}}\int_{-\infty}^{\infty} |t| e^{-\frac{t^2}{2}}dt$
      $=\frac{2}{\sqrt{2\pi}}\int_{0}^{\infty} |t| e^{-\frac{t^2}{2}}dt \hspace{20pt}(\textrm{integral of an even function})$
      $=\sqrt{\frac{2}{\pi}}\int_{0}^{\infty} t e^{-\frac{t^2}{2}}dt$
      $=\sqrt{\frac{2}{\pi}}\bigg[-e^{-\frac{t^2}{2}} \bigg]_{0}^{\infty}=\sqrt{\frac{2}{\pi}}$.

      Thus, we conclude $E|X|=\sigma E|Z|=\sigma\sqrt{\frac{2}{\pi}}$.
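A quick Monte Carlo check of $E|X|=\sigma\sqrt{2/\pi}$ (the value $\sigma=3$ is an arbitrary choice for illustration):

```python
import math
import random

random.seed(2)

sigma = 3.0  # arbitrary illustrative value
n = 200_000
est = sum(abs(random.gauss(0, sigma)) for _ in range(n)) / n
print(est, sigma * math.sqrt(2 / math.pi))  # both ≈ 2.39
```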



Problem

Show that the constant in the normal distribution must be $\frac{1}{\sqrt{2 \pi}}$. That is, show that $$I=\int_{-\infty}^{\infty} e^{-\frac{x^2}{2}}dx=\sqrt{2 \pi}.$$ Hint: Write $I^2$ as a double integral in polar coordinates.

  • Solution
    • Let $I=\int_{-\infty}^{\infty} e^{-\frac{x^2}{2}}dx$. We show that $I^2=2\pi$. To see this, note

      $I^2$ $= \int_{-\infty}^{\infty} e^{-\frac{x^2}{2}}dx \int_{-\infty}^{\infty} e^{-\frac{y^2}{2}}dy$
      $=\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{-\frac{x^2+y^2}{2}}dxdy$.

      To evaluate this double integral, we can switch to polar coordinates via the change of variables $x=r \cos \theta$, $y=r \sin \theta$, with $dx \, dy=r \, dr \, d\theta$. In particular, we have
      $I^2$ $=\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{-\frac{x^2+y^2}{2}}dxdy$
      $=\int_{0}^{\infty} \int_{0}^{2\pi} e^{-\frac{r^2}{2}}r d\theta dr$
      $=2 \pi \int_{0}^{\infty} r e^{-\frac{r^2}{2}} dr$
      $=2 \pi \bigg[-e^{-\frac{r^2}{2}}\bigg]_{0}^{\infty}=2 \pi$.
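Since the integrand decays extremely fast, a simple midpoint rule on a truncated interval already reproduces $\sqrt{2\pi}$ to high accuracy:

```python
import math

# Midpoint rule on [-10, 10]; the tails beyond |x| = 10 are negligible
# (the integrand there is below e^{-50}).
n, a, b = 100_000, -10.0, 10.0
h = (b - a) / n
approx = h * sum(math.exp(-((a + (i + 0.5) * h) ** 2) / 2) for i in range(n))
print(approx, math.sqrt(2 * math.pi))  # both ≈ 2.5066
```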



Problem

Let $Z \sim N(0,1)$. Prove for all $x \geq 0$, $$\frac{1}{\sqrt{2\pi}} \frac{x}{x^2+1} e^{-\frac{x^2}{2}} \leq P(Z \geq x) \leq \frac{1}{\sqrt{2\pi}} \frac{1}{x} e^{-\frac{x^2}{2}}.$$

  • Solution
    • To show the upper bound, we can write

      $P(Z \geq x)$ $= \frac{1}{\sqrt{2 \pi}} \int_{x}^{\infty}e^{-\frac{u^2}{2}} du$
      $\leq \frac{1}{\sqrt{2 \pi}} \int_{x}^{\infty} \frac{u}{x} e^{-\frac{u^2}{2}} du \hspace{10pt} (\textrm{since $u \geq x > 0$})$
      $= \frac{1}{\sqrt{2 \pi}}\frac{1}{x} \left[-e^{-\frac{u^2}{2}}\right]^{\infty}_{x}$
      $=\frac{1}{\sqrt{2\pi}} \frac{1}{x} e^{-\frac{x^2}{2}}$.

      To show the lower bound, let $Q(x)=P(Z \geq x)$, and $$h(x)=Q(x)-\frac{1}{\sqrt{2\pi}} \frac{x}{x^2+1} e^{-\frac{x^2}{2}}, \hspace{10pt} \textrm{ for all }x \geq 0.$$ It suffices to show that $$h(x)\geq 0, \hspace{10pt} \textrm{ for all }x \geq 0.$$ To see this, note that the function $h$ has the following properties
      1. $h(0)=\frac{1}{2}$;
      2. $\lim \limits_{x \rightarrow \infty} h(x)=0$;
      3. $h'(x)=-\frac{2}{\sqrt{2\pi}}\left( \frac{e^{-\frac{x^2}{2}}}{(x^2+1)^2}\right) < 0$, for all $x \geq 0$.
      Therefore, $h$ is a strictly decreasing function: it starts at $h(0)=\frac{1}{2}$ and approaches $0$ as $x$ goes to infinity. We conclude that $h(x) \geq 0$ for all $x \geq 0$.
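The two bounds can be spot-checked numerically for a few values of $x$, computing the tail probability $P(Z \geq x)$ from the error function:

```python
import math

def Phi(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

c = 1.0 / math.sqrt(2 * math.pi)
for x in [0.5, 1.0, 2.0, 4.0]:
    tail = 1 - Phi(x)                                    # P(Z >= x)
    lower = c * x / (x ** 2 + 1) * math.exp(-x ** 2 / 2)
    upper = c / x * math.exp(-x ** 2 / 2)
    assert lower <= tail <= upper
    print(x, lower, tail, upper)
```

Note how the ratio of the two bounds, $(x^2+1)/x^2$, tends to $1$: for large $x$ the bounds pin down the tail probability very tightly.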


Problem

Let $X \sim Gamma(\alpha,\lambda)$, where $\alpha, \lambda > 0$. Find $EX$ and $Var(X)$.

  • Solution
    • To find $EX$, we can write $$ \begin{align*} EX &= \int_0^\infty x f_X(x) dx \\ &= \int_0^\infty x \cdot \frac{\lambda^{\alpha}}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\lambda x} {\rm d}x \\ &= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x \cdot x^{\alpha - 1} e^{-\lambda x} {\rm d}x \\ &= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha} e^{-\lambda x} {\rm d}x \\ &= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \frac{\Gamma(\alpha + 1)}{\lambda^{\alpha + 1}} &\textrm{(using Property 2 of the gamma function)} \\ &= \frac{\alpha\Gamma(\alpha)}{\lambda\Gamma(\alpha)} &\textrm{(using Property 3 of the gamma function)} \\ &= \frac{\alpha}{\lambda}. \end{align*} $$

      Similarly, we can find $EX^2$: $$ \begin{align*} EX^2 &= \int_0^{\infty} x^2 f_X(x) {\rm d}x \\ &= \int_0^{\infty} x^2 \cdot \frac{\lambda^{\alpha}}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\lambda x} {\rm d}x \\ &= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^2 \cdot x^{\alpha - 1} e^{-\lambda x} {\rm d}x \\ &= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \int_0^{\infty} x^{\alpha + 1} e^{-\lambda x} {\rm d}x \\ &= \frac{\lambda^{\alpha}}{\Gamma(\alpha)} \frac{\Gamma(\alpha + 2)}{\lambda^{\alpha + 2}} &\textrm{(using Property 2 of the gamma function)} \\ &= \frac{(\alpha + 1)\Gamma(\alpha + 1)}{\lambda^2 \Gamma(\alpha)} &\textrm{(using Property 3 of the gamma function)} \\ &= \frac{(\alpha + 1) \alpha \Gamma(\alpha)}{\lambda^2 \Gamma(\alpha)} &\textrm{(using Property 3 of the gamma function)} \\ &= \frac{\alpha (\alpha + 1)}{\lambda^2}. \end{align*} $$

      So, we conclude $$ \begin{align*} Var(X) &= EX^2 - (EX)^2 \\ &= \frac{\alpha (\alpha + 1)}{\lambda^2} - \frac{\alpha^2}{\lambda^2} \\ &= \frac{\alpha}{\lambda^2}. \end{align*} $$
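A Monte Carlo spot check of $EX=\alpha/\lambda$ and $Var(X)=\alpha/\lambda^2$ (the parameters $\alpha=2.5$ and $\lambda=1.5$ are arbitrary illustrative choices; note that `random.gammavariate` takes the scale $1/\lambda$, not the rate $\lambda$):

```python
import random

random.seed(3)

# Arbitrary illustrative parameters.
alpha, lam = 2.5, 1.5
n = 200_000
# random.gammavariate takes the shape alpha and the SCALE beta = 1/lambda.
samples = [random.gammavariate(alpha, 1 / lam) for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
print(mean, alpha / lam)       # both ≈ 1.667
print(var, alpha / lam ** 2)   # both ≈ 1.111
```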




The print version of the book is available on Amazon.
