9.2.0 End of Chapter Problems

Problem 1

Let $X$ be a continuous random variable with the following PDF \begin{align} \nonumber f_X(x) = \left\{ \begin{array}{l l} 6x(1-x) & \quad \textrm{if }0 \leq x \leq 1 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align} Suppose that we know \begin{align} Y \; | \; X=x \quad \sim \quad Geometric(x). \end{align} Find the posterior density of $X$ given $Y=2$, $f_{X|Y}(x|2)$.
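If you want to sanity-check your analytic answer numerically, the following Python/NumPy sketch builds the posterior on a grid. It assumes this book's $Geometric(p)$ convention with support $\{1,2,\dots\}$, i.e. $P_{Y|X}(y|x)=x(1-x)^{y-1}$.

```python
# Minimal grid-based check of the posterior (not the required analytic derivation).
import numpy as np

x = np.linspace(0.0, 1.0, 10001)
dx = x[1] - x[0]
prior = 6 * x * (1 - x)               # f_X(x) on [0, 1]
likelihood = x * (1 - x) ** (2 - 1)   # P(Y = 2 | X = x), assuming support {1, 2, ...}
unnorm = prior * likelihood           # proportional to f_{X|Y}(x|2)
posterior = unnorm / (unnorm.sum() * dx)

print("check normalization:", posterior.sum() * dx)
print("posterior mean:", (x * posterior).sum() * dx)
```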




Problem 2

Let $X$ be a continuous random variable with the following PDF \begin{align} \nonumber f_X(x) = \left\{ \begin{array}{l l} 3x^2 & \quad \textrm{if }0 \leq x \leq 1 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align} Also, suppose that \begin{align} Y \; | \; X=x \quad \sim \quad Geometric(x). \end{align} Find the MAP estimate of $X$ given $Y=5$.
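As a quick numerical check of the MAP estimate, the sketch below maximizes the log of the unnormalized posterior on a grid (same $Geometric(p)$ convention as above):

```python
# Grid maximization of log f_X(x) + log P(Y = 5 | X = x).
import numpy as np

x = np.linspace(1e-6, 1 - 1e-6, 200001)
log_post = np.log(3 * x**2) + np.log(x) + 4 * np.log(1 - x)
print("numerical MAP:", x[np.argmax(log_post)])
```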




Problem 3

Let $X$ and $Y$ be two jointly continuous random variables with joint PDF \begin{equation} \nonumber f_{XY}(x,y) = \left\{ \begin{array}{l l} x+\frac{3}{2} y^2 & \quad 0 \leq x,y \leq 1 \\ & \quad \\ 0 & \quad \text{otherwise}. \end{array} \right. \end{equation} Find the MAP and the ML estimates of $X$ given $Y=y$.
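A possible numerical cross-check for a fixed observed value $y$: the MAP estimate maximizes $f_{XY}(x,y)$ over $x$, while the ML estimate maximizes $f_{Y|X}(y|x)=f_{XY}(x,y)/f_X(x)$. Here $y=0.5$ is an arbitrary illustration value.

```python
# Grid search for the MAP and ML estimates at a fixed y.
import numpy as np

y = 0.5
x = np.linspace(1e-6, 1.0, 100001)
joint = x + 1.5 * y**2        # f_XY(x, y) as a function of x
f_X = x + 0.5                 # marginal: integral of (x + 1.5 t^2) over t in [0, 1]
print("MAP:", x[np.argmax(joint)])
print("ML: ", x[np.argmax(joint / f_X)])
```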




Problem 4

Let $X$ be a continuous random variable with the following PDF \begin{align} \nonumber f_X(x) = \left\{ \begin{array}{l l} 2x^2 + \frac{1}{3} & \quad \textrm{if }0 \leq x \leq 1 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align} We also know that \begin{align} \nonumber f_{Y|X}(y|x) = \left\{ \begin{array}{l l} xy-\frac{x}{2}+1 & \quad \textrm{if }0 \leq y \leq 1 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align} Find the MMSE estimate of $X$, given $Y=y$ is observed.
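To verify an analytic answer numerically, note that $E[X|Y=y]=\frac{\int_0^1 x f_X(x) f_{Y|X}(y|x)\,dx}{\int_0^1 f_X(x) f_{Y|X}(y|x)\,dx}$; a minimal sketch (with a hypothetical helper `mmse_estimate`):

```python
# Numerical E[X | Y = y] by grid integration.
import numpy as np

def mmse_estimate(y, n=100001):
    x = np.linspace(0.0, 1.0, n)
    weight = (2 * x**2 + 1/3) * (x * y - x / 2 + 1)   # f_X(x) * f_{Y|X}(y|x)
    return np.sum(x * weight) / np.sum(weight)        # the grid spacing cancels

print(mmse_estimate(0.5))   # numerical E[X | Y = 0.5]
```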




Problem 5

Let $X \sim N(0, 1)$ and \begin{align} Y=2X+W, \end{align} where $W \sim N(0, 1)$ is independent of $X$.

  1. Find the MMSE estimator of $X$ given $Y$, ($\hat{X}_M$).
  2. Find the MSE of this estimator, using $MSE=E[(X-\hat{X}_M)^2]$.
  3. Check that $E[X^2]=E[\hat{X}^2_M]+E[\tilde{X}^2]$, where $\tilde{X}=X-\hat{X}_M$ is the estimation error.
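A Monte Carlo sanity check for parts 2 and 3. Since $X$ and $Y$ are jointly normal with zero means, the MMSE estimator is linear, $\hat{X}_M=\frac{\textrm{Cov}(X,Y)}{\textrm{Var}(Y)}Y$; the sketch below estimates that coefficient empirically.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
X = rng.standard_normal(n)
W = rng.standard_normal(n)
Y = 2 * X + W

a = np.cov(X, Y)[0, 1] / np.var(Y)   # empirical Cov(X, Y) / Var(Y); analytically 2/5
X_hat = a * Y
X_err = X - X_hat                    # the estimation error X_tilde

print("MSE:", np.mean(X_err**2))
print("E[X^2]:", np.mean(X**2))
print("E[X_hat^2] + E[X_err^2]:", np.mean(X_hat**2) + np.mean(X_err**2))
```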




Problem 6

Suppose $X \sim Uniform(0,1)$, and given $X=x$, $Y \sim Exponential(\lambda=\frac{1}{2x})$.

  1. Find the linear MMSE estimate of $X$ given $Y$.
  2. Find the MSE of this estimator.
  3. Check that $E[\tilde{X} Y]=0$, where $\tilde{X}=X-\hat{X}_L$ is the estimation error.
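A Monte Carlo sketch for this problem: estimate the linear MMSE coefficients from simulated $(X,Y)$ pairs and check the orthogonality principle in part 3.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
X = rng.uniform(0.0, 1.0, n)
Y = rng.exponential(scale=2 * X)        # Exponential with rate 1/(2x) has mean 2x

a = np.cov(X, Y)[0, 1] / np.var(Y)
X_hat = a * (Y - Y.mean()) + X.mean()   # empirical linear MMSE estimate
X_err = X - X_hat

print("E[X_err * Y]:", np.mean(X_err * Y))   # should be close to 0
```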




Problem 7

Suppose that the signal $X \sim N(0,\sigma^2_X)$ is transmitted over a communication channel. Assume that the received signal is given by \begin{align} Y=X+W, \end{align} where $W \sim N(0,\sigma_W^2)$ is independent of $X$.

  1. Find the MMSE estimator of $X$ given $Y$, ($\hat{X}_M$).
  2. Find the MSE of this estimator.




Problem 8

Let $X$ be an unobserved random variable with $EX=0$, $\textrm{Var}(X)=5$. Assume that we have observed $Y_1$ and $Y_2$ given by \begin{align} Y_1&=2X+W_1,\\ Y_2&=X+W_2, \end{align} where $EW_1=EW_2=0$, $\textrm{Var}(W_1)=2$, and $\textrm{Var}(W_2)=5$. Assume that $W_1$, $W_2$, and $X$ are independent random variables. Find the linear MMSE estimator of $X$, given $Y_1$ and $Y_2$.
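Numerically, the linear MMSE coefficients solve the normal equations $\mathbf{C_Y}\mathbf{a}=\mathbf{C_{YX}}$, where every covariance follows from the model; a minimal sketch:

```python
# Solve for a in X_hat_L = a1*Y1 + a2*Y2 (all means are zero here).
import numpy as np

var_X, var_W1, var_W2 = 5.0, 2.0, 5.0
C_Y = np.array([[4 * var_X + var_W1, 2 * var_X],
                [2 * var_X,          var_X + var_W2]])   # covariance matrix of (Y1, Y2)
C_YX = np.array([2 * var_X, var_X])                      # [Cov(Y1, X), Cov(Y2, X)]

print("coefficients:", np.linalg.solve(C_Y, C_YX))
```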




Problem 9

Consider again Problem 8, in which $X$ is an unobserved random variable with $EX=0$, $\textrm{Var}(X)=5$. Assume that we have observed $Y_1$ and $Y_2$ given by \begin{align} Y_1&=2X+W_1,\\ Y_2&=X+W_2, \end{align} where $EW_1=EW_2=0$, $\textrm{Var}(W_1)=2$, and $\textrm{Var}(W_2)=5$. Assume that $W_1$, $W_2$, and $X$ are independent random variables. Find the linear MMSE estimator of $X$, given $Y_1$ and $Y_2$, using the vector formula \begin{align} \hat{\mathbf{X}}_L=\mathbf{C}_{\mathbf{XY}} \mathbf{C}_{\mathbf{Y}}^{-1} (\mathbf{Y}-E[\mathbf{Y}])+ E[\mathbf{X}]. \end{align}
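The formula translates almost line by line into code; the sketch below (with a hypothetical helper `x_hat_L`) keeps the mean terms even though they vanish here, since $E[X]=E[Y_i]=0$.

```python
import numpy as np

var_X, var_W1, var_W2 = 5.0, 2.0, 5.0
C_XY = np.array([[2 * var_X, var_X]])                    # 1 x 2 cross-covariance
C_Y = np.array([[4 * var_X + var_W1, 2 * var_X],
                [2 * var_X,          var_X + var_W2]])   # 2 x 2 covariance of Y
EX = 0.0
EY = np.zeros((2, 1))

def x_hat_L(y):
    """X_hat_L = C_XY C_Y^{-1} (y - E[Y]) + E[X] for an observed pair y = (y1, y2)."""
    y = np.asarray(y, dtype=float).reshape(2, 1)
    return (C_XY @ np.linalg.inv(C_Y) @ (y - EY) + EX).item()

print(x_hat_L([1.0, 2.0]))   # estimate for observed Y1 = 1, Y2 = 2
```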




Problem 10

Let $X$ be an unobserved random variable with $EX=0$, $\textrm{Var}(X)=5$. Assume that we have observed $Y_1$, $Y_2$, and $Y_3$ given by \begin{align} Y_1&=2X+W_1,\\ Y_2&=X+W_2, \\ Y_3&=X+2W_3, \end{align} where $EW_1=EW_2=EW_3=0$, $\textrm{Var}(W_1)=2$, $\textrm{Var}(W_2)=5$, and $\textrm{Var}(W_3)=3$. Assume that $W_1$, $W_2$, $W_3$, and $X$ are independent random variables. Find the linear MMSE estimator of $X$, given $Y_1$, $Y_2$, and $Y_3$.




Problem 11

Consider two random variables $X$ and $Y$ with the joint PMF given by the table below.

|       | $Y=0$         | $Y=1$         |
|-------|---------------|---------------|
| $X=0$ | $\frac{1}{7}$ | $\frac{3}{7}$ |
| $X=1$ | $\frac{3}{7}$ | $0$           |

  1. Find the linear MMSE estimator of $X$ given $Y$, ($\hat{X}_L$).
  2. Find the MMSE estimator of $X$ given $Y$, ($\hat{X}_M$).
  3. Find the MSE of $\hat{X}_M$.
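All the moments needed above can be read off the table, so a short script makes a convenient cross-check:

```python
import numpy as np

pmf = np.array([[1/7, 3/7],    # rows: X = 0, 1; columns: Y = 0, 1
                [3/7, 0.0]])
xs = np.array([0.0, 1.0])
ys = np.array([0.0, 1.0])

EX = np.sum(xs[:, None] * pmf)
EY = np.sum(ys[None, :] * pmf)
cov_XY = np.sum(np.outer(xs, ys) * pmf) - EX * EY
var_Y = np.sum(ys[None, :] ** 2 * pmf) - EY**2

print("X_hat_L slope:", cov_XY / var_Y)   # linear MMSE: slope * (Y - EY) + EX
for j, y in enumerate(ys):                # MMSE: E[X | Y = y] for each y
    print(f"E[X | Y = {y:g}] =", np.sum(xs * pmf[:, j]) / pmf[:, j].sum())
```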




Problem 12

Consider two random variables $X$ and $Y$ with the joint PMF given by the table below.

|       | $Y=0$         | $Y=1$         | $Y=2$         |
|-------|---------------|---------------|---------------|
| $X=0$ | $\frac{1}{6}$ | $\frac{1}{3}$ | $0$           |
| $X=1$ | $\frac{1}{3}$ | $0$           | $\frac{1}{6}$ |

  1. Find the linear MMSE estimator of $X$ given $Y$, ($\hat{X}_L$).
  2. Find the MSE of $\hat{X}_L$.
  3. Find the MMSE estimator of $X$ given $Y$, ($\hat{X}_M$).
  4. Find the MSE of $\hat{X}_M$.




Problem 13

Suppose that the random variable $X$ is transmitted over a communication channel. Assume that the received signal is given by \begin{align} Y=2X+W, \end{align} where $W \sim N(0,\sigma^2)$ is independent of $X$. Suppose that $X=1$ with probability $p$, and $X=-1$ with probability $1-p$. The goal is to decide between $X=-1$ and $X=1$ by observing the random variable $Y$. Find the MAP test for this problem.
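A Monte Carlo sketch of the MAP rule: decide $\hat{X}=1$ exactly when $p\,f_{Y|X}(y|1) \geq (1-p)\,f_{Y|X}(y|-1)$. The values $p=0.6$ and $\sigma=1$ are arbitrary illustration choices; the simulated error rate also previews the next problem.

```python
import numpy as np

rng = np.random.default_rng(0)
p, sigma, n = 0.6, 1.0, 10**6

X = np.where(rng.uniform(size=n) < p, 1.0, -1.0)
Y = 2 * X + sigma * rng.standard_normal(n)

# Gaussian likelihoods (the common 1/(sigma*sqrt(2*pi)) factor cancels)
f1 = np.exp(-(Y - 2) ** 2 / (2 * sigma**2))
f0 = np.exp(-(Y + 2) ** 2 / (2 * sigma**2))
X_hat = np.where(p * f1 >= (1 - p) * f0, 1.0, -1.0)

print("average error probability:", np.mean(X_hat != X))
```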




Problem 14

Find the average error probability in Problem 13.




Problem 15

A monitoring system is in charge of detecting malfunctioning machinery in a facility. There are two hypotheses to choose from:

$\quad$ $H_0$: There is no malfunction,

$\quad$ $H_1$: There is a malfunction.

The system notifies a maintenance team if it accepts $H_1$. Suppose that, after processing the data, we obtain $P(H_1|y)=0.10$. Also, assume that the cost of missing a malfunction is 30 times the cost of a false alarm. Should the system alert a maintenance team (accept $H_1$)?




Problem 16

Let $X$ and $Y$ be jointly normal and $X \sim N(2,1)$, $Y \sim N(1,5)$, and $\rho(X,Y)=\frac{1}{4}$. Find a $90\%$ credible interval for $X$, given $Y=1$ is observed.
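Using the standard conditional-distribution formulas for jointly normal random variables, $X|Y=y$ is normal with mean $\mu_X+\rho\frac{\sigma_X}{\sigma_Y}(y-\mu_Y)$ and variance $(1-\rho^2)\sigma_X^2$, so the interval can be cross-checked with SciPy:

```python
import numpy as np
from scipy.stats import norm

mu_X, var_X = 2.0, 1.0
mu_Y, var_Y = 1.0, 5.0
rho, y = 0.25, 1.0

cond_mean = mu_X + rho * np.sqrt(var_X / var_Y) * (y - mu_Y)
cond_var = (1 - rho**2) * var_X

lo, hi = norm.interval(0.90, loc=cond_mean, scale=np.sqrt(cond_var))
print("90% credible interval:", (lo, hi))
```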




Problem 17

When the choice of a prior distribution is subjective, it is often advantageous to choose a prior that yields a posterior distribution in the same distributional family. When the prior and posterior distributions share the same family, they are called conjugate distributions, and the prior is called a conjugate prior. Conjugate priors are convenient because they always produce a closed-form posterior distribution. One example is the gamma prior for Poisson-distributed data. Assume our data $Y$ given $X$ is distributed $Y \; | \; X=x \sim Poisson(\lambda = x)$ and we choose the prior to be $X \sim Gamma(\alpha,\beta).$ Then the PMF for our data is $$P_{Y|X}(y|x) = \frac{e^{-x}x^y}{y!}, \quad \text{for } x>0, y \in \{0,1, 2, \dots\},$$ and the PDF of the prior is given by $$f_X(x) = \frac{\beta^\alpha x^{\alpha-1}e^{-\beta x}}{\Gamma(\alpha)}, \quad \text{for } x>0, \; \alpha,\beta>0.$$

  1. Show that the posterior distribution is $Gamma(\alpha + y, \beta + 1)$. (Hint: Collect all the terms not containing $x$ into a normalizing constant $c$, and note that $f_{X|Y}(x|y) \propto P_{Y|X}(y|x)f_X(x)$.)
  2. Write out the PDF for the posterior distribution, $f_{X|Y}(x|y)$.
  3. Find the mean and variance of the posterior distribution, $E[X|Y]$ and $\textrm{Var}(X|Y)$.
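A grid-based sanity check of the conjugacy claim: the normalized product of prior and likelihood should coincide with the $Gamma(\alpha+y,\beta+1)$ density. The values of $\alpha$, $\beta$, and $y$ below are arbitrary illustration choices.

```python
import numpy as np
from scipy.stats import gamma, poisson

alpha, beta, y = 2.0, 3.0, 4

x = np.linspace(1e-6, 10.0, 100001)
dx = x[1] - x[0]
unnorm = gamma.pdf(x, a=alpha, scale=1/beta) * poisson.pmf(y, mu=x)  # rate beta -> scale 1/beta
posterior = unnorm / (unnorm.sum() * dx)

claimed = gamma.pdf(x, a=alpha + y, scale=1/(beta + 1))
print("max abs difference:", np.max(np.abs(posterior - claimed)))
```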




Problem 18

Assume our data $Y$ given $X$ is distributed $Y \; | \; X=x \sim Binomial(n, p=x)$ and we choose the prior to be $X \sim Beta(\alpha,\beta).$ Then the PMF for our data is $$P_{Y|X}(y|x) = \binom{n}{y} x^y (1-x)^{n-y}, \quad \text{for } x \in [0,1], y \in \{0,1, \dots,n \},$$ and the PDF of the prior is given by $$f_X(x) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}x^{\alpha - 1}(1-x)^{\beta-1}, \quad \text{for } 0\leq x \leq 1, \alpha >0, \beta>0.$$ Note that $EX = \frac{\alpha}{\alpha+\beta}$ and $\textrm{Var}(X) = \frac{\alpha \beta}{(\alpha+\beta)^2(\alpha+\beta+1)}.$

  1. Show that the posterior distribution is $Beta(\alpha + y, \beta + n - y)$.
  2. Write out the PDF for the posterior distribution, $f_{X|Y}(x|y)$.
  3. Find the mean and variance of the posterior distribution, $E[X|Y]$ and $\textrm{Var}(X|Y)$.




Problem 19

Assume our data $Y$ given $X$ is distributed $Y \; | \; X=x \sim Geometric(p=x)$ and we choose the prior to be $X \sim Beta(\alpha,\beta).$ Refer to Problem 18 for the PDF and moments of the $Beta$ distribution.

  1. Show that the posterior distribution is $Beta(\alpha + 1, \beta + y - 1)$.
  2. Write out the PDF for the posterior distribution, $f_{X|Y}(x|y)$.
  3. Find the mean and variance of the posterior distribution, $E[X|Y]$ and $\textrm{Var}(X|Y)$.




Problem 20

Assume our data $\textbf{Y} = (Y_1, Y_2, \dots, Y_n)^T$ given $X$ is independent and identically distributed, $Y_i \; | \; X=x \stackrel{i.i.d.}{\sim} Exponential(\lambda=x)$, and we choose the prior to be $X \sim Gamma(\alpha,\beta).$

  1. Find the likelihood function, $L(\textbf{Y}; X) =f_{Y_1, Y_2, \dots, Y_n|X}(y_1, y_2, \dots, y_n|x)$.
  2. Using the likelihood function of the data, show that the posterior distribution is $Gamma(\alpha + n, \beta + \sum_{i=1}^n y_i)$.
  3. Write out the PDF for the posterior distribution, $f_{X|\textbf{Y}}(x|\textbf{y})$.
  4. Find the mean and variance of the posterior distribution, $E[X|\textbf{Y}]$ and $\textrm{Var}(X|\textbf{Y})$.
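The same kind of grid check works for the i.i.d. exponential model; here $\alpha$, $\beta$, $n$, and the sampled data are arbitrary illustration choices.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)
alpha, beta, n = 2.0, 1.0, 5
y = rng.exponential(scale=1/1.5, size=n)   # Exponential with rate x = 1.5 has scale 1/x

xs = np.linspace(1e-6, 15.0, 200001)
dx = xs[1] - xs[0]
loglik = n * np.log(xs) - xs * y.sum()     # log of prod_i x * exp(-x * y_i)
unnorm = gamma.pdf(xs, a=alpha, scale=1/beta) * np.exp(loglik)
posterior = unnorm / (unnorm.sum() * dx)

claimed = gamma.pdf(xs, a=alpha + n, scale=1/(beta + y.sum()))
print("max abs difference:", np.max(np.abs(posterior - claimed)))
```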





