## 9.1.4 Conditional Expectation (MMSE)

The conditional expectation minimizes the *mean squared error*. For this reason, the conditional expectation is called the *minimum mean squared error (MMSE) estimate* of $X$. It is also called the *least mean squares (LMS) estimate* or simply the *Bayes' estimate* of $X$.

__Minimum Mean Squared Error (MMSE) Estimation__

The **minimum mean squared error (MMSE)** estimate of the random variable $X$, given that we have observed $Y=y$, is given by \begin{align} \hat{x}_{M}=E[X|Y=y]. \end{align}
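To see why the conditional expectation earns the name MMSE, the following sketch compares it against other constant estimates on a small, made-up joint PMF (the specific probabilities below are illustrative assumptions, not from the text): among all constants $a$, the conditional mean minimizes $E[(X-a)^2 \mid Y=y]$.

```python
# Illustrative toy joint PMF P(X = x, Y = y); the numbers here are
# arbitrary assumptions chosen only to demonstrate the MMSE property.
pmf = {(0, 0): 0.1, (1, 0): 0.2, (2, 0): 0.1,
       (0, 1): 0.1, (1, 1): 0.1, (2, 1): 0.4}

y = 1
cond = {x: p for (x, yy), p in pmf.items() if yy == y}
total = sum(cond.values())
post = {x: p / total for x, p in cond.items()}   # posterior P(X = x | Y = y)

x_hat = sum(x * p for x, p in post.items())      # E[X | Y = y]

def mse(a):
    # Conditional mean squared error E[(X - a)^2 | Y = y].
    return sum((x - a) ** 2 * p for x, p in post.items())

# Scan a grid of candidate estimates: none beats the conditional mean.
candidates = [i / 100 for i in range(201)]
best = min(candidates, key=mse)
assert mse(x_hat) <= mse(best) + 1e-12
print(x_hat, round(mse(x_hat), 4))
```

The grid search is just a sanity check; the optimality of $E[X|Y=y]$ holds over all real-valued estimates, not only grid points.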

Example

Let $X$ be a continuous random variable with the following PDF \begin{align} \nonumber f_X(x) = \left\{ \begin{array}{l l} 2x & \quad \textrm{if }0 \leq x \leq 1 \\ 0 & \quad \textrm{otherwise} \end{array} \right. \end{align} We also know that \begin{align} \nonumber f_{Y|X}(y|x) = \left\{ \begin{array}{l l} 2xy-x+1 & \quad \textrm{if }0 \leq y \leq 1 \\ 0 & \quad \textrm{otherwise} \end{array} \right. \end{align} Find the MMSE estimate of $X$, given that $Y=y$ is observed.

**Solution**

First we need to find the posterior density, $f_{X|Y}(x|y)$. We have \begin{align} f_{X|Y}(x|y)=\frac{f_{Y|X}(y|x)f_{X}(x)}{f_{Y}(y)}. \end{align} We can find $f_{Y}(y)$ as \begin{align} f_Y(y)&=\int_{0}^{1} f_{Y|X}(y|x)f_{X}(x) dx\\ &=\int_{0}^{1} (2xy-x+1) 2x dx\\ &=\frac{4}{3}y+\frac{1}{3}, \quad \textrm{ for }0 \leq y \leq 1. \end{align} We conclude \begin{align} f_{X|Y}(x|y)=\frac{6x(2xy-x+1)}{4y+1}, \quad \textrm{ for }0 \leq x \leq 1. \end{align} The MMSE estimate of $X$ given $Y=y$ is then given by \begin{align} \hat{x}_{M}&=E[X|Y=y]\\ &=\int_{0}^{1} x f_{X|Y}(x|y) dx \\ &=\frac{1}{4y+1} \int_{0}^{1} 6x^2(2xy-x+1) dx\\ &=\frac{3y+ \frac{1}{2}}{4y+1}. \end{align}
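The closed-form answers above are easy to double-check numerically. The sketch below integrates the example's densities with a simple midpoint rule (the grid size and the test point $y=0.7$ are arbitrary choices) and confirms both the marginal $f_Y(y)=\frac{4y+1}{3}$ and the MMSE estimate $\hat{x}_M=\frac{3y+\frac{1}{2}}{4y+1}$.

```python
# Numerical check of the worked example using the densities given in
# the text; the midpoint-rule resolution n and the point y = 0.7 are
# arbitrary choices for this sketch.

def f_X(x):
    return 2 * x if 0 <= x <= 1 else 0.0

def f_Y_given_X(y, x):
    return 2 * x * y - x + 1 if 0 <= y <= 1 else 0.0

def integrate(g, a=0.0, b=1.0, n=100_000):
    # Midpoint rule on [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

y = 0.7

# Marginal f_Y(y) = ∫ f_{Y|X}(y|x) f_X(x) dx; closed form (4y + 1)/3.
fy = integrate(lambda x: f_Y_given_X(y, x) * f_X(x))
assert abs(fy - (4 * y + 1) / 3) < 1e-6

# MMSE estimate E[X | Y = y]; closed form (3y + 1/2)/(4y + 1).
x_hat = integrate(lambda x: x * f_Y_given_X(y, x) * f_X(x)) / fy
assert abs(x_hat - (3 * y + 0.5) / (4 * y + 1)) < 1e-6
print(round(x_hat, 4))
```

Note that the posterior $f_{X|Y}$ never has to be formed explicitly here: dividing $\int x\, f_{Y|X}(y|x) f_X(x)\, dx$ by $f_Y(y)$ applies Bayes' rule implicitly, mirroring the algebra in the solution.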