10.2.3 Power in a Frequency Band

Here, we would like to show that if you integrate $S_X(f)$ over a frequency range, you will obtain the expected power in $X(t)$ in that frequency range. Let's first define what we mean by the expected power "in a frequency range." Consider a WSS random process $X(t)$ that goes through an LTI system with the following transfer function (Figure 10.7): \begin{align*} H(f)=\left\{ \begin{array}{l l} 1 & \quad f_1 \lt |f| \lt f_2 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align*}
Figure 10.7 - A bandpass filter.

This is in fact a bandpass filter. This filter eliminates every frequency outside of the frequency band $f_1 \lt |f| \lt f_2$. Thus, the resulting random process $Y(t)$ is a filtered version of $X(t)$ in which frequency components in the frequency band $f_1 \lt |f| \lt f_2$ are preserved. The expected power in $Y(t)$ is said to be the expected power in $X(t)$ in the frequency range $f_1 \lt |f| \lt f_2$.

Now, let's find the expected power in $Y(t)$. We have

\begin{align*} S_Y(f)=S_X(f)|H(f)|^2=\left\{ \begin{array}{l l} S_X(f) & \quad f_1 \lt |f| \lt f_2 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align*} Thus, the expected power in $Y(t)$ is \begin{align*} E[Y(t)^2] & =\int_{-\infty}^{\infty} S_Y(f) \; df \\ &=\int_{-f_2}^{-f_1} S_X(f) \; df+\int_{f_1}^{f_2} S_X(f) \; df\\ &=2\int_{f_1}^{f_2} S_X(f) \; df \quad \big(\textrm{since }S_X(-f)=S_X(f)\big) \end{align*} Therefore, we conclude that, if we integrate $S_X(f)$ over the frequency range $f_1 \lt |f| \lt f_2$, we will obtain the expected power in $X(t)$ in that frequency range. That is why $S_X(f)$ is called the power spectral density of $X(t)$.
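As a quick numerical check of this identity, the sketch below integrates a hypothetical even PSD, $S_X(f)=\frac{1}{1+f^2}$ (chosen only for illustration; it is not from this section), over the band $1 \lt |f| \lt 2$, and confirms that the two-sided integral equals twice the one-sided integral because $S_X(-f)=S_X(f)$:

```python
from scipy.integrate import quad

def S_X(f):
    # Hypothetical even PSD (Lorentzian shape), used only for illustration
    return 1.0 / (1.0 + f**2)

f1, f2 = 1.0, 2.0
neg, _ = quad(S_X, -f2, -f1)   # power in the negative-frequency part of the band
pos, _ = quad(S_X, f1, f2)     # power in the positive-frequency part of the band
band_power = neg + pos

# Since S_X(-f) = S_X(f), the band power equals 2 * (one-sided integral)
assert abs(band_power - 2.0 * pos) < 1e-9
print(band_power)
```

For this particular PSD the answer can also be found in closed form, $2\big(\arctan 2-\arctan 1\big)$, which the numerical integral matches.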

Gaussian Processes through LTI Systems:

Let $X(t)$ be a stationary Gaussian random process that goes through an LTI system with impulse response $h(t)$. Then, the output process is given by \begin{align*} Y(t)&=h(t)\ast X(t)\\ &=\int_{-\infty}^{\infty} h(\alpha)X(t-\alpha) \; d\alpha. \end{align*} For each $t$, you can think of the above integral as a limit of a sum. Now, since sums of jointly normal random variables are also jointly normal, you can argue that $Y(t)$ is also a Gaussian random process. Indeed, we can conclude that $X(t)$ and $Y(t)$ are jointly normal. Note that, for Gaussian processes, stationarity and wide-sense stationarity are equivalent.
Let $X(t)$ be a stationary Gaussian process. If $X(t)$ is the input to an LTI system, then the output random process, $Y(t)$, is also a stationary Gaussian process. Moreover, $X(t)$ and $Y(t)$ are jointly Gaussian.
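A discrete-time simulation illustrates this fact. Below, an i.i.d. (white) Gaussian sequence is passed through a hypothetical FIR filter $h=[0.25, 0.5, 0.25]$ (the filter is an illustrative choice, not from this section). The output remains Gaussian, with mean $\mu_X \sum_k h[k]=0$ and variance $\sum_k h[k]^2 = 0.375$, which the sample statistics confirm:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)     # white Gaussian input: R_X[k] = delta[k]
h = np.array([0.25, 0.5, 0.25])      # hypothetical FIR impulse response
y = np.convolve(x, h, mode="valid")  # LTI filtering of the input process

# Output is Gaussian with mean 0 and variance sum(h^2) = 0.375
print(y.mean(), y.var())
```

Because the input samples are independent, each output sample is a weighted sum of jointly normal variables, so it is itself normal, mirroring the limit-of-sums argument above.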


Example
Let $X(t)$ be a zero-mean Gaussian random process with $R_X(\tau)=8 \; \textrm{sinc}(4 \tau)$. Suppose that $X(t)$ is input to an LTI system with transfer function \begin{align*} H(f)=\left\{ \begin{array}{l l} \frac{1}{2} & \quad |f| \lt 1 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align*} If $Y(t)$ is the output, find $P(Y(2) \lt 1|Y(1)=1)$.
  • Solution
    • Since $X(t)$ is a WSS Gaussian process, $Y(t)$ is also a WSS Gaussian process. Thus, it suffices to find $\mu_Y$ and $R_Y(\tau)$. Since $\mu_X=0$, we have \begin{align*} \mu_Y=\mu_X H(0)=0. \end{align*} Also, note that \begin{align*} S_X(f)&=\mathcal{F} \{R_X(\tau)\}\\ &=\left\{ \begin{array}{l l} 2 & \quad |f| \lt 2 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align*} We can then find $S_Y(f)$ as \begin{align*} S_Y(f)&=S_X(f)|H(f)|^2\\ &=\left\{ \begin{array}{l l} \frac{1}{2} & \quad |f| \lt 1 \\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{align*} Thus, $R_Y(\tau)$ is given by \begin{align*} R_Y(\tau)&=\mathcal{F}^{-1} \{S_Y(f)\}\\ &=\textrm{sinc}(2 \tau). \end{align*} Therefore, \begin{align*} E[Y(t)^2]=R_Y(0)=1. \end{align*} We conclude that $Y(t) \sim N(0,1)$, for all $t$. Since $Y(1)$ and $Y(2)$ are jointly Gaussian, to determine their joint PDF, it only remains to find their covariance. We have \begin{align*} E[Y(1)Y(2)]&=R_Y(-1)\\ &=\textrm{sinc}(-2)\\ &=\frac{\sin (-2 \pi)}{-2 \pi}\\ &=0. \end{align*} Since $E[Y(1)]=E[Y(2)]=0$, we conclude that $Y(1)$ and $Y(2)$ are uncorrelated. Since $Y(1)$ and $Y(2)$ are jointly normal, we conclude that they are independent, so \begin{align*} P(Y(2) \lt 1|Y(1)=1)&=P(Y(2) \lt 1)\\ &=\Phi(1) \approx 0.84. \end{align*}
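The last two steps of the solution can be verified numerically: the covariance $R_Y(1-2)=\textrm{sinc}(-2)$ vanishes, and the final probability is the standard normal CDF at $1$, computed here via the error function:

```python
from math import erf, sqrt, sin, pi

def R_Y(tau):
    # Autocorrelation from the example: R_Y(tau) = sinc(2*tau)
    return 1.0 if tau == 0 else sin(2 * pi * tau) / (2 * pi * tau)

def Phi(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Cov(Y(1), Y(2)) = R_Y(-1) = sinc(-2) = 0, so Y(1) and Y(2) are independent
assert abs(R_Y(-1.0)) < 1e-12

# P(Y(2) < 1 | Y(1) = 1) = P(Y(2) < 1) = Phi(1)
print(Phi(1.0))
```

Running this prints $\Phi(1)\approx 0.8413$, matching the answer above.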



