10.1.3 Multiple Random Processes

We often need to study more than one random process. For example, when investing in the stock market, you consider several different stocks and are interested in how they are related. In particular, you might want to know whether two stocks are positively or negatively correlated. Useful tools in these situations are the cross-correlation and cross-covariance functions.
For two random processes $\big\{X(t), t \in J \big\}$ and $\big\{Y(t), t \in J \big\}$:
$-$ the cross-correlation function $R_{XY}(t_1,t_2)$ is defined by \begin{align} R_{XY}(t_1,t_2)=E[X(t_1)Y(t_2)], \quad \textrm{for }t_1,t_2 \in J; \end{align}
$-$ the cross-covariance function $C_{XY}(t_1,t_2)$ is defined by \begin{align} C_{XY}(t_1,t_2)&=\textrm{Cov}\big(X(t_1),Y(t_2)\big)\\ &=R_{XY}(t_1,t_2)-\mu_X(t_1) \mu_Y(t_2), \quad \textrm{for }t_1,t_2 \in J. \end{align}
To get an idea about these concepts, suppose that $X(t)$ is the price of oil (per gallon) and $Y(t)$ is the price of gasoline (per gallon) at time $t$. Since gasoline is produced from oil, gasoline prices tend to increase as oil prices increase. Thus, we conclude that $X(t)$ and $Y(t)$ should be positively correlated (at least for the same $t$, i.e., $C_{XY}(t,t)>0$).
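As a computational aside, these functions can be estimated by averaging over many independent sample paths. Below is a minimal Python/NumPy sketch; the function name `cross_stats` and the synthetic "price" samples are our own illustrative assumptions, not from the text. It estimates $R_{XY}(t_1,t_2)$ and $C_{XY}(t_1,t_2)$ from paired samples of $X(t_1)$ and $Y(t_2)$, one pair per sample path.

```python
import numpy as np

def cross_stats(x_t1, y_t2):
    """Empirical R_XY(t1,t2) and C_XY(t1,t2) from paired samples
    of X(t1) and Y(t2), one pair per sample path."""
    x = np.asarray(x_t1, dtype=float)
    y = np.asarray(y_t2, dtype=float)
    R = np.mean(x * y)               # estimates E[X(t1) Y(t2)]
    C = R - x.mean() * y.mean()      # estimates R_XY - mu_X(t1) mu_Y(t2)
    return R, C

# Illustrative use with hypothetical, positively related "prices":
rng = np.random.default_rng(0)
oil = rng.normal(3.0, 0.5, 100_000)               # stand-in for X(t) samples
gas = 1.2 * oil + rng.normal(0.5, 0.2, 100_000)   # stand-in for Y(t) samples
R, C = cross_stats(oil, gas)
print(R, C)   # C should be positive, reflecting positive correlation
```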

Example
Let $A$, $B$, and $C$ be independent normal $N(1,1)$ random variables. Let $\big\{X(t), t \in [0,\infty) \big\}$ be defined as \begin{align} X(t)=A+Bt, \quad \textrm{ for all }t \in [0,\infty). \end{align} Also, let $\big\{Y(t), t \in [0,\infty) \big\}$ be defined as \begin{align}%\label{} Y(t)=A+Ct, \quad \textrm{ for all }t \in [0,\infty). \end{align} Find $R_{XY}(t_1,t_2)$ and $C_{XY}(t_1,t_2)$, for $t_1,t_2 \in [0,\infty)$.
  • Solution
    • First, note that \begin{align}%\label{} \mu_X(t)&=E[X(t)]\\ & =EA+EB \cdot t\\ &=1+t, \quad \textrm{ for all }t \in [0,\infty). \end{align} Similarly, \begin{align}%\label{} \mu_Y(t)&=E[Y(t)]\\ & =EA+EC \cdot t\\ &=1+t, \quad \textrm{ for all }t \in [0,\infty). \end{align} To find $R_{XY}(t_1,t_2)$ for $t_1,t_2 \in [0,\infty)$, we write \begin{align}%\label{} R_{XY}(t_1,t_2)&=E[X(t_1)Y(t_2)]\\ &=E \big[(A+Bt_1)(A+Ct_2)\big] \\ &=E \big[A^2+ACt_2+BAt_1+BCt_1t_2\big]\\ &=E[A^2]+E[AC]t_2+E[BA]t_1+E[BC] t_1t_2\\ &=E[A^2]+E[A]E[C]t_2+E[B]E[A]t_1+E[B]E[C] t_1t_2, \quad (\textrm{by independence})\\ &=2+t_1+t_2+t_1t_2. \end{align} To find $C_{XY}(t_1,t_2)$ for $t_1,t_2 \in [0,\infty)$, we write \begin{align}%\label{} C_{XY}(t_1,t_2)&=R_{XY}(t_1,t_2)-\mu_X(t_1) \mu_Y(t_2)\\ &=\big(2+t_1+t_2+t_1t_2\big)-\big(1+t_1\big)\big(1+t_2\big)\\ &=1. \end{align}
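To double-check the algebra, here is a quick Monte Carlo sketch; the sample size, seed, and the choice $t_1=0.5$, $t_2=2$ are arbitrary assumptions for illustration. It simulates $A$, $B$, $C$ and verifies that the empirical values match $R_{XY}(t_1,t_2)=2+t_1+t_2+t_1t_2$ and $C_{XY}(t_1,t_2)=1$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
t1, t2 = 0.5, 2.0

# A, B, C ~ N(1, 1), independent
A = rng.normal(1, 1, n)
B = rng.normal(1, 1, n)
C = rng.normal(1, 1, n)

X_t1 = A + B * t1   # X(t1) = A + B*t1
Y_t2 = A + C * t2   # Y(t2) = A + C*t2

R_hat = np.mean(X_t1 * Y_t2)                  # estimate of R_XY(t1, t2)
C_hat = R_hat - X_t1.mean() * Y_t2.mean()     # estimate of C_XY(t1, t2)

print(R_hat, 2 + t1 + t2 + t1 * t2)   # should be close to each other
print(C_hat)                          # should be close to 1
```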


Independent Random Processes:

We have seen independence for random variables. In particular, remember that random variables $X_1$, $X_2$,...,$X_n$ are independent if, for all $(x_1,x_2,...,x_n) \in \mathbb{R}^n$, we have \begin{align}%\label{} \nonumber F_{X_1,X_2,...,X_n}(x_1,x_2,...,x_n)= F_{X_1}(x_1)F_{X_2}(x_2)...F_{X_n}(x_n). \end{align} Now, note that a random process is a collection of random variables. Thus, we can define the concept of independence for random processes, too. In particular, if for two random processes $X(t)$ and $Y(t)$ every collection of random variables $X(t_i)$ is independent of every collection of random variables $Y(t_j)$, we say that the two random processes are independent. More precisely, we have the following definition:
Two random processes $\big\{X(t), t \in J \big\}$ and $\big\{Y(t), t \in J' \big\}$ are said to be independent if, for all \begin{align}%\label{} & t_1,t_2, \dots, t_m \in J\\ & \quad \quad \textrm{and}\\ & t'_1,t'_2, \dots, t'_n \in J', \end{align} the set of random variables \begin{align}%\label{} & X(t_1), X(t_2), \cdots, X(t_m) \end{align} are independent of the set of random variables \begin{align} & Y(t'_1), Y(t'_2), \cdots, Y(t'_n). \end{align}
The above definition implies that for all real numbers $x_1,x_2,\cdots, x_m$ and $y_1,y_2,\cdots, y_n$, we have \begin{align}%\label{} \nonumber &F_{X(t_1), X(t_2), \cdots, X(t_m), Y(t'_1), Y(t'_2), \cdots, Y(t'_n)}(x_1,x_2,\cdots,x_m,y_1,y_2,\cdots, y_n) \\ & \quad = F_{X(t_1), X(t_2), \cdots, X(t_m)}(x_1,x_2,\cdots,x_m) \cdot F_{Y(t'_1), Y(t'_2), \cdots, Y(t'_n)}(y_1,y_2,\cdots, y_n). \end{align} The above equation might seem complicated; however, in many real-life applications, we can often argue that two random processes are independent by looking at the structure of the problem. For example, in engineering we can reasonably assume that the thermal noise processes in two separate systems are independent. Note that if two random processes $X(t)$ and $Y(t)$ are independent, then their cross-covariance function, $C_{XY}(t_1,t_2)$, is zero for all $t_1$ and $t_2$: \begin{align}%\label{} C_{XY}(t_1,t_2)&=\textrm{Cov}\big(X(t_1),Y(t_2)\big)\\ &=0 \quad (\textrm{since $X(t_1)$ and $Y(t_2)$ are independent}). \end{align}
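As a quick numerical sanity check (a sketch with arbitrarily chosen parameters, not from the text), we can build two processes from disjoint sets of independent variables, mirroring the earlier example, and confirm that the empirical cross-covariance is near zero.

```python
import numpy as np

# Hypothetical processes X(t) = A + B*t and Y(t) = D + E*t, where
# A, B, D, E ~ N(1, 1) are mutually independent. Since X and Y share
# no randomness, C_XY(t1, t2) should vanish for every (t1, t2).
rng = np.random.default_rng(1)
n = 1_000_000
A, B, D, E = (rng.normal(1, 1, n) for _ in range(4))

t1, t2 = 0.7, 3.0
X_t1 = A + B * t1
Y_t2 = D + E * t2

C_hat = np.mean(X_t1 * Y_t2) - X_t1.mean() * Y_t2.mean()
print(C_hat)   # should be close to 0
```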

