10.1.0 Basic Concepts

In real-life applications, we are often interested in multiple observations of random values over a period of time. For example, suppose that you are observing the stock price of a company over the next few months. In particular, let $S(t)$ be the stock price at time $t\in[0,\infty)$. Here, we assume that $t=0$ refers to the current time. Figure 10.1 shows a possible outcome of this random experiment from time $t=0$ to time $t=1$.
Figure 10.1 - A possible realization of values of a stock observed as a function of time. Here, $S(t)$ is an example of a random process.
Note that at any fixed time $t_1 \in [0,\infty)$, $S(t_1)$ is a random variable. Based on your knowledge of finance and the historical data, you might be able to provide a PDF for $S(t_1)$. If you choose another time $t_2\in [0,\infty)$, you obtain another random variable $S(t_2)$ that could potentially have a different PDF. When we consider the values of $S(t)$ for $t \in [0,\infty)$ collectively, we say $S(t)$ is a random process or a stochastic process. We may show this process by \begin{align}%\label{} \big\{S(t), t \in [0,\infty) \big\}. \end{align} Therefore, a random process is a collection of random variables usually indexed by time (or sometimes by space).
A random process is a collection of random variables usually indexed by time.
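To make this concrete, here is a minimal simulation sketch (not from the text): since no model for $S(t)$ is specified, we use a toy multiplicative random walk as a stand-in, and observe that fixing a time $t_1$ and looking across realizations gives samples of the single random variable $S(t_1)$.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy stand-in for S(t): a multiplicative random walk on a time grid.
# (The text specifies no model for S(t); this choice is purely illustrative.)
n_steps = 250                    # grid points covering t in [0, 1]
n_paths = 1000                   # independent realizations of the experiment
dt = 1.0 / n_steps

# Each row of S is one realization (one sample path) of the process.
shocks = rng.normal(loc=0.0, scale=np.sqrt(dt), size=(n_paths, n_steps))
S = 100.0 * np.exp(np.cumsum(shocks, axis=1))   # S(0) = 100 for every path

# Fix a time t1: the column S[:, k] collects samples of the random
# variable S(t1) across realizations -- one random variable per time point.
k = n_steps // 2                 # roughly t1 = 0.5
print("mean of S(0.5) over realizations:", S[:, k].mean())
print("std  of S(0.5) over realizations:", S[:, k].std())
```

Each row of `S` is one function of time; each column is one random variable, which is exactly the two ways of reading $\big\{S(t), t \in [0,\infty)\big\}$.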
The process $S(t)$ mentioned here is an example of a continuous-time random process. In general, when we have a random process $X(t)$ where $t$ can take real values in an interval on the real line, then $X(t)$ is a continuous-time random process. Here are a few more examples of continuous-time random processes:
$-$ Let $N(t)$ be the number of customers who have visited a bank from $t=9$ (when the bank opens at 9:00 am) until time $t$, on a given day, for $t\in[9,16]$. Here, we measure $t$ in hours, but $t$ can take any real value between $9$ and $16$. We assume that $N(9)=0$, and $N(t) \in \{0,1,2,...\}$ for all $t \in [9,16]$. Note that for any time $t_1$, the random variable $N(t_1)$ is a discrete random variable. Thus, $N(t)$ is a discrete-valued random process. However, since $t$ can take any real value between $9$ and $16$, $N(t)$ is a continuous-time random process.

$-$ Let $W(t)$ be the thermal noise voltage generated across a resistor in an electric circuit at time $t$, for $t\in [0,\infty)$. Here, $W(t)$ can take real values.

$-$ Let $T(t)$ be the temperature in New York City at time $t \in [0, \infty)$. We can assume here that $t$ is measured in hours and $t=0$ refers to the time we start measuring the temperature.
In all of these examples, we are dealing with an uncountable number of random variables. For example, for any given $t_1 \in [9,16]$, $N(t_1)$ is a random variable. Thus, the random process $N(t)$ consists of an uncountable number of random variables. A random process can be defined on the entire real line, i.e., $t \in (-\infty,\infty)$. In fact, it is sometimes convenient to assume that the process starts at $t=-\infty$ even if we are interested in $X(t)$ only on a finite interval. For example, we can assume that the process $T(t)$ defined above is a random process defined for all $t \in \mathbb{R}$, although we get to observe only a finite portion of it.
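As an illustration of the bank example, the sketch below simulates one sample path of $N(t)$. The text does not say how customers arrive; purely for simulation purposes we assume arrivals follow a Poisson process with a hypothetical rate of $12$ customers per hour.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# One sample path of the counting process N(t) on [9, 16].
# Assumption (not stated in the text): customers arrive according to a
# Poisson process with rate `lam` per hour, used only to have something
# concrete to simulate.
lam = 12.0                           # hypothetical arrival rate (per hour)
t, arrivals = 9.0, []
while True:
    t += rng.exponential(1.0 / lam)  # i.i.d. exponential inter-arrival times
    if t > 16.0:
        break
    arrivals.append(t)

arrivals = np.array(arrivals)

def N(t_query):
    """Number of customers who arrived in (9, t_query]."""
    return int(np.searchsorted(arrivals, t_query, side="right"))

# N(t) takes only integer values but is defined for every real t in [9, 16]:
print(N(10.25), N(12.0), N(16.0))
```

Note how $N(t)$ is defined for every real $t \in [9,16]$ even though it takes only integer values, matching the description above: a discrete-valued but continuous-time process.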

On the other hand, you can have a discrete-time random process. A discrete-time random process is a process

\begin{align}%\label{} \big\{X(t), t \in J \big\}, \end{align} where $J$ is a countable set. Since $J$ is countable, we can write $J=\{t_1,t_2,\cdots\}$. We usually define $X(t_n)=X(n)$ or $X(t_n)=X_n$, for $n=1,2,\cdots$ (the index values $n$ could be from any countable set such as $\mathbb{N}$ or $\mathbb{Z}$). Therefore, a discrete-time random process is just a sequence of random variables. For this reason, discrete-time random processes are sometimes referred to as random sequences. We can denote such a discrete-time process as \begin{align}%\label{} \big\{X(n), n=0,1,2,\dots\big\} \quad \textrm{ or } \quad \big\{X_n, n=0,1,2,\dots\big\}. \end{align} Or, if the process is defined for all integers, we may show the process by \begin{align} \big\{X(n), n \in \mathbb{Z}\big\} \quad \textrm{ or } \quad \big\{X_n, n \in \mathbb{Z}\big\}. \end{align}

Here is an example of a discrete-time random process. Suppose that we are observing customers who visit a bank starting at a given time. Let $X_n$, for $n \in \mathbb{N}$, be the amount of time the $n$th customer spends at the bank. This process consists of a countable number of random variables \begin{align}%\label{} X_1,X_2,X_3,\dots \end{align} Thus, we say that the process $\big\{X_n, n=1,2,3,\dots \big\}$ is a discrete-time random process.

Discrete-time processes are sometimes obtained from continuous-time processes by discretizing time (sampling at specific times). For example, if you record the temperature in New York City only once a day (say, at noon), then you can define a process \begin{align}%\label{} &X_1=T(12) &\textrm{(temperature at noon on day 1, $t=12$)}\\ &X_2=T(36) &\textrm{(temperature at noon on day 2, $t=12+24$)}\\ &X_3=T(60) &\textrm{(temperature at noon on day 3, $t=12+24+24$)}\\ &\vdots \end{align} In general, $X_n=T(t_n)$, where $t_n=24(n-1)+12$ for $n \in \mathbb{N}$. Here, $\big\{X_n, n=1,2,3,\dots\big\}$ is a discrete-time random process. Figure 10.2 shows a possible realization of this random process.
Figure 10.2 - Possible realization of the random process $\big\{X_n, n=1,2,3,\cdots \big\}$ where $X_n$ shows the temperature in New York City at noon on day $n$.
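Here is a rough sketch of this discretization idea in code, using a hypothetical sinusoid-plus-noise model for $T(t)$ (the text assumes no particular model; the amplitude, baseline, and noise level below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Toy continuous-time temperature model (purely illustrative): a daily
# sinusoid with a random phase, plus measurement noise.
phase = rng.uniform(0, 2 * np.pi)    # drawn once per realization

def T(t):
    return 15.0 + 10.0 * np.sin(2 * np.pi * t / 24.0 + phase) \
        + rng.normal(0.0, 1.0)

# Discretize: sample at noon each day, t_n = 24(n-1) + 12 for n = 1, 2, ...
n = np.arange(1, 8)                  # first week, n = 1, ..., 7
t_n = 24 * (n - 1) + 12
X = np.array([T(t) for t in t_n])    # X_n = T(t_n): a discrete-time process
print(np.round(X, 1))
```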

A continuous-time random process is a random process $\big\{X(t), t \in J \big\}$, where $J$ is an interval on the real line such as $[-1,1]$, $[0, \infty)$, $(-\infty,\infty)$, etc.

A discrete-time random process (or a random sequence) is a random process $\big\{X(n)=X_n, n \in J \big\}$, where $J$ is a countable set such as $\mathbb{N}$ or $\mathbb{Z}$.



Random Processes as Random Functions:

Consider a random process $\big\{X(t), t \in J \big\}$. This random process results from a random experiment, e.g., observing the stock prices of a company over a period of time. Remember that any random experiment is defined on a sample space $S$. After observing the values of $X(t)$, we obtain a function of time such as the one shown in Figure 10.1. The function shown in this figure is just one of the many possible outcomes of this random experiment. We call each of these possible functions a sample function or sample path of $X(t)$. It is also called a realization of $X(t)$.

From this point of view, a random process can be thought of as a random function of time. You are familiar with the concept of functions. The difference here is that $\big\{X(t), t \in J \big\}$ will be equal to one of many possible sample functions after we are done with our random experiment. In engineering applications, random processes are often referred to as random signals.

A random process is a random function of time.
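One way to internalize this viewpoint in code: performing the random experiment once hands you an ordinary, deterministic function of time. The linear-plus-sine model below is hypothetical, chosen only to make the point.

```python
import numpy as np

rng = np.random.default_rng(seed=3)

def draw_sample_path():
    """Run the random experiment once and return the resulting sample
    function -- a perfectly ordinary (deterministic) function of time.
    The model a + b*sin(t) is hypothetical, used only for illustration."""
    a = rng.normal(0.0, 1.0)
    b = rng.normal(0.0, 1.0)
    return lambda t: a + b * np.sin(t)

f = draw_sample_path()   # one realization: a fixed function of t
g = draw_sample_path()   # another realization: generally a different function
print(f(0.0), f(1.0))    # f is deterministic once drawn
print(g(0.0), g(1.0))
```

Before `draw_sample_path()` is called, which function you will get is random; afterward, you hold one fixed sample function.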


Example
You have $1000$ dollars to put in an account with interest rate $R$, compounded annually. That is, if $X_n$ is the value of the account at year $n$, then \begin{align}%\label{} X_n=1000(1+R)^n, \quad \textrm{ for }n=0,1,2,\cdots. \end{align} The value of $R$ is a random variable that is determined when you put the money in the bank, and it does not change after that. In particular, assume that $R \sim Uniform(0.04,0.05)$.
  1. Find all possible sample functions for the random process $\big\{X_n, n=0,1,2,... \big\}$.
  2. Find the expected value of your account at year three. That is, find $E[X_3]$.
  • Solution
      1. Here, the randomness in $X_n$ comes from the random variable $R$. As soon as you know $R$, you know the entire sequence $X_n$ for $n=0,1,2,\cdots$. In particular, if $R=r$, then \begin{align}%\label{} X_n=1000(1+r)^n, \quad \textrm{ for all }n \in \{0,1,2,\cdots\}. \end{align} Thus, here sample functions are of the form $f(n)=1000(1+r)^n$, $n=0,1,2,\cdots$, where $r \in [0.04,0.05]$. For any $r \in [0.04,0.05]$, you obtain a sample function for the random process $X_n$.
      2. The random variable $X_3$ is given by \begin{align}%\label{} X_3=1000(1+R)^3. \end{align} If you let $Y=1+R$, then $Y \sim Uniform(1.04,1.05)$, so \begin{equation} \nonumber f_Y(y) = \left\{ \begin{array}{l l} 100 & \quad 1.04 \leq y \leq 1.05 \\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} To obtain $E[X_3]$, we can write \begin{align}%\label{} E[X_3]&=1000 E[Y^3]\\ &=1000 \int_{1.04}^{1.05} 100 y^3 \quad \textrm{d}y \quad (\textrm{by LOTUS})\\ &=\frac{10^5}{4} \bigg[ y^4\bigg]_{1.04}^{1.05}\\ &=\frac{10^5}{4} \bigg[ (1.05)^4-(1.04)^4\bigg]\\ &\approx 1,141.2 \end{align}
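As a sanity check (not part of the original solution), a quick Monte Carlo estimate of $E[X_3]$ should land near the exact value computed above:

```python
import numpy as np

rng = np.random.default_rng(seed=4)

# Monte Carlo check of E[X_3] for X_n = 1000(1+R)^n, R ~ Uniform(0.04, 0.05).
R = rng.uniform(0.04, 0.05, size=1_000_000)
X3 = 1000 * (1 + R) ** 3
print(X3.mean())                          # close to the exact value below

# Exact value from the text: (10^5 / 4) * (1.05^4 - 1.04^4) ~= 1141.2
print(1e5 / 4 * (1.05 ** 4 - 1.04 ** 4))
```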


Example
Let $\big\{X(t), t \in [0,\infty) \big\}$ be defined as \begin{align}%\label{} X(t)=A+Bt, \quad \textrm{ for all }t \in [0,\infty), \end{align} where $A$ and $B$ are independent normal $N(1,1)$ random variables.
  1. Find all possible sample functions for this random process.
  2. Define the random variable $Y=X(1)$. Find the PDF of $Y$.
  3. Let also $Z=X(2)$. Find $E[YZ]$.
  • Solution
1. Here, we note that the randomness in $X(t)$ comes from the two random variables $A$ and $B$. The random variable $A$ can take any real value $a \in \mathbb{R}$. The random variable $B$ can also take any real value $b \in \mathbb{R}$. As soon as we know the values of $A$ and $B$, the entire process $X(t)$ is known. In particular, if $A=a$ and $B=b$, then \begin{align}%\label{} X(t)=a+bt, \quad \textrm{ for all }t \in [0,\infty). \end{align} Thus, here, sample functions are of the form $f(t)=a+bt$, $t \geq 0$, where $a,b \in \mathbb{R}$. For any $a,b \in \mathbb{R}$, you obtain a sample function for the random process $X(t)$.
      2. We have \begin{align}%\label{} Y=X(1)=A+B. \end{align} Since $A$ and $B$ are independent $N(1,1)$ random variables, $Y=A+B$ is also normal with \begin{align}%\label{} EY&=E[A+B]\\ &=E[A]+E[B]\\ &=1+1\\ &=2, \end{align} \begin{align}%\label{} \textrm{Var}(Y)&=\textrm{Var}(A+B)\\ &=\textrm{Var}(A)+\textrm{Var}(B) \quad (\textrm{since $A$ and $B$ are independent})\\ &=1+1\\ &=2. \end{align} Thus, we conclude that $Y \sim N(2, 2)$: \begin{align}%\label{} f_Y(y)=\frac{1}{\sqrt{4 \pi}} e^{-\frac{(y-2)^2}{4}}. \end{align}
      3. We have \begin{align}%\label{} E[YZ]&=E[(A+B)(A+2B)]\\ &=E[A^2+3AB+2B^2]\\ &=E[A^2]+3E[AB]+2E[B^2]\\ &=2+3E[A]E[B]+2\cdot2 \quad (\textrm{since $A$ and $B$ are independent})\\ &=9. \end{align}
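Again as an optional numerical check (a sketch, not part of the original solution), simulating $A$ and $B$ directly should reproduce $E[Y]=2$, $\textrm{Var}(Y)=2$, and $E[YZ]=9$:

```python
import numpy as np

rng = np.random.default_rng(seed=5)

n = 1_000_000
A = rng.normal(1.0, 1.0, size=n)    # N(1,1): mean 1, variance 1
B = rng.normal(1.0, 1.0, size=n)    # independent of A

Y = A + B                           # Y = X(1)
Z = A + 2 * B                       # Z = X(2)

print(Y.mean(), Y.var())            # should be close to 2 and 2
print((Y * Z).mean())               # should be close to E[YZ] = 9
```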


The random processes in the above examples were relatively simple in the sense that the randomness in the process originated from one or two random variables. We will see more complicated examples later on.

