7.3.0 End of Chapter Problems

Problem

Let $X_i$ be i.i.d. $Uniform(0,1)$ random variables. We define the sample mean as

\begin{align} M_n=\frac{X_1+X_2+\cdots+X_n}{n}. \end{align}
  1. Find $E[M_n]$ and $\mathrm{Var}(M_n)$ as a function of $n$.
  2. Using Chebyshev's inequality, find an upper bound on \begin{align} P\left(\left|M_n-\frac{1}{2}\right| \geq \frac{1}{100}\right). \end{align}
  3. Using your bound, show that \begin{align} \lim_{n \rightarrow \infty} P\left(\left|M_n-\frac{1}{2}\right| \geq \frac{1}{100}\right)=0. \end{align}
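
As a numerical sanity check, the following minimal sketch (using NumPy; the choices of $n$, the number of trials, and the random seed are arbitrary) estimates $E[M_n]$, $\mathrm{Var}(M_n)$, and the tail probability by simulation, and confirms that the empirical tail stays below the Chebyshev bound computed from the estimated variance:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, eps = 1000, 10_000, 0.01

# Each row is one experiment: n Uniform(0,1) samples averaged into M_n.
M = rng.uniform(0.0, 1.0, size=(trials, n)).mean(axis=1)

mean_est = M.mean()                       # compare with your E[M_n]
var_est = M.var()                         # compare with your Var(M_n)
tail_est = np.mean(np.abs(M - 0.5) >= eps)
cheb_bound = var_est / eps**2             # Chebyshev: P(|M_n - 1/2| >= eps) <= Var(M_n)/eps^2

print(f"E[M_n] ~ {mean_est:.4f}, Var(M_n) ~ {var_est:.2e}")
print(f"P(|M_n - 1/2| >= {eps}) ~ {tail_est:.3f} <= Chebyshev bound {cheb_bound:.3f}")
```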


Problem

The number of accidents in a certain city is modeled by a Poisson random variable with an average rate of $10$ accidents per day. Suppose that the numbers of accidents on different days are independent. Use the central limit theorem to find the probability that there will be more than $3800$ accidents in a certain year. Assume that there are $365$ days in a year.
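
To check your answer, note that the yearly total is itself $Poisson(3650)$, so a short Monte Carlo sketch (using NumPy; the sample size and seed are arbitrary) can be compared against the CLT approximation:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
lam_year = 365 * 10                       # mean number of accidents per year

# Monte Carlo: the yearly total is a sum of 365 i.i.d. Poisson(10)'s, i.e. Poisson(3650).
totals = rng.poisson(lam_year, size=1_000_000)
print("simulated P(total > 3800) ~", np.mean(totals > 3800))

# CLT: standardize 3800 against mean 3650 and variance 3650; 1 - Phi(z) = (1 - erf(z/sqrt(2)))/2.
z = (3800 - lam_year) / sqrt(lam_year)
print("CLT estimate:", 0.5 * (1 - erf(z / sqrt(2))))
```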




Problem

In a communication system, each codeword consists of $1000$ bits. Due to the noise, each bit may be received in error with probability $0.1$. It is assumed that bit errors occur independently. Since error-correcting codes are used in this system, each codeword can be decoded reliably if there are at most $125$ errors in the received codeword; otherwise, the decoding fails. Using the CLT, find the probability of decoding failure.
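
The number of errors per codeword is $Binomial(1000, 0.1)$, so a Monte Carlo sketch (using NumPy; sample size and seed are arbitrary) gives a reference value to compare with your CLT answer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Errors per codeword are Binomial(1000, 0.1); decoding fails when they exceed 125.
errors = rng.binomial(1000, 0.1, size=1_000_000)
print("P(decoding failure) ~", np.mean(errors > 125))
```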



Problem

$50$ students live in a dormitory. The parking lot has the capacity for $30$ cars. Each student has a car with probability $\frac{1}{2}$, independently of the other students. Use the CLT (with continuity correction) to find the probability that there won't be enough parking spaces for all the cars.
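
Since $n=50$ is small enough for exact arithmetic, the following sketch (standard library only) compares the exact $Binomial\left(50,\frac{1}{2}\right)$ tail with the continuity-corrected CLT value you should obtain:

```python
from math import comb, erf, sqrt

n, p, cap = 50, 0.5, 30
# Exact tail: parking runs out when more than 30 of the 50 students bring cars.
exact = sum(comb(n, k) for k in range(cap + 1, n + 1)) / 2**n

# CLT with continuity correction: P(N >= 31) ~ P(Z > (30.5 - mu)/sigma).
mu, sigma = n * p, sqrt(n * p * (1 - p))
z = (cap + 0.5 - mu) / sigma
clt = 0.5 * (1 - erf(z / sqrt(2)))

print(f"exact: {exact:.4f}   CLT with continuity correction: {clt:.4f}")
```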




Problem

The amount of time needed for a certain machine to process a job is a random variable with mean $EX_i=10$ minutes and $\mathrm{Var}(X_i)=2$ minutes$^{2}$. The times needed for different jobs are independent of each other. Find the probability that the machine processes at most $40$ jobs in $7$ hours.
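
The problem only fixes the mean and the variance of a job time, so a Monte Carlo check must pick a concrete distribution; the sketch below (using NumPy) assumes, purely for simulation, $Gamma$ job times with shape $50$ and scale $0.2$ (mean $10$, variance $2$). The CLT answer should be insensitive to this particular choice.

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 200_000

# "At most 40 jobs in 7 hours" is the event that the first 41 jobs take more
# than 420 minutes in total. Gamma(shape=50, scale=0.2) has mean 10, variance 2.
times = rng.gamma(50.0, 0.2, size=(trials, 41)).sum(axis=1)
print("P(at most 40 jobs in 420 minutes) ~", np.mean(times > 420))
```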




Problem

You have a fair coin. You toss the coin $n$ times. Let $X$ be the proportion of tosses on which you observe heads. How large does $n$ have to be so that you are $95\%$ sure that $0.45 \leq X \leq 0.55$? In other words, how large does $n$ have to be so that

\begin{equation} P\big(0.45 \leq X \leq 0.55\big) \geq 0.95 \qquad ? \end{equation}
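
A numerical scan you can use to check your answer (a minimal sketch, standard library only): for each $n$, the CLT approximates $P(0.45 \leq X \leq 0.55)$ by a normal probability, and we look for the first $n$ where it reaches $0.95$.

```python
from math import erf, sqrt

def clt_coverage(n, p=0.5, half_width=0.05):
    """CLT approximation of P(|X - p| <= half_width) for the sample proportion X."""
    sigma = sqrt(p * (1 - p) / n)             # standard deviation of X
    return erf(half_width / sigma / sqrt(2))  # P(|Z| <= z) = erf(z / sqrt(2))

n = 1
while clt_coverage(n) < 0.95:
    n += 1
print("smallest n with CLT coverage >= 0.95:", n)
```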


Problem

An engineer is measuring a quantity $q$. It is assumed that there is a random error in each measurement, so the engineer will take $n$ measurements and report the average of the measurements as the estimated value of $q$. Specifically, if $Y_i$ is the value obtained in the $i$th measurement, we assume that

\begin{align} Y_i=q+X_i, \end{align} where $X_i$ is the error in the $i$th measurement. We assume that the $X_i$'s are i.i.d. with $EX_i=0$ and $\mathrm{Var}(X_i)=4$ units. The engineer reports the average of the measurements, \begin{align} M_n=\frac{Y_1+Y_2+\cdots+Y_n}{n}. \end{align} How many measurements does the engineer need to make until he is $95\%$ sure that the final error is at most $0.1$ units? In other words, what should the value of $n$ be such that \begin{equation} P\big(q-0.1 \leq M_n \leq q+0.1\big) \geq 0.95 \qquad ? \end{equation}
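
For a numerical check of candidate values of $n$, here is a simulation sketch (using NumPy). Only the mean and variance of the errors are specified, so purely for simulation we take the errors to be normal; note that $q$ cancels, because $M_n - q$ is just the average of the errors.

```python
import numpy as np

rng = np.random.default_rng(0)

def coverage(n, trials=200_000):
    # Under the normal assumption, M_n - q is exactly N(0, 4/n), so we can
    # sample the final error directly instead of averaging n values.
    final_error = rng.normal(0.0, 2.0 / np.sqrt(n), size=trials)
    return np.mean(np.abs(final_error) <= 0.1)

for n in (500, 1000, 1600, 2000, 8000):   # candidate values to probe
    print(n, coverage(n))
```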

Problem

Let $X_2$, $X_3$, $X_4$, $\cdots$ be a sequence of random variables such that

\begin{equation} \nonumber F_{X_n}(x) = \left\{ \begin{array}{l l} \frac{e^{n(x-1)}}{1+e^{n(x-1)}} & \quad x>0\\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} Show that $X_n$ converges in distribution to $X=1$.
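
For intuition, the following sketch (using NumPy; the grid of points and the values of $n$ are arbitrary choices) tabulates $F_{X_n}(x)$ for growing $n$. It approaches the CDF of the constant $1$ at every $x \neq 1$; at $x=1$ we get $F_{X_n}(1)=\frac{1}{2}$ for all $n$, but $x=1$ is not a continuity point of the limit CDF, so this causes no trouble.

```python
import numpy as np

def F(n, x):
    # F_{X_n}(x) = e^{n(x-1)} / (1 + e^{n(x-1)}) for x > 0, written in the
    # numerically stable sigmoid form; clip the exponent to avoid overflow.
    t = np.clip(n * (x - 1.0), -700, 700)
    return np.where(x > 0, 1.0 / (1.0 + np.exp(-t)), 0.0)

xs = np.array([0.5, 0.9, 0.99, 1.0, 1.01, 1.5])
for n in (10, 100, 10_000):
    print(n, np.round(F(n, xs), 4))
```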

Problem

Let $X_2$, $X_3$, $X_4$, $\cdots$ be a sequence of random variables such that

\begin{equation} \nonumber F_{X_n}(x) = \left\{ \begin{array}{l l} \frac{e^{nx}+xe^n}{e^{nx}+ \left(\frac{n+1}{n}\right) e^n} & \quad 0 \leq x \leq 1 \\ & \quad \\ \frac{e^{nx}+e^n}{e^{nx}+ \left(\frac{n+1}{n}\right) e^n} & \quad x > 1 \end{array} \right. \end{equation} Show that $X_n$ converges in distribution to $Uniform(0,1)$.

Problem

Consider a sequence $\{X_n, n=1,2,3, \cdots \}$ such that

\begin{equation} \nonumber X_n = \left\{ \begin{array}{l l} n & \quad \textrm{with probability } \frac{1}{n^2} \\ & \quad \\ 0 & \quad \textrm{with probability } 1-\frac{1}{n^2} \end{array} \right. \end{equation} Show that
  1. $X_n \ \xrightarrow{p}\ 0$.
  2. $ X_n \ \xrightarrow{L^r}\ 0$, for $r<2$.
  3. $X_n$ does not converge to $0$ in the $r$th mean for any $r \geq 2$.
  4. $ X_n \ \xrightarrow{a.s.}\ 0$.



Problem

We perform the following random experiment. We put $n \geq 10$ blue balls and $n$ red balls in a bag. We pick $10$ balls at random (without replacement) from the bag. Let $X_n$ be the number of blue balls. We perform this experiment for $n=10, 11, 12, \cdots$. Prove that $X_n \ \xrightarrow{d}\ Binomial\left(10,\frac{1}{2}\right)$.
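
A simulation sketch of the experiment (using NumPy; sample sizes and seed are arbitrary) that compares the empirical PMF of $X_n$ with the $Binomial\left(10,\frac{1}{2}\right)$ PMF as $n$ grows:

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
binom_pmf = np.array([comb(10, k) / 2**10 for k in range(11)])

for n in (10, 50, 1000):
    # Generator.hypergeometric(ngood, nbad, nsample): blue balls among the 10 drawn.
    draws = rng.hypergeometric(n, n, 10, size=200_000)
    emp = np.bincount(draws, minlength=11) / draws.size
    print(n, "max |empirical - binomial| =", float(np.abs(emp - binom_pmf).max()))
```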




Problem

Find two sequences of random variables $\{X_n, n=1,2,\cdots \}$ and $\{Y_n, n=1,2,\cdots \}$ such that

\begin{align} &X_n \ \xrightarrow{d}\ X, \\ & \qquad \textrm{and}\\ &Y_n \ \xrightarrow{d}\ Y, \end{align} but $X_n+Y_n$ does not converge in distribution to $X+Y$.


Problem

Let $X_1$, $X_2$, $X_3$, $\cdots$ be a sequence of continuous random variables such that

\begin{equation} \nonumber f_{X_n}(x) = \frac{n}{2} e^{-n|x|}. \end{equation} Show that $X_n$ converges in probability to $0$.
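
Note that $f_{X_n}$ is the $Laplace\left(0,\frac{1}{n}\right)$ density, so a quick simulation sketch (using NumPy; the value of $\epsilon$, sample sizes, and seed are arbitrary choices) shows $P(|X_n| \geq \epsilon)$ shrinking with $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.05
for n in (10, 100, 1000):
    # f_{X_n} is the Laplace density with location 0 and scale 1/n.
    x = rng.laplace(0.0, 1.0 / n, size=200_000)
    print(n, "P(|X_n| >= eps) ~", np.mean(np.abs(x) >= eps))
```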

Problem

Let $X_1$, $X_2$, $X_3$, $\cdots$ be a sequence of continuous random variables such that

\begin{equation} \nonumber f_{X_n}(x) = \left\{ \begin{array}{l l} \frac{1}{nx^2} & \quad x>\frac{1}{n}\\ & \quad \\ 0 & \quad \text{otherwise} \end{array} \right. \end{equation} Show that $X_n$ converges in probability to $0$.

Problem

Let $Y_1$, $Y_2$, $Y_3$, $\cdots$ be a sequence of i.i.d. random variables with mean $EY_i=\mu$ and finite variance $\mathrm{Var}(Y_i)=\sigma^2$. Define the sequence $\{X_n, n=2,3,...\}$ as

\begin{equation} X_n=\frac{Y_1 Y_2+ Y_2 Y_3+ \cdots + Y_{n-1}Y_n+ Y_n Y_1}{n}, \qquad \textrm{ for }n=2,3,\cdots. \end{equation} Show that $ X_n \ \xrightarrow{p}\ \mu^2$.
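
To see the claim numerically, the following sketch (using NumPy) picks a concrete distribution for the $Y_i$'s, here $Exponential$ with mean $\mu=2$, so the target is $\mu^2=4$; any i.i.d. choice with finite variance would do.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0
for n in (10, 100, 10_000):
    y = rng.exponential(mu, size=n)
    # y * roll(y, -1) gives Y_1Y_2, Y_2Y_3, ..., Y_{n-1}Y_n, and wraps to Y_nY_1.
    x_n = np.sum(y * np.roll(y, -1)) / n
    print(n, x_n, "target:", mu**2)
```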


Problem

Let $Y_1$, $Y_2$, $Y_3$, $\cdots$ be a sequence of positive i.i.d. random variables with $0 < E[\ln Y_i]=\gamma<\infty$. Define the sequence $\{X_n, n=1,2,3,...\}$ as

\begin{equation} X_n=(Y_1 Y_2 Y_3 \cdots Y_{n-1}Y_n)^{\frac{1}{n}}, \qquad \textrm{ for }n=1,2,3,\cdots. \end{equation} Show that $ X_n \ \xrightarrow{p}\ e^{\gamma}$.
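
A simulation sketch (using NumPy), taking $Y_i$ lognormal with $\ln Y_i \sim N(\gamma, 1)$ so that $E[\ln Y_i]=\gamma$; this is an illustrative choice, and working on the log scale avoids overflow in the raw product.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 1.0
for n in (10, 1000, 100_000):
    y = rng.lognormal(gamma, 1.0, size=n)      # ln(Y_i) ~ N(gamma, 1)
    x_n = np.exp(np.mean(np.log(y)))           # (Y_1 Y_2 ... Y_n)^(1/n)
    print(n, x_n, "target:", np.exp(gamma))
```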


Problem

Let $X_1$, $X_2$, $X_3$, $\cdots$ be a sequence of random variables such that

\begin{align} X_n \sim Poisson(n\lambda), \qquad \textrm{ for }n=1,2,3,\cdots, \end{align} where $\lambda>0$ is a constant. Define a new sequence $Y_n$ as \begin{align} Y_n = \frac{1}{n} X_n, \qquad \textrm{ for }n=1,2,3,\cdots. \end{align} Show that $Y_n$ converges in mean square to $\lambda$, i.e., $ Y_n \ \xrightarrow{m.s.}\ \lambda$.
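
A numerical sketch of the mean-square convergence (using NumPy; $\lambda=2$, the sample sizes, and the seed are arbitrary choices): estimate $E\big[(Y_n-\lambda)^2\big]$ by simulation for growing $n$ and watch it decrease toward $0$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0
for n in (10, 100, 1000):
    y_n = rng.poisson(n * lam, size=100_000) / n   # Y_n = X_n / n with X_n ~ Poisson(n*lam)
    print(n, "E[(Y_n - lam)^2] ~", np.mean((y_n - lam) ** 2))
```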


Problem

Let $\{X_n, n=1,2,\cdots \}$ and $\{Y_n, n=1,2,\cdots \}$ be two sequences of random variables, defined on the sample space $S$. Suppose that we know

\begin{align} &X_n \ \xrightarrow{L^r}\ X,\\ &Y_n \ \xrightarrow{L^r}\ Y. \end{align} Prove that $X_n+Y_n \ \xrightarrow{L^r}\ X+Y$. Hint: You may want to use Minkowski's inequality, which states that for two random variables $X$ and $Y$ with finite moments, and $1 \leq p < \infty$, we have \begin{align} \Big(E\big[|X+Y|^p\big]\Big)^{\frac{1}{p}} \leq \Big(E\big[|X|^p\big]\Big)^{\frac{1}{p}}+\Big(E\big[|Y|^p\big]\Big)^{\frac{1}{p}}. \end{align}


Problem

Let $X_1$, $X_2$, $X_3$, $\cdots$ be a sequence of random variables such that $X_n \sim Rayleigh\left(\frac{1}{n}\right)$, i.e.,

\begin{equation} \nonumber f_{X_n}(x)= \left\{ \begin{array}{l l} n^2x \exp\left\{-\frac{n^2x^2}{2}\right\} & \quad x>0\\ 0 & \quad \textrm{otherwise} \end{array} \right. \end{equation} Show that $X_n \ \xrightarrow{a.s.}\ 0$.


Problem

Let $Y_1$, $Y_2$, $\cdots$ be independent random variables, where $Y_n \sim Bernoulli\left(\frac{n}{n+1}\right)$ for $n=1,2,3, \cdots$. We define the sequence $\{X_n, n=2,3,4, \cdots \}$ as

\begin{equation} X_{n+1}=Y_1 Y_2 Y_3 \cdots Y_n, \qquad \textrm{ for }n=1,2,3,\cdots. \end{equation} Show that $ X_n \ \xrightarrow{a.s.}\ 0$.

