# 5.A: Appendix- Mathematics of Random Processes


This appendix presents the essential mathematics required to describe the statistical properties of a random process such as noise. These tools are also needed for circuit analysis with digitally modulated signals, which must be treated as random processes characterized by higher-order statistics. Several statistical terms are introduced to describe the properties of a random variable $$X$$ and how its value at one time is related to its value at other times. In general $$X$$ varies with time (i.e., it can be written as $$X(t)$$) and its value at any particular time is random.

This section presents the probability metrics required to describe noise, interference, and digitally modulated signals. A random process in time $$t$$ is a family of random variables $$\{X(t), t ∈ T \}$$, where the index $$t$$ ranges over the time interval $$T$$. The probability that $$X(t)$$ has a value less than or equal to $$x_{1}$$ is denoted by $$P\{X(t) ≤ x_{1}\}$$. This is called the cumulative distribution function (CDF), and sometimes just the distribution function (DF):

$\label{eq:1}F_{X}(x_{1})=P\{X\leq x_{1}\}$

In general, since $$X(t)$$ varies with time, the CDF required to handle noise, that is, random voltages and currents, depends on time. So the CDF used with noise and interference in communications has two arguments, one for the value and one for the time:

$\label{eq:2}F_{X}(x_{1};t_{1})=P\{X(t_{1})\leq x_{1}\}$

Note that $$F_{X}(∞; t)=1$$ and $$F_{X}(−∞; t)=0$$. $$F_{X}$$ is being used with two arguments, as it is necessary to indicate the time at which $$X$$ is evaluated. $$F_{X}(x_{1};t_{1})$$ is said to be the first-order distribution of $$X(t)$$.

Another probability metric often used is the probability density function (PDF), $$f$$, which is related to the CDF as

$\label{eq:3}F_{X}(x_{1};t_{1})=\int_{-\infty}^{x_{1}}f(x,t_{1}) dx$
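The relationship above between the CDF and the PDF can be checked numerically: integrating a PDF up to $$x_{1}$$ should reproduce $$F_{X}(x_{1})$$. A minimal sketch, assuming a standard Gaussian PDF; the helper names `gaussian_pdf` and `cdf_from_pdf` are hypothetical:

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    # f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / (sqrt(2 pi) sigma)
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def cdf_from_pdf(x1, pdf, lo=-10.0, n=20000):
    # Trapezoidal approximation of F(x1) = integral_{-inf}^{x1} f(x) dx,
    # with the lower limit truncated at `lo` where the tail is negligible.
    h = (x1 - lo) / n
    total = 0.5 * (pdf(lo) + pdf(x1))
    for k in range(1, n):
        total += pdf(lo + k * h)
    return total * h

print(cdf_from_pdf(0.0, gaussian_pdf))   # near 0.5 by symmetry of the Gaussian
```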

Another property that needs to be captured is the relationship between the value of the random variable at one time, $$t_{1}$$, and its value at another time, $$t_{2}$$. If the process were completely random there would be no relationship. The statistical relationship of $$X$$ at $$t_{1}$$ and at $$t_{2}$$ is described by the joint CDF, $$F_{X}(x_{1}, x_{2};t_{1}, t_{2})$$, which is the second-order distribution of the random process:

$\label{eq:4}F_{X}(x_{1}, x_{2};t_{1}, t_{2}) = P\{X(t_{1}) ≤ x_{1}, X(t_{2}) ≤ x_{2}\}$

This is the joint CDF, or probability, that $$X(t_{1})$$ will be less than $$x_{1}$$ at time $$t_{1}$$ and also that $$X(t_{2})$$ will be less than $$x_{2}$$ at time $$t_{2}$$. So in the case of noise, the joint CDF describes the correlation of noise at two different times. In general, the $$n$$th order distribution is

$\label{eq:5}F_{X}(x_{1},\ldots ,x_{n};t_{1},\ldots ,t_{n}) = P\{X(t_{1}) ≤ x_{1},\ldots ,X(t_{n}) ≤ x_{n}\}$

If the random process (i.e., $$X$$) is discrete, then the probability mass function (PMF) is used for the probability that $$X$$ has a particular value. The general form of the PMF is

$\label{eq:6}p_{X}(x_{1},\ldots ,x_{n};t_{1},\ldots ,t_{n}) = P\{X(t_{1}) = x_{1},\ldots ,X(t_{n}) = x_{n}\}$

and the general form of the PDF, used with continuous random variables, is (from Equation $$\eqref{eq:3}$$)

$\label{eq:7}f_{X}(x_{1},\ldots ,x_{n};t_{1},\ldots ,t_{n})=\frac{\partial^{n}F_{X}(x_{1},\ldots ,x_{n};t_{1},\ldots ,t_{n})}{\partial x_{1}\ldots \partial x_{n}}$

Figure $$\PageIndex{1}$$: Gaussian distribution.

The statistical measures above describe the properties of random variables and capture the way a variable is related to itself at different times, and how two different random processes are related to each other at the same time and at different times. Such characterizations are based on the expected value of a random variable. The expectation of a random variable $$X(t)$$ is the weighted average of all possible values of the random variable. The weighting is the probability for a discrete random variable, and the probability density for a continuous random variable. So the expected value of the random variable $$X(t)$$ is

$\label{eq:8}E[X(t)]=\left\{\begin{array}{ll}{\sum_{x}x\: p_{X}(x,t)}&{\text{for a discrete random variable}}\\{\int_{-\infty}^{\infty}x\: f_{X}(x,t)\: dx}&{\text{for a continuous random variable}}\end{array}\right.$

This is just the mean of $$X(t)$$ defined as

$\label{eq:9}\mu_{X}=\overline{X}(t)=\langle X(t)\rangle =E[X(t)]$

The mean is also called the first-order moment of $$X(t)$$. $$E[\:\: ]$$ denotes the expected value of a random variable, and the term is synonymous with the expectation, mathematical expectation, mean, and first moment of a random variable. The angle brackets $$\langle\:\:\rangle$$ are a compact way of specifying the expectation. In general a computer program is needed to calculate the expected value, although for some assumed probability distributions there are analytic solutions for $$E[\:\: ]$$.
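As a minimal numerical illustration of the discrete form of the expectation, the sketch below uses an assumed fair six-sided die as the PMF (an illustrative example, not from the text):

```python
# Expected value of a discrete random variable: E[X] = sum_x x * p_X(x)
# The fair-die PMF below is an assumed example.
die_pmf = {x: 1.0 / 6.0 for x in range(1, 7)}

def expected_value(pmf):
    # Weighted average of all possible values, weighted by probability
    return sum(x * p for x, p in pmf.items())

print(expected_value(die_pmf))  # 3.5
```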

The $$n$$th-order moment of $$X(t)$$ is just the expected value of the $$n$$th power of $$X(t)$$:

$\label{eq:10}\mu_{n}'=E[X^{n}(t)]=\langle X^{n}(t)\rangle =\left\{\begin{array}{ll}{\sum_{x}x^{n}\: p_{X}(x,t)}&{\text{for a discrete}}\\{}&{\text{random variable}}\\{\int_{-\infty}^{\infty}x^{n}\: f_{X}(x,t)\: dx}&{\text{for a continuous}}\\{}&{\text{random variable}}\end{array}\right.$

Thus the second moment, sometimes called the second raw moment, is $$\mu_{2}′=\langle X^{2}(t)\rangle = E[X^{2}(t)]$$. A more useful quantity for characterizing the statistics of a signal is the second central moment, which is the second moment about the mean. The second central moment of a random variable is also called its variance, written as $$\sigma^{2}$$ or as $$\mu_{2}$$:

\begin{align}\sigma^{2}=\mu_{2}&= E[(X −\mu)^{2}] = E[X^{2} − 2\mu X + \mu^{2}] = E[X^{2}] − 2\mu E[X] + \mu^{2}\nonumber \\&= E[X^{2}] − 2\mu^{2} + \mu^{2} = E[X^{2}] − \mu^{2} = E[X^{2}] − (E[X])^{2}\nonumber \\ \label{eq:11}&=\langle X^{2}(t)\rangle −\mu^{2} =\langle X^{2}(t)\rangle −\langle X(t)\rangle^{2}\end{align}

The variance is a measure of the dispersion of a random variable (i.e., how much the random variable is spread out). Variance is one of several measures of dispersion, but it is the preferred measure when working with noise and digitally modulated signals. The standard deviation, $$\sigma$$, is the square root of the variance $$\sigma^{2}$$. It is also common to denote the variance of $$X$$ as $$\sigma_{X}^{2}$$, and in general, the variance can be a function of time, $$\sigma^{2}(t)$$.
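The chain of equalities above, ending in $$\sigma^{2}=E[X^{2}]-(E[X])^{2}$$, can be verified numerically for any discrete distribution. A sketch using an assumed three-point PMF:

```python
def moments(pmf):
    mean = sum(x * p for x, p in pmf.items())                       # E[X]
    second = sum(x * x * p for x, p in pmf.items())                 # E[X^2]
    var_central = sum((x - mean) ** 2 * p for x, p in pmf.items())  # E[(X - mu)^2]
    var_identity = second - mean ** 2                               # E[X^2] - (E[X])^2
    return var_central, var_identity

# Assumed illustrative PMF
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
c, i = moments(pmf)
print(c, i)  # the two forms of the variance agree
```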

It is common to approximate the statistical distribution of a digitally modulated signal as a Gaussian or normal distribution. This distribution is shown in Figure $$\PageIndex{1}$$ and is mathematically described by its probability density function

$\label{eq:12}p[X(t)]=\frac{1}{\sqrt{2\pi}\sigma}e^{-(X(t)-\mu)^{2}/(2\sigma^{2})}$

where the mean of the distribution is $$\mu$$ and the variance is $$\sigma^{2}$$. The third- and higher-order cumulants of the Gaussian distribution are zero, so its statistics are fully determined by the mean and variance. That is one of the reasons why this distribution is so commonly used as an approximation: analysis using the Gaussian distribution is much simpler than it would be for other distributions. More realistically, a digitally modulated signal has $$I$$ and $$Q$$ components, and the distribution of each of these should be approximated as a Gaussian distribution. Such a distribution is called a complex Gaussian distribution, and analysis of the distortion produced by an amplifier using the complex Gaussian distribution is more accurate. Using a more sophisticated distribution, e.g., one using the moments calculated from the actual digitally modulated signal, provides even greater accuracy in analysis [43], but the complexity is then beyond manual calculation.
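The Gaussian approximation can be explored by sampling. The sketch below, using Python's `random.gauss` with assumed values $$\mu=2$$ and $$\sigma=0.5$$, estimates the mean, variance, and third central moment (the last is near zero by the symmetry of the distribution):

```python
import random

random.seed(1)                 # fixed seed so the run is repeatable
mu, sigma = 2.0, 0.5           # assumed illustrative parameters
samples = [random.gauss(mu, sigma) for _ in range(200_000)]

n = len(samples)
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n    # should be near sigma^2 = 0.25
third = sum((s - mean) ** 3 for s in samples) / n  # near zero for a symmetric PDF
print(mean, var, third)
```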

If a digitally modulated signal were completely random, there would be no correlation between its value at one time and its value at another time. However, there is a relationship, and it is described by the signal's autocorrelation function, which relates the values of a function separated by different instants of time. For a random variable it is given by

$\label{eq:13}R_{X}(t_{1},t_{2})=E[X(t_{1})X(t_{2})]$

When the random variable is discrete, the one-dimensional autocorrelation function of a random sequence of length $$N$$ is expressed as

$\label{eq:14}R_{X}(i)=\sum_{j=0}^{N-1-i}x_{j}x_{j+i}$

where $$i$$ is the so-called lag parameter. The autocovariance function of $$X(t)$$ is given by

\begin{align}K_{X}(t_{1},t_{2})&=E[\{X(t_{1}) −\mu_{X}(t_{1})\}\{X(t_{2}) − \mu_{X}(t_{2})\}]\nonumber \\ \label{eq:15}&=R_{X}(t_{1},t_{2})-\mu_{X}(t_{1})\mu_{X}(t_{2})\end{align}

while the variance is given by

$\label{eq:16}\sigma_{X}^{2}(t)= E[\{X(t) −\mu_{X}(t)\}^{2}] = K_{X}(t, t)$
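The discrete autocorrelation sum can be sketched directly. The code below assumes one common convention for a finite-length record, truncating terms where $$j+i$$ would run past the end of the sequence:

```python
def autocorrelation(x, i):
    # R_X(i) = sum_j x[j] * x[j+i] over the lags that fit in the record
    # (truncated-sum convention for a finite-length sequence; an assumption)
    return sum(x[j] * x[j + i] for j in range(len(x) - i))

# Assumed alternating test sequence: strong negative correlation at odd lags
x = [1.0, -1.0, 1.0, -1.0]
print([autocorrelation(x, i) for i in range(4)])  # [4.0, -3.0, 2.0, -1.0]
```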

A random process $$X(t)$$ is stationary in the strict sense if

$\label{eq:17}F_{X}(x_{1},\ldots ,x_{n};t_{1},\ldots ,t_{n}) = F_{X}(x_{1},\ldots ,x_{n};t_{1} +\tau ,\ldots , t_{n} +\tau )$

$$\forall t_{i} ∈ T$$, $$i ∈ \mathbf{N}$$, and all time shifts $$\tau$$. A random process is wide-sense stationary (WSS) if its first and second moments (i.e., the mean and autocorrelation) are independent of the absolute time $$t$$ and depend only on the time interval $$\tau$$. More precisely, if a random process is WSS, then

\begin{align} E[X(t)]&=\mu\quad \text{(i.e., its mean is a constant)}\nonumber \\ \label{eq:18} \text{and}\:\:R_{X}(t,s)&=E[X(t)X(s)]=R_{X}(|s-t|)=R_{X}(\tau )\end{align}

Note that the autocorrelation of a WSS process is dependent only on the time difference $$\tau$$. A random process that is not stationary to any order is nonstationary (i.e., its moments are explicitly dependent on time).
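The WSS property can be illustrated with an ensemble estimate of $$R_{X}(t,s)$$ for an i.i.d. (and hence WSS) Gaussian sequence: pairs of times with the same separation should give nearly the same value. A sketch under these assumptions:

```python
import random

random.seed(7)
n_real, n_time = 5000, 16
# Ensemble of realizations of an i.i.d. (hence WSS) zero-mean, unit-variance process
ens = [[random.gauss(0.0, 1.0) for _ in range(n_time)] for _ in range(n_real)]

def R(t, s):
    # Ensemble estimate of the autocorrelation E[X(t) X(s)]
    return sum(r[t] * r[s] for r in ens) / n_real

print(R(0, 0), R(5, 5))    # both near sigma^2 = 1 (same lag tau = 0)
print(R(0, 3), R(8, 11))   # both near 0 for i.i.d. samples (same lag tau = 3)
```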

The discussion now returns to the properties of a Gaussian random process. A Gaussian random process is a continuous random process with PDF of the form

$\label{eq:19}f_{X}(x,t)=\frac{1}{\sqrt{2\pi\sigma^{2}(t)}}\text{exp}\left(\frac{-(x-\mu (t))^{2}}{2\sigma^{2}(t)}\right)$

where $$\mu (t)$$ is the mean of the random process and $$\sigma^{2} (t)$$ is its variance. A standard normal random process is the special case of a Gaussian random process having a mean of zero and a variance of unity. A Poisson random process is a discrete random process with parameter $$\lambda (t) > 0$$ and has a PMF given by

$\label{eq:20}p_{X}(k)=P(X(t)=k)=e^{-\lambda t}\frac{(\lambda t)^{k}}{k!}$

where $$\lambda (t)$$ is generally time dependent. The mean and variance of a Poisson random process are both equal to $$\lambda (t)$$. So, for a Poisson random process,

\begin{align}\label{eq:21} \mu_{X}&=E[X(t)]=\lambda (t) \\ \label{eq:22}\sigma_{X}^{2}&=\text{Var}(X(t))=\lambda (t)\end{align}
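The equality of the Poisson mean and variance can be checked directly from the PMF. In the sketch below the product $$\lambda t$$ is folded into a single assumed parameter `lam`, and the infinite sum is truncated where the tail mass is negligible:

```python
import math

def poisson_pmf(k, lam):
    # P(X = k) = e^{-lam} * lam^k / k!   (lam stands in for lambda*t)
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 3.0                      # assumed illustrative parameter
ks = range(60)                 # truncation: tail mass beyond k = 59 is negligible
total = sum(poisson_pmf(k, lam) for k in ks)
mean = sum(k * poisson_pmf(k, lam) for k in ks)
var = sum((k - mean) ** 2 * poisson_pmf(k, lam) for k in ks)
print(total, mean, var)        # probabilities sum to 1; mean and variance both equal lam
```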

The statistical measures above are those necessary to statistically describe digitally modulated signals and also describe most noise processes.

This page titled 5.A: Appendix- Mathematics of Random Processes is shared under a CC BY-NC license and was authored, remixed, and/or curated by Michael Steer.