
4.3: Stationarity

Franz S. Hover & Michael S. Triantafyllou (Massachusetts Institute of Technology, via MIT OpenCourseWare)

A stationary random process is one whose ensemble statistics do not depend on time. Intuitively, this means that if we were to sample many realizations of the process, at the same time within each realization, and compute statistics of this data set, we would find no dependence of the statistics on the time of the samples. Aircraft engine noise is a stationary process in level flight, whereas the sound of live human voices is not. For a stationary process, \(m(t) = m\), i.e., the ensemble mean has no dependence on time. The same is true for the other statistics: \(V(t) = R(t, \, 0) = V\), and \(R(t, \, \tau) = R(\tau)\). Formally, a (strictly) stationary process has all of its ensemble statistics independent of time; requiring only that the mean, variance, and autocorrelation functions be independent of time defines a (weaker) second-order stationary process.

    Here is an example: \(y_i(t) = a \cos (\omega_o t + \theta_i)\), where \(\theta_i\) is a random variable, distributed uniformly in the range \([0, 2\pi]\). Is this process stationary? We have to show that all three of the ensemble statistics are independent of time:

\begin{align*} E(y(t)) &= \dfrac{1}{2 \pi} \int\limits_{0}^{2 \pi} a \cos (\omega_o t + \theta) \, d\theta = 0 \\[4pt] R(t, \, \tau) &= E(y(t) \, y(t + \tau)) \\[4pt] &= \dfrac{1}{2 \pi} \int\limits_{0}^{2 \pi} a^2 \cos (\omega_o t + \theta) \cos (\omega_o (t + \tau) + \theta) \, d\theta \\[4pt] &= \dfrac{a^2}{4 \pi} \int\limits_{0}^{2 \pi} \left[ \cos (\omega_o \tau) + \cos (2 \omega_o t + \omega_o \tau + 2 \theta) \right] d\theta \\[4pt] &= \dfrac{1}{2} a^2 \cos (\omega_o \tau) \\[4pt] V(t) &= R(t, \, 0) = \dfrac{1}{2} a^2. \end{align*}

    Thus the process is second-order stationary.
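The ensemble statistics above can be checked by Monte Carlo: draw many independent phases, and verify that the sample mean and sample autocorrelation do not depend on the sample time \(t\). This is an illustrative sketch; the values of \(a\), \(\omega_o\), and \(\tau\) below are arbitrary choices.

```python
import numpy as np

# Monte Carlo check that y_i(t) = a*cos(w0*t + theta_i), with theta_i
# uniform on [0, 2*pi], has ensemble statistics independent of t.
rng = np.random.default_rng(0)
a, w0 = 2.0, 1.5
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)  # one phase per ensemble member

def ensemble_mean(t):
    # E[y(t)] estimated by averaging across the ensemble at fixed time t
    return np.mean(a * np.cos(w0 * t + theta))

def ensemble_autocorr(t, tau):
    # R(t, tau) = E[y(t) y(t + tau)] estimated across the ensemble
    return np.mean(a * np.cos(w0 * t + theta) * a * np.cos(w0 * (t + tau) + theta))

tau = 0.7
for t in (0.0, 3.0, 10.0):
    # Mean stays near 0 and R stays near (a**2/2)*cos(w0*tau) at every t
    print(ensemble_mean(t), ensemble_autocorr(t, tau))
print(0.5 * a**2 * np.cos(w0 * tau))  # theoretical R(tau)
```

With 200,000 ensemble members the sampling scatter is of order \(10^{-3}\), so the time-independence of the statistics is visible directly in the printed values.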

As noted above, the ensemble statistics of a stationary process are not necessarily the same as the time averages. A very simple example of this is a coin toss, in which heads selects the signal \(x_1(t) = 1\) and tails selects \(x_2(t) = 2\). Clearly the time average of \(x_1(t)\) is one, but the ensemble mean at any time is \(E(x(t_o)) = 1.5\). This difference occurs here even though the process is obviously stationary.
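A minimal numerical sketch of the coin-toss example: each realization is constant in time (1 or 2), so its time average can never equal the ensemble mean of 1.5.

```python
import numpy as np

# Coin-toss process: heads gives x1(t) = 1, tails gives x2(t) = 2, for all t.
rng = np.random.default_rng(1)
n_members = 100_000
values = rng.choice([1.0, 2.0], size=n_members)  # one toss per ensemble member

ensemble_mean = values.mean()        # across the ensemble at any fixed t: ~1.5
time_avg_of_one_member = values[0]   # a single realization's time average: 1.0 or 2.0
print(ensemble_mean, time_avg_of_one_member)
```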

    When the ensemble statistics and the time averages are the same, we say that the process is ergodic. Continuing our example above, let us calculate now the time averages:

    \begin{align*} m(y_i(t)) &= \lim_{T \to \infty} \dfrac{1}{T} \int\limits_{0}^{T} a \cos (\omega_o t + \theta_i) \, dt \\[4pt] &= \lim_{T \to \infty} \dfrac{1}{T} \, a \, \dfrac{1}{\omega_o} \sin (\omega_o t + \theta_i) |_0^T \\[4pt] &= 0; \\[4pt] R^t (\tau) &= \lim_{T \to \infty} \dfrac{1}{T} \int\limits_{0}^{T} a^2 \cos (\omega_o t + \theta_i) \cos (\omega_o (t + \tau) + \theta_i) \, dt \\[4pt] &= \dfrac{1}{2} a^2 \cos (\omega_o \tau); \\[4pt] V^t &= R^t(0) = \dfrac{a^2}{2}. \end{align*}
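These limiting time averages can be approximated on a long finite window from a single realization, and compared against the ensemble results. A sketch, with the same arbitrary \(a\), \(\omega_o\), and \(\tau\) as before and one fixed (arbitrary) phase \(\theta_i\):

```python
import numpy as np

# Time averages of ONE realization y_i(t) = a*cos(w0*t + theta_i),
# approximated on a long window [0, T]. Ergodicity says these should
# match the ensemble statistics: mean 0, R(tau) = (a**2/2)*cos(w0*tau).
a, w0, theta_i = 2.0, 1.5, 0.9
T = 4000.0
t = np.linspace(0.0, T, 2_000_000)       # dense uniform grid over [0, T]
y = a * np.cos(w0 * t + theta_i)

tau = 0.7
y_shift = a * np.cos(w0 * (t + tau) + theta_i)

m_time = y.mean()              # -> ~0
R_time = (y * y_shift).mean()  # -> ~(a**2/2)*cos(w0*tau)
V_time = (y * y).mean()        # -> ~a**2/2
print(m_time, R_time, V_time)
```

Because the window covers roughly a thousand periods, the finite-\(T\) error in each average is on the order of \(1/(\omega_o T)\), i.e., negligible here.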

    So a sinusoid at random phase is an ergodic process. Indeed, this form is a foundation for modeling natural random processes such as ocean waves, atmospheric conditions, and various types of noise. In particular, it can be verified that the construction

    \[ y(t) = \sum_{n=1}^N a_n \cos (\omega_n t + \theta_n), \]

where the \(\theta_n\) are independently and uniformly distributed in \([0, \, 2 \pi]\), is stationary and ergodic. It has mean zero, and autocorrelation

\[ R(\tau) = \sum_{n=1}^{N} \dfrac{a_n ^2}{2} \cos (\omega_n \tau). \]
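The construction and its autocorrelation can be verified numerically by drawing an ensemble of independent phase vectors. The amplitudes and frequencies below are arbitrary illustrative values, not taken from the text.

```python
import numpy as np

# y(t) = sum_n a_n cos(w_n t + theta_n), with independent uniform phases.
# The ensemble autocorrelation should equal sum_n (a_n**2/2) cos(w_n tau).
rng = np.random.default_rng(2)
a_n = np.array([1.0, 0.5, 0.25])
w_n = np.array([0.8, 1.7, 2.9])
M = 200_000                                      # ensemble members
theta = rng.uniform(0, 2 * np.pi, size=(M, 3))   # independent phase per component

def y(t):
    # Evaluate all M realizations at a single time t
    return np.sum(a_n * np.cos(w_n * t + theta), axis=1)

tau = 0.6
R_hat = np.mean(y(1.3) * y(1.3 + tau))           # any sample time t works
R_theory = np.sum(a_n**2 / 2 * np.cos(w_n * tau))
print(R_hat, R_theory)
```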

    We now make two side notes. Under stationary and ergodic conditions, the autocorrelation function is symmetric on positive and negative \(\tau\) because we can always write

    \[ R(\tau) = E(x(t)x(t + \tau)) = E(x(t' - \tau)x(t')), \textrm{ where } t' = t + \tau. \]

Furthermore, we have the inequality that \(R(0) \geq |R(\tau)|\) for any \(\tau\). To see this,

\begin{align} 0 \leq E[(x(t) + x(t + \tau))^2] \, &= \, E[x(t)^2] + 2E[x(t)x(t + \tau)] + E[x(t + \tau)^2] \\[4pt] &= \, 2R(0) + 2R(\tau); \textrm{ similarly, } \nonumber \\[4pt] 0 \leq E[(x(t) - x(t + \tau))^2] \, &= E[x(t)^2] - 2E[x(t)x(t + \tau)] + E[x(t + \tau)^2] \\[4pt] &= \, 2R(0) - 2R(\tau). \nonumber \end{align}

    The only way both of these can be true is if \(R(0) \geq |R(\tau)|\).
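Both properties can be spot-checked on the closed-form autocorrelation of the sum-of-sinusoids process, \(R(\tau) = \sum_n (a_n^2/2) \cos(\omega_n \tau)\). The amplitudes and frequencies here are arbitrary illustrative values.

```python
import numpy as np

# Spot-check: R(tau) = R(-tau), and R(0) >= |R(tau)| for all tau.
a_n = np.array([1.0, 0.5, 0.25])
w_n = np.array([0.8, 1.7, 2.9])

def R(tau):
    # Closed-form autocorrelation, vectorized over an array of lags
    tau = np.atleast_1d(tau)
    return np.sum(a_n**2 / 2 * np.cos(w_n * tau[:, None]), axis=1)

taus = np.linspace(-10.0, 10.0, 2001)
print(np.allclose(R(taus), R(-taus)))              # symmetry in tau
print(np.all(R(np.array([0.0]))[0] >= np.abs(R(taus)) - 1e-12))  # R(0) bounds |R(tau)|
```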


    This page titled 4.3: Stationarity is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Franz S. Hover & Michael S. Triantafyllou (MIT OpenCourseWare) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.