
3.4: Random Variables

    • Franz S. Hover & Michael S. Triantafyllou
    • Massachusetts Institute of Technology via MIT OpenCourseWare

    Now we assign to each event \(A_i\) in the sample space a given value: each \(A_i\) corresponds with an \(x_i\). For instance, a coin toss resulting in heads could be equated with a $1 reward, and each tails could trigger a $1 loss. Dollar figures could be assigned to each of the faces of a die. Hence we see that if each event \(A_i\) has a probability, then so will the numerical values \(x_i\).

    The average value of \(x_i\) can, of course, be approximated by sampling the space \(N\) times, summing all the \(x\)'s, and dividing by \(N\); as \(N\) grows, this computation gives an increasingly accurate result. In terms of probabilities, the formula for the expected value is

    \[ \bar{x} = E(x) = \sum_{i=1}^{n} p(A_i) x_i. \]

    The equivalence of this expected value with the numerical average is seen as follows: if the space is sampled \(N\) times and the result \([A_i, x_i]\) occurs \(k_i\) times, then \( p(A_i) \simeq k_i / N \), so that \( \sum_i p(A_i) x_i \simeq \frac{1}{N} \sum_i k_i x_i \), which is precisely the sum of all the sampled \(x\)'s divided by \(N\).
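    As a quick numerical check (not part of the original text), the coin-toss example above can be sketched in Python: heads pays +$1, tails costs $1, and the sample average over many tosses approaches the expected value. The variable names and the choice of \(N\) are illustrative assumptions.

    ```python
    import random

    # Illustrative sketch of the coin-toss reward: heads -> +$1, tails -> -$1.
    outcomes = {+1: 0.5, -1: 0.5}   # maps each value x_i to its probability p(A_i)

    # Expected value: E(x) = sum of p(A_i) * x_i over all events.
    expected = sum(p * x for x, p in outcomes.items())   # exactly 0.0 for a fair coin

    # Sample-average approximation: draw N outcomes and average them.
    random.seed(0)                  # fixed seed so the sketch is reproducible
    N = 100_000
    samples = [random.choice([+1, -1]) for _ in range(N)]
    average = sum(samples) / N      # approaches E(x) as N grows
    ```

    With \(N = 100{,}000\) samples, the empirical average lands within a few thousandths of the exact expected value.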

    Superposition is an important property of the expectation operator:

    \[ E(x+y) = E(x) + E(y). \]
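    Superposition can be verified term by term on a small joint sample space. As an illustrative sketch (the two-dice setup is an assumption, not from the text), let \(x\) and \(y\) be the faces of two fair dice, giving 36 equally likely joint outcomes:

    ```python
    from itertools import product

    # Illustrative sketch: x and y are the faces of two fair dice,
    # so the joint sample space has 36 equally likely outcomes.
    faces = range(1, 7)
    p = 1 / 36   # probability of each joint outcome (x_i, y_j)

    E_x   = sum(x * p for x, y in product(faces, faces))        # 3.5
    E_y   = sum(y * p for x, y in product(faces, faces))        # 3.5
    E_sum = sum((x + y) * p for x, y in product(faces, faces))  # 7.0

    # Superposition: E(x + y) = E(x) + E(y)
    ```

    Note that superposition requires no independence assumption; it holds for any joint distribution, since the sum defining \(E(x+y)\) splits linearly.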

    The mean of a function of \(x\) is defined using probabilities of the random variable \(x\):

    \[ E[f(x(\xi))] = \sum_{i=1}^{n} f(x_i) p_i . \]
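    The key point of this formula is that \(f(x)\) is weighted by the probabilities of \(x\) itself, so no new distribution is needed. A minimal sketch, assuming a fair die and the illustrative choice \(f(x) = x^2\):

    ```python
    # Illustrative sketch: E[f(x)] for a fair die with f(x) = x^2.
    faces = [1, 2, 3, 4, 5, 6]
    probs = [1 / 6] * 6

    def f(x):
        return x ** 2

    # E[f(x)] = sum of f(x_i) * p_i, using the probabilities of x itself
    E_f = sum(f(xi) * pi for xi, pi in zip(faces, probs))   # 91/6
    ```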

    Another important property of a random variable is the variance, a measure of how much \(x\) varies from its own mean:

    \begin{align} \sigma ^2 &= E \left[ (x - \bar{x}) ^2 \right] \\[4pt] &= E(x^2) - \bar{x} ^2. \end{align}

    The second line follows from expanding \( (x - \bar{x})^2 \) and noting that \(E(-2 x \bar{x}) = -2 \bar{x} E(x) = -2 \bar{x}^2\), which cancels one of the \(\bar{x}^2\) terms. Note that we use the symbol \( \sigma ^2 \) for the variance; the standard deviation \(\sigma\) is just its square root, and has the same units as the random variable \(x\).
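    The two lines of the variance definition can be checked against each other numerically. A short sketch, again assuming a fair die (an illustrative choice, not from the text):

    ```python
    # Illustrative sketch: variance of a fair die's face value, computed two ways.
    faces = [1, 2, 3, 4, 5, 6]
    probs = [1 / 6] * 6

    mean = sum(xi * pi for xi, pi in zip(faces, probs))   # x-bar = 3.5

    # First line of the definition: sigma^2 = E[(x - x_bar)^2]
    var_def = sum((xi - mean) ** 2 * pi for xi, pi in zip(faces, probs))

    # Second line: sigma^2 = E(x^2) - x_bar^2
    var_alt = sum(xi ** 2 * pi for xi, pi in zip(faces, probs)) - mean ** 2

    std_dev = var_def ** 0.5   # standard deviation, same units as x
    ```

    Both forms give \( \sigma^2 = 35/12 \approx 2.92 \) for the fair die, confirming the algebraic identity.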


    This page titled 3.4: Random Variables is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Franz S. Hover & Michael S. Triantafyllou (MIT OpenCourseWare) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.