
8.3: Making Random Numbers

Franz S. Hover & Michael S. Triantafyllou
    Massachusetts Institute of Technology via MIT OpenCourseWare

    The Monte Carlo method requires that we feed into our evaluation \(g(x)\) a group of \(N\) random numbers (or sets of random numbers), drawn from a distribution (or, when \(x\) has more than one element, a set of distributions). Here we describe how to generate such data from simple distributions.

    Note that both the normal and the uniform distributions are available through standard MATLAB commands.
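As an illustration (a sketch in Python rather than the MATLAB syntax itself), the standard library offers analogous draws: `random.random` plays the role of a uniform generator, and `random.gauss` the role of a standard normal generator. The sample size and seed below are arbitrary choices.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

N = 10_000

# Uniform samples on [0, 1) -- the analogue of a uniform generator
u = [random.random() for _ in range(N)]

# Standard normal samples (zero mean, unity variance)
g = [random.gauss(0.0, 1.0) for _ in range(N)]

mean_u = sum(u) / N   # should be near 1/2
mean_g = sum(g) / N   # should be near 0
```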

    In particular, we describe how to generate samples from a given distribution using random numbers drawn from an underlying uniform distribution. First, note that the cumulative probability function of the uniform distribution is

    \[ P(w) = \begin{cases} 0, \quad w \leq 0 \\[4pt] w, \quad 0 < w < 1 \\[4pt] 1, \quad w \geq 1 \end{cases} \]

    If \(x = r(w)\) where \(r\) is the transformation we seek, recall that the cumulative probabilities are

    \[ P(x) \, = \, P(r(w)) \, = \, P(w) \, = \, w, \] and the result we need is that \[ w \, = \, P(x) \, = \int\limits_{- \infty}^{x} p(t) \, dt. \] Our task is to come up with an \(x\) that goes with the uniformly distributed \(w\); it is not as hard as it might seem. As an example, suppose we want to generate a normal variable \(x\) (zero mean, unity variance). We have

    \begin{align} P(x) \, = \int\limits_{- \infty}^{x} \dfrac{1}{\sqrt{2 \pi}} e^{-t^2 / 2} \, dt \, &= \, w \\[4pt] F(x) \, &= \, w, \textrm { or} \\[4pt] x \, &= \, F^{-1} (w), \end{align}

    where \(F(x)\) is the cumulative probability function of a standard Gaussian variable (zero mean, unity variance), and can be looked up or calculated with standard routines. Note \(F(x)\) is related within a scale factor to the error function (\( \textrm{erf} \)).
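To sample the standard normal by the inverse-transform recipe above, \(F^{-1}\) can be evaluated numerically; in Python, `statistics.NormalDist.inv_cdf` provides such a routine. A sketch, with an arbitrary seed and sample size:

```python
import random
from statistics import NormalDist, fmean

random.seed(1)  # fixed seed so the run is reproducible
F_inv = NormalDist(mu=0.0, sigma=1.0).inv_cdf  # F^{-1} for the standard Gaussian

# Draw uniform w, then set x = F^{-1}(w); random() returns values in [0, 1),
# and with this seed w = 0 (where F^{-1} is undefined) does not occur
xs = [F_inv(random.random()) for _ in range(20_000)]

mean_x = fmean(xs)               # should be near 0
var_x = fmean(x * x for x in xs) # should be near 1
```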

    As another example, consider the exponential distribution

    \[ p(x) \, = \, \lambda e^{- \lambda x}; \] this distribution is often used to describe the time to failure in complex systems. We have

    \begin{align} P(x) \, = \int\limits_{0}^{x} \lambda e^{- \lambda t} \, dt \, &= \, w \\[4pt] 1 - e^{- \lambda x} \, &= \, w \textrm { or} \\[4pt] x \, &= \, - \dfrac{\log (1-w)}{\lambda}. \end{align}
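A sketch of this exponential sampler in Python; the rate \(\lambda = 0.5\) is an arbitrary choice for illustration, and the sample mean should approach \(1/\lambda\):

```python
import math
import random

random.seed(2)  # fixed seed so the run is reproducible
lam = 0.5       # assumed rate parameter, chosen for illustration

# Inverse-transform sample: x = -log(1 - w) / lambda, with w uniform on [0, 1)
xs = [-math.log(1.0 - random.random()) / lam for _ in range(50_000)]

mean_x = sum(xs) / len(xs)  # should be near 1/lambda = 2
```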

    Similarly, this procedure applied to the Rayleigh distribution \[ p(x) \, = \, x e^{-x^2 / 2} \] gives \(x = \sqrt{-2 \log (1-w)}\). In these formulas, we can replace \((1-w)\) with \(w\) throughout; since \(w\) is uniformly distributed on the interval \([0, \, 1]\), \(1-w\) has the same distribution and the two forms are equivalent.
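The Rayleigh transformation can be checked the same way: the Rayleigh mean is \(\sqrt{\pi/2} \approx 1.2533\), which gives a target for the sample average. A sketch, with arbitrary seed and sample size:

```python
import math
import random

random.seed(3)  # fixed seed so the run is reproducible

# Rayleigh sample via x = sqrt(-2 log(1 - w)), w uniform on [0, 1);
# replacing (1 - w) by w would be statistically equivalent
xs = [math.sqrt(-2.0 * math.log(1.0 - random.random())) for _ in range(50_000)]

mean_x = sum(xs) / len(xs)  # should be near sqrt(pi/2) ~ 1.2533
```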


    This page titled 8.3: Making Random Numbers is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Franz S. Hover & Michael S. Triantafyllou (MIT OpenCourseWare) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.