
3.3: Bayes' Rule

    • Franz S. Hover & Michael S. Triantafyllou
    • Massachusetts Institute of Technology via MIT OpenCourseWare

    Consider a composite event \(M\) and a simple event \(A_i\). From the definition of conditional probability in the previous section, we know that

    \begin{align*} p(A_i|M) = \dfrac{p(A_i \cap M)}{p(M)} \\[4pt] p(M|A_i) = \dfrac{p(A_i \cap M)}{p(A_i)}, \end{align*}

    and if we eliminate the common joint probability \(p(A_i \cap M)\) between these two equations, we find that

    \begin{align*} p(M|A_i) = \dfrac{p(A_i|M) \, p(M)}{p(A_i)} \\[4pt] p(A_i|M) = \dfrac{p(M|A_i) \, p(A_i)}{p(M)}. \end{align*}

    The second of these is the more interesting: it gives the probability of a simple event, conditioned on the composite event, in terms of the composite event conditioned on the simple one. Recalling the total probability formula for \(p(M)\) from the previous section, we thus derive Bayes' rule:

    \[ p(A_i|M) = \dfrac{p(M|A_i) \, p(A_i)}{p(M|A_1) \, p(A_1) + \cdots + p(M|A_n) \, p(A_n)}. \]
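    As a sketch of the formula above (the function name and argument layout are illustrative, not from the text), Bayes' rule maps a prior \(p(A_i)\) and likelihoods \(p(M|A_i)\) over an exhaustive, mutually exclusive partition into a posterior:

    ```python
    def bayes(prior, likelihood):
        """Posterior p(A_i|M) for each A_i in a partition.

        prior[i]      = p(A_i), with the A_i exhaustive and mutually exclusive
        likelihood[i] = p(M|A_i)
        """
        # Total probability: p(M) = sum_i p(M|A_i) p(A_i)
        p_M = sum(l * p for l, p in zip(likelihood, prior))
        return [l * p / p_M for l, p in zip(likelihood, prior)]
    ```

    The denominator is exactly the total-probability sum in the rule; dividing by it guarantees the posterior probabilities sum to one.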

    Here is an example of its use.

    Example \(\PageIndex{1}\)

    Consider a medical test that is 99% accurate: it gives a negative result for 99% of the people who do not have the disease, and a positive result for 99% of the people who do. Only one percent of the population has this disease. Joe just got a positive test result: what is the probability that he has the disease?

    Solution

    The composite event \(M\) is that he has the disease, and the simple events are that he tested positive \((+)\) or he tested negative \((-)\). We apply

    \begin{align*} p(M|+) &= \dfrac{p(+|M) \, p(M)}{p(+)} \\[4pt] &= \dfrac{p(+|M) \, p(M)} {p(+|M) \, p(M) + p(+|\bar{M}) \, p(\bar{M})} \\[4pt] &= \dfrac{0.99 \times 0.01} {0.99 \times 0.01 + 0.01 \times 0.99} \\[4pt] &= 1/2. \end{align*}

    This example is not well appreciated by many healthcare consumers!
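    A quick numeric check of this calculation (variable names are illustrative):

    ```python
    p_M = 0.01                # prior: 1% of the population has the disease
    p_pos_given_M = 0.99      # positive result given disease
    p_pos_given_not_M = 0.01  # positive result given no disease

    # Total probability of a positive test
    p_pos = p_pos_given_M * p_M + p_pos_given_not_M * (1 - p_M)

    # Bayes' rule: p(M|+)
    posterior = p_pos_given_M * p_M / p_pos
    print(posterior)  # 0.5
    ```

    The two products in the denominator are equal (0.99 × 0.01 on both sides), which is the symmetry that makes the answer come out to exactly one half.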

    Here is another example, without so many symmetries.

    Example \(\PageIndex{2}\)

    Box A has nine red pillows in it and one white. Box B has six red pillows in it and nine white. Selecting a box at random and pulling out a pillow at random gives the result of a red pillow. What is the probability that it came from Box A?

    Solution

    \(M\) is the composite event that it came from Box A; the simple event is that a red pillow was collected \((R)\). Noting that \(p(R|M) = 9/10 = 0.9\) and \(p(R|\bar{M}) = 6/15 = 0.4\), we have

    \begin{align*} p(M|R) &= \dfrac{p(R|M) \, p(M)}{p(R)} \\[4pt] &= \dfrac{p(R|M) \, p(M)} {p(R|M) \, p(M) + p(R|\bar{M}) \, p(\bar{M})} \\[4pt] &= \dfrac{0.9 \times 0.5} {0.9 \times 0.5 + 0.4 \times 0.5} \\[4pt] &= 0.692. \end{align*}
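    The same check works here (again, names are illustrative); the box contents give the likelihoods directly:

    ```python
    p_A = 0.5             # each box is equally likely to be chosen
    p_R_given_A = 9 / 10  # Box A: 9 red pillows out of 10
    p_R_given_B = 6 / 15  # Box B: 6 red pillows out of 15

    # Total probability of drawing a red pillow
    p_R = p_R_given_A * p_A + p_R_given_B * (1 - p_A)

    # Bayes' rule: p(Box A | red)
    posterior = p_R_given_A * p_A / p_R
    print(round(posterior, 3))  # 0.692
    ```

    The exact value is \(0.45 / 0.65 = 9/13\), which rounds to the 0.692 quoted above.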


    This page titled 3.3: Bayes' Rule is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Franz S. Hover & Michael S. Triantafyllou (MIT OpenCourseWare) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.