
5.4: Joint Events and Conditional Probabilities


    You may be interested in the probability that the symbol chosen has two different properties. For example, what is the probability that the freshman chosen is a woman from Texas? Can we find this, \(p(W, TX)\), if we know the probability that the choice is a woman, \(p(W)\), and the probability that the choice is from Texas, \(p(TX)\)?

    Not in general. It might be that 47% of the freshmen are women, and it might be that (say) 5% of the freshmen are from Texas, but those facts alone do not guarantee that there are any women freshmen from Texas, let alone how many there might be.

    However, if it is known or assumed that the two events are independent (the probability of one does not depend on whether the other event occurs), then the probability of the joint event (both happening) can be found. It is the product of the probabilities of the two events. In our example, if the percentage of women among freshmen from Texas is known to be the same as the percentage of women among all freshmen, then

    \(p(W, TX) = p(W)p(TX) \tag{5.4}\)
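
    As a quick numerical check of Equation 5.4, here is a minimal Python sketch using the hypothetical figures above (47% women, 5% Texans) and assuming the two attributes are independent:

    ```python
    # Hypothetical figures from the example: 47% of freshmen are women,
    # 5% are from Texas.  If (and only if) the two attributes are
    # independent, the joint probability is the product of the two.
    p_W = 0.47    # p(W): probability the chosen freshman is a woman
    p_TX = 0.05   # p(TX): probability the chosen freshman is from Texas

    p_W_TX = p_W * p_TX   # p(W, TX) under the independence assumption
    print(p_W_TX)         # 0.0235, i.e. 2.35%
    ```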

    Since it is unusual for two events to be independent, a more general formula for joint events is needed. This formula makes use of “conditional probabilities,” which are probabilities of one event given that another event is known to have happened. In our example, the conditional probability of the selection being a woman, given that the freshman selected is from Texas, is denoted \(p(W | TX)\) where the vertical bar, read “given,” separates the two events—the conditioning event on the right and the conditioned event on the left. If the two events are independent, then the probability of the conditioned event is the same as its normal, or “unconditional” probability.

    In terms of conditional probabilities, the probability of a joint event is the probability of one of the events times the probability of the other event given that the first event has happened:

    \(\begin{align*} p(A, B) & = p(B)p(A \;|\; B) \\ & = p(A)p(B \;| \;A) \tag{5.5} \end{align*} \)

    Note that either event can be used as the conditioning event, so there are two formulas for this joint probability. Using these formulas you can calculate one of the conditional probabilities from the other, even if you don’t care about the joint probability.
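
    The following short Python sketch illustrates Equation 5.5 with hypothetical numbers (the helper names are ours, not part of the text): it computes the joint probability from one conditional probability, then recovers the other conditional probability, as described above.

    ```python
    # A minimal sketch of Equation 5.5, using made-up values.
    # Given p(B) and p(A | B) we can get the joint p(A, B); if p(A) is
    # also known we can recover the other conditional, p(B | A).
    def joint_from_conditional(p_B, p_A_given_B):
        """p(A, B) = p(B) p(A | B)"""
        return p_B * p_A_given_B

    def other_conditional(p_A, p_B, p_A_given_B):
        """p(B | A) = p(B) p(A | B) / p(A)   (assumes p(A) > 0)"""
        return joint_from_conditional(p_B, p_A_given_B) / p_A

    # Hypothetical numbers:
    p_A, p_B, p_A_given_B = 0.30, 0.20, 0.45
    p_AB = joint_from_conditional(p_B, p_A_given_B)          # 0.09
    p_B_given_A = other_conditional(p_A, p_B, p_A_given_B)   # 0.30
    print(p_AB, p_B_given_A)
    ```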

    This formula is known as Bayes’ Theorem, after Thomas Bayes, the eighteenth-century English mathematician who first articulated it. We will use Bayes’ Theorem frequently. This theorem has remarkable generality. It is true if the two events are physically or logically related, and it is true if they are not. It is true if one event causes the other, and it is true if that is not the case. It is true if the outcome is known, and it is true if the outcome is not known.

    Thus the probability \(p(W, TX)\) that the student chosen is a woman from Texas is the probability \(p(TX)\) that a student from Texas is chosen, times the probability \(p(W \;|\; TX)\) that a woman is chosen given that the choice is a Texan. It is also the probability \(p(W)\) that a woman is chosen, times the probability \(p(TX \;|\; W)\) that someone from Texas is chosen given that the choice is a woman.

    \(\begin{align*} p(W, TX) & = p(TX)p(W \;|\; TX) \\ & = p(W)p(TX \;|\; W) \tag{5.6} \end{align*} \)

    As another example, consider the table of students above, and assume that one is picked from the entire student population “at random” (meaning with equal probability for all individual students). What is the probability \(p(M, G)\) that the choice is a male graduate student? This is a joint probability, and we can use Bayes’ Theorem if we can discover the necessary conditional probability.

    The fundamental partition in this case is the 10,220 fundamental events in which a particular student is chosen. The sum of all these probabilities is 1, and by assumption all are equal, so each probability is 1/10,220, or about 0.01%.

    The probability \(p(G)\) that the selection is a graduate student is the sum of the probabilities of the 6,048 fundamental events associated with graduate students, so \(p(G) = 6,048/10,220\).

    Given that the selection is a graduate student, what is the conditional probability that the choice is a man? We now look at the set of graduate students and the selection of one of them. The new fundamental partition is the 6,048 possible choices of a graduate student, and we see from the table above that 4,226 of these are men. The probabilities of this new (conditional) selection can be found as follows. The original choice was “at random” so all students were equally likely to have been selected. In particular, all graduate students were equally likely to have been selected, so the new probabilities will be the same for all 6,048. Since their sum is 1, each probability is 1/6,048. The event of selecting a man is associated with 4,226 of these new fundamental events, so the conditional probability \(p(M | G)\) = 4,226/6,048. Therefore from Bayes’ Theorem:

    \[\begin{align*} p(M, G) & = p(G)p(M \;|\; G) \\ & = \dfrac{6,048}{10,220} \times \dfrac{4,226}{6,048} \\ & = \dfrac{4,226}{10,220} \tag{5.7} \end{align*} \]

    This problem can be approached the other way around: the probability of choosing a man is \(p(M) = 6,541/10,220\), and the probability of the choice being a graduate student given that it is a man is \(p(G \;|\; M) = 4,226/6,541\), so (of course the answer is the same)

    \[\begin{align*} p(M, G) & = p(M)p(G \;|\; M) \\ & = \dfrac{6,541}{10,220} \times \dfrac{4,226}{6,541} \\ & = \dfrac{4,226}{10,220} \tag{5.8} \end{align*} \]
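
    As a sanity check, here is a small Python sketch (using exact rational arithmetic) that reproduces Equations 5.7 and 5.8 from the enrollment counts used above and confirms that the two factorizations agree:

    ```python
    from fractions import Fraction

    # Counts from the enrollment table used in the example.
    total      = 10220   # all students
    grads      = 6048    # graduate students
    men        = 6541    # male students
    male_grads = 4226    # male graduate students

    p_G         = Fraction(grads, total)        # p(G)
    p_M         = Fraction(men, total)          # p(M)
    p_M_given_G = Fraction(male_grads, grads)   # p(M | G)
    p_G_given_M = Fraction(male_grads, men)     # p(G | M)

    # Equations 5.7 and 5.8: both factorizations give the same joint probability.
    assert p_G * p_M_given_G == p_M * p_G_given_M == Fraction(male_grads, total)
    print(float(p_G * p_M_given_G))   # ≈ 0.4135, i.e. 4,226/10,220
    ```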


    This page titled 5.4: Joint Events and Conditional Probabilities is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Paul Penfield, Jr. (MIT OpenCourseWare) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.