4  Conditional probability

Author

Karl Gregory

Two events can be related to one another such that the occurrence of one event makes the occurrence of the other event more or less likely.

Definition 4.1 (Conditional probability) For events \(A\) and \(B\) with \(P(B) > 0\), the conditional probability of \(A\) given \(B\) is the probability that \(A\) occurs given that \(B\) occurs. This probability is defined as \[ P(A|B) = \frac{P(A\cap B)}{P(B)}. \]

Example 4.1 (Exam pass and homework completion) On the day of an exam, select a student at random and let \(A\) be the event that he or she passes the exam.

  • Now consider only students who completed the last homework assignment. One would expect the probability of the event \(A\) to be higher when sampling a student from this group than when sampling from the whole class.
  • On the other hand, consider only students who did not complete the last homework assignment. One would expect the probability of the event \(A\) to be lower when sampling a student from this group than when sampling from the whole class.

Consider selecting a student at random from the whole class and letting \(B\) be the event that the selected student completed the last homework assignment. Then:

  • The conditional probability \(P(A|B)\) is the probability that a student selected at random from the group of students who completed the last homework assignment will pass the exam. We expect this to be higher than \(P(A)\), which is the probability that a student drawn at random from the entire class will pass the exam.
  • The conditional probability \(P(A|B^c)\) is the probability that a student selected at random from the group of students who did not complete the last homework assignment will pass the exam. We expect this to be lower than \(P(A)\), which is the probability that a student drawn from the whole class will pass the exam.

Suppose that two-thirds of the students in the class completed the last homework assignment and that one-half of the class both completed the last homework assignment and passed the exam. Then we would write \[ P(A|B) = \frac{P(A \cap B)}{P(B)} = \frac{1/2}{2/3} = 0.75. \]
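As a quick numerical check, the calculation above can be reproduced in a few lines of code. The Python sketch below (variable names chosen for illustration) simply restates the two fractions given in the example.

    from fractions import Fraction

    p_B = Fraction(2, 3)        # P(B): proportion of the class that completed the homework
    p_A_and_B = Fraction(1, 2)  # P(A ∩ B): completed the homework and passed the exam

    p_A_given_B = p_A_and_B / p_B            # P(A|B) = P(A ∩ B) / P(B)
    print(p_A_given_B, float(p_A_given_B))   # prints: 3/4 0.75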

Example 4.2 (Meeting friends at a party) Consider the statistical experiment of going to a party and noting which of your friends you meet there. Let \(A\) be the event that you meet Adrian at the party and \(B\) be the event that you meet Beverly. Suppose \(P(A) = 0.4\), \(P(B) = 0.6\), and \(P(A \cap B) = 0.35\). Then:

  1. The probability that you meet Adrian given that you meet Beverly is \[ \begin{align*} P(A|B) & = \frac{P(A \cap B)}{P(B)} \\ & = \frac{0.35}{0.6} \\ & = 0.5833333. \end{align*} \]
  2. The probability that you meet Adrian given that you do not meet Beverly is \[ \begin{align*} P(A|B^c) & = \frac{P(A \cap B^c)}{P(B^c)} \\ & = \frac{P(A) - P(A \cap B)}{1 - P(B)} \\ & = \frac{0.4 - 0.35}{1 - 0.6} \\ & = 0.125. \end{align*} \]

An interpretation of the above is that if you meet Beverly, you are then more likely to meet Adrian; if you do not meet Beverly, you are less likely to meet Adrian.
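The same kind of check works here. The short Python sketch below (variable names chosen for illustration) reproduces both conditional probabilities from the given marginal and intersection probabilities.

    p_A = 0.4          # P(A): meet Adrian
    p_B = 0.6          # P(B): meet Beverly
    p_A_and_B = 0.35   # P(A ∩ B): meet both

    # P(A|B) = P(A ∩ B) / P(B)
    p_A_given_B = p_A_and_B / p_B
    print(round(p_A_given_B, 7))    # 0.5833333

    # P(A|B^c) = (P(A) - P(A ∩ B)) / (1 - P(B))
    p_A_given_Bc = (p_A - p_A_and_B) / (1 - p_B)
    print(round(p_A_given_Bc, 3))   # 0.125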

Exercise 4.1 (Gamecocks versus Aggies) Suppose \(80250\) spectators are in attendance at the next Gamecocks versus Aggies football game, and one lucky winner will be selected at random to receive free “big gulps” for life from the concession stand. Suppose \(20000\) of the spectators are students and that there are \(10000\) Aggies present, \(200\) of whom are students. Let \(G\) be the event that a Gamecock is chosen and let \(S\) be the event that a student is chosen. Find:

  1. \(P(G)\), the probability that a Gamecock is chosen.1
  2. \(P(S)\), the probability that a student is chosen.2
  3. \(P(G^c)\), the probability that an Aggie is chosen.3
  4. \(P(G^c \cap S)\), the probability that an Aggie student is chosen.4
  5. \(P(G|S)\), the probability a Gamecock is chosen, given that a student is chosen.5
  6. \(P(S|G^c)\), the probability a student is chosen, given an Aggie is chosen.6

We can begin making a table of the spectators based on the provided information as follows:

                 Gamecock    Aggie    Total
    Student         19800      200    20000
    Non-student     50450     9800    60250
    Total           70250    10000    80250
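The counts in this table are all that is needed for the six probabilities requested above. A small Python sketch of the arithmetic (variable names chosen for illustration) is given below; the results agree with the footnoted answers.

    n_total             = 80250   # all spectators
    n_students          = 20000   # students
    n_aggies            = 10000   # Aggies
    n_aggie_students    = 200     # Aggie students
    n_gamecock_students = n_students - n_aggie_students   # 19800 Gamecock students

    p_G    = (n_total - n_aggies) / n_total    # P(G)       = 70250/80250 ≈ 0.8754
    p_S    = n_students / n_total              # P(S)       = 20000/80250 ≈ 0.2492
    p_Gc   = n_aggies / n_total                # P(G^c)     = 10000/80250 ≈ 0.1246
    p_Gc_S = n_aggie_students / n_total        # P(G^c ∩ S) =   200/80250 ≈ 0.0025

    p_G_given_S  = (n_gamecock_students / n_total) / p_S   # P(G|S)   = 19800/20000 = 0.99
    p_S_given_Gc = p_Gc_S / p_Gc                           # P(S|G^c) =   200/10000 = 0.02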

Note that when we “condition” on an event, we may regard the sample points identified with the conditioning event as the new sample space.

A direct consequence of the definition of conditional probability is the following result, which states that we can always compute an intersection probability as a conditional probability times an unconditional probability.

Proposition 4.1 (Multiplicative rule of probability) For any two events \(A\) and \(B\) we have \[ P(A\cap B) = P(A|B)P(B) \] or, equivalently, \[ P(A\cap B) = P(B|A)P(A). \]

A probability which is not a conditional probability is sometimes called a marginal probability. So, Proposition 4.1 states that an intersection probability can always be written as a conditional probability times a marginal probability.
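For instance, plugging the numbers from Example 4.1 into the multiplicative rule recovers the intersection probability we started with: \[ P(A \cap B) = P(A|B)P(B) = 0.75 \times \frac{2}{3} = \frac{1}{2}. \]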

Exercise 4.2 (Safari) Suppose you go on a safari in South Africa and make a list of the animals you see. Let \(G\) be the event that you see a giraffe, \(W\) the event that you see a wildebeest, and \(C\) the event that you see a crocodile. Assume

  • \(P(W) = 0.40\)
  • \(P(C) = 0.60\)
  • \(P(G) = 0.20\)
  • \(P(C|W) = 0.775\)
  • \(P(C|G) = 0.65\)
  • \(P(G\cap W) = 0.06\)
  • \(P(G\cap W\cap C) = 0.01\)

Fill in all the spaces on a Venn diagram (all possible disjoint events).7
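One way to organize the computation is sketched below in Python (variable names chosen for illustration): the multiplicative rule supplies the two missing pairwise intersections, as in the footnote, and the remaining disjoint regions of the Venn diagram then follow by subtraction.

    # Given probabilities, restated from the list above
    p_W, p_C, p_G = 0.40, 0.60, 0.20
    p_C_given_W, p_C_given_G = 0.775, 0.65
    p_GW, p_GWC = 0.06, 0.01

    # Multiplicative rule for the missing pairwise intersections
    p_WC = p_C_given_W * p_W   # P(W ∩ C) = P(C|W)P(W) ≈ 0.31
    p_GC = p_C_given_G * p_G   # P(G ∩ C) = P(C|G)P(G) ≈ 0.13

    # Pairwise-only regions: each pairwise intersection minus the triple intersection
    p_GW_only = p_GW - p_GWC   # giraffe and wildebeest, no crocodile
    p_GC_only = p_GC - p_GWC   # giraffe and crocodile, no wildebeest
    p_WC_only = p_WC - p_GWC   # wildebeest and crocodile, no giraffe

    # Single-animal regions: each marginal minus the overlaps it contains
    p_G_only = p_G - p_GW_only - p_GC_only - p_GWC
    p_W_only = p_W - p_GW_only - p_WC_only - p_GWC
    p_C_only = p_C - p_GC_only - p_WC_only - p_GWC

    # Probability of seeing none of the three animals
    p_none = 1 - (p_G_only + p_W_only + p_C_only
                  + p_GW_only + p_GC_only + p_WC_only + p_GWC)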


  1. \(P(G) = 70250/80250 = 0.8754\)↩︎

  2. \(P(S) = 20000/80250 = 0.2492\)↩︎

  3. \(P(G^c) = 1- P(G) = 0.1246\)↩︎

  4. \(P(G^c \cap S) = 200/80250 = 0.0025\)↩︎

  5. We have \[ P(G|S) = \frac{P(G \cap S)}{P(S)}, \] so we need \(P(G \cap S) = 19800/80250\) as well as \(P(S) = 20000/80250\). The answer is thus \[ P(G|S) = P(G \cap S) \times \frac{1}{P(S)} = \frac{19800}{80250} \times \frac{80250}{20000} = \frac{19800}{20000} = 0.99. \] ↩︎

  6. \(P(S|G^c) = \dfrac{P(G^c \cap S)}{P(G^c)} = \dfrac{200}{10000} = 0.02.\)↩︎

  7. We will need the probability \[ P(W \cap C) = P(C|W)P(W) = 0.775 \times 0.40 = 0.31 \] as well as \[ P(G \cap C) = P(C|G)P(G) = 0.65 \times 0.20 = 0.13. \]↩︎