5 Independence
Two events which have nothing to do with each other are said to be independent. More precisely, if two events are independent, the occurrence of one does not affect the probability of the other. Formally:
Definition 5.1 (Independence of two events) Two events \(A\) and \(B\) are independent if \(P(A \cap B ) = P(A)P(B)\).
Note that the three equations below are equivalent, meaning that any one of them implies the other two:
- \(P(A \cap B ) = P(A)P(B)\)
- \(P(A|B) = P(A)\)
- \(P(B|A) = P(B)\)
Because of this, we can check whether \(A\) and \(B\) are independent by checking if any one of these equalities holds (or fails to hold).
Example 5.1 (Flip a coin twice) Flip a coin twice and record the flips. Let \(A_1\) be the event that flip one is heads and \(A_2\) be the event that flip two is heads. Our intuition tells us that the two flips should be independent; the outcome of the first flip has no bearing on the second flip, so we should have \(P(A_2 | A_1) = P(A_2) = 1/2\). So, what is the probability that both flips are heads? We have \[ P(A_1 \cap A_2) = P(A_1) P(A_2) = 1/2 \times 1/2 = 1/4. \]
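The calculation above can be checked by simulation. The sketch below (in Python; the seed and number of trials are arbitrary choices) flips two fair coins many times and estimates \(P(A_1 \cap A_2)\), which should come out close to \(1/4\):

```python
import random

random.seed(0)
trials = 100_000
both_heads = 0
for _ in range(trials):
    flip1 = random.random() < 0.5  # A1: first flip is heads
    flip2 = random.random() < 0.5  # A2: second flip is heads
    if flip1 and flip2:
        both_heads += 1

estimate = both_heads / trials
print(estimate)  # close to 0.25
```

Because the two calls to `random.random()` are unrelated, the simulated flips are independent by construction, and the empirical frequency of "both heads" approaches \(P(A_1)P(A_2) = 1/4\) as the number of trials grows.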
Example 5.2 (Flat tire on bike) Let \(A\) be the event that your bike gets a flat tire on a given day and let \(B\) be the event that you have forgotten to bring a spare tube. Suppose \(P(A) = 0.02\) and \(P(B)=0.10\) and \(P(A \cap B) = 0.002.\) Then the events are independent, since \(P(A\cap B) = P(A)P(B)\).
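Since the three equalities listed earlier are equivalent, we can confirm this conclusion numerically in more than one way. A small sketch (plain arithmetic, no special assumptions beyond the probabilities given in the example):

```python
p_a = 0.02        # P(A): bike gets a flat tire
p_b = 0.10        # P(B): forgot to bring a spare tube
p_a_and_b = 0.002  # P(A and B), as given in the example

# Check 1: P(A and B) = P(A) * P(B)
check_product = abs(p_a_and_b - p_a * p_b) < 1e-12

# Check 2 (equivalent): P(A | B) = P(A and B) / P(B) should equal P(A)
p_a_given_b = p_a_and_b / p_b
check_conditional = abs(p_a_given_b - p_a) < 1e-12

print(check_product, check_conditional)  # both True
```

Either check alone suffices; computing both illustrates that \(P(A \cap B) = P(A)P(B)\) and \(P(A \mid B) = P(A)\) are the same condition in disguise.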
Our definition of independence concerns only two events. The way we define independence among a collection of more than two events is a little different. We call it mutual independence.
Definition 5.2 (Mutual independence among a collection of events) Events \(A_1,A_2,\dots\) are mutually independent if for any subcollection \(A_{i_1},\dots,A_{i_K}\), we have \[ P\left(\bigcap_{j=1}^K A_{i_j}\right) = \prod_{j=1}^K P(A_{i_j}) \]
It looks like we are getting a little crazy with the indices in this definition. Basically the definition says that if I take any number of the events from among \(A_1,A_2,\dots\), the probability of their intersection must equal the product of their individual probabilities. If there are only two events \(A_1\) and \(A_2\), mutual independence reduces to the ordinary independence of Definition 5.1.
In the next example we use the definition of mutual independence.
Example 5.3 (Survey responses) Suppose you send out a survey to \(10\) randomly selected people and suppose that each person will complete the survey with probability \(0.20\). Define the events \[ \begin{array}{cc} R_1 \text{:} & \text{1st person responds} \\ \vdots& \vdots\\ R_{10}\text{:} & \text{10th person responds} \end{array} \]
Assume \(R_1,\dots,R_{10}\) are mutually independent events.
- What is the probability that everyone completes the survey? \[ \begin{align*} P(\text{everyone completes survey}) & = P(R_1 \cap R_2\cap \cdots \cap R_{10}) \\ & = P(R_1) \times P(R_2) \times \cdots \times P(R_{10}) \\ & = 0.20^{10} \\ & = 0.0000001024 = 1.024\times 10^{-7} \end{align*} \]
- What is the probability that no one completes the survey? \[ \begin{align*} P(\text{no one completes survey}) & = P(R_1^c \cap R_2^c\cap \cdots \cap R_{10}^c) \\ & = P(R_1^c) \times P(R_2^c) \times \cdots \times P(R_{10}^c) \\ & = 0.80^{10} \\ & = 0.1074 \end{align*} \]
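Both answers follow from mutual independence by multiplying ten identical factors, so each is just a power. A quick check of the arithmetic (the variable names are my own, not from the example):

```python
p = 0.20   # probability each person responds
n = 10     # number of people surveyed

# Everyone responds: product of P(R_1), ..., P(R_10) = 0.20^10
p_everyone = p ** n

# No one responds: product of P(R_1^c), ..., P(R_10^c) = 0.80^10
p_no_one = (1 - p) ** n

print(p_everyone)        # about 1.024e-07
print(round(p_no_one, 4))  # 0.1074
```

Note that the complements \(R_1^c,\dots,R_{10}^c\) are also mutually independent whenever \(R_1,\dots,R_{10}\) are, which is what justifies the second product.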