Week 2: Statistics 2

Publish Date: June 10, 2024

Independence of Multiple Random Variables

Introduction

Independence is a fundamental concept in probability theory and statistics. When two or more random variables are independent, the occurrence of one does not affect the probability distribution of the others. This concept can be extended from two variables to multiple random variables.

Definition

Let \( X_1, X_2, \ldots, X_n \) be random variables. They are said to be independent if, for any sets \( A_1, A_2, \ldots, A_n \) of real numbers, the following holds:

\[
P(X_1 \in A_1, X_2 \in A_2, \ldots, X_n \in A_n) = \prod_{i=1}^{n} P(X_i \in A_i)
\]

In simpler terms, the joint probability of the variables taking on any combination of values is equal to the product of their individual probabilities.

Mathematical Explanation

Consider two random variables \( X \) and \( Y \). They are independent if, for all real numbers \( x \) and \( y \):

\[
P(X \le x, Y \le y) = P(X \le x)\, P(Y \le y)
\]

For multiple random variables \( X_1, X_2, \ldots, X_n \), independence means that for all real numbers \( x_1, x_2, \ldots, x_n \):

\[
P(X_1 \le x_1, X_2 \le x_2, \ldots, X_n \le x_n) = \prod_{i=1}^{n} P(X_i \le x_i)
\]

In terms of joint and marginal densities, if \( X_1, X_2, \ldots, X_n \) are continuous random variables with joint density \( f(x_1, x_2, \ldots, x_n) \) and marginal densities \( f_{X_1}, f_{X_2}, \ldots, f_{X_n} \), then they are independent if:

\[
f(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i)
\]

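For discrete variables, the same product rule can be checked directly from a probability table. The sketch below (hypothetical joint pmfs, not taken from these notes) compares the joint pmf against the product of its marginals:

```python
# Sketch: test independence of two discrete random variables by
# comparing the joint pmf with the product of its marginals.

def is_independent(joint, tol=1e-12):
    """joint: dict mapping (x, y) -> P(X = x, Y = y)."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p  # marginal of X
        py[y] = py.get(y, 0.0) + p  # marginal of Y
    # Independent iff P(X = x, Y = y) = P(X = x) * P(Y = y) for every
    # pair of values, including pairs with zero joint probability.
    return all(abs(joint.get((x, y), 0.0) - px[x] * py[y]) <= tol
               for x in px for y in py)

# Two fair coins flipped independently: every pair has probability 1/4.
fair = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(is_independent(fair))        # True

# Perfectly correlated coins (X = Y): the product rule fails.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
print(is_independent(correlated))  # False
```

The `is_independent` helper is an illustration only; in practice one would also validate that the table sums to one.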
Examples

Example 1: Coin Flips

Consider flipping three coins. Let \( X_1 \), \( X_2 \), and \( X_3 \) represent the outcomes of each flip, where 1 indicates heads and 0 indicates tails. Each flip is independent of the others. The probability of getting heads on each flip is \( \frac{1}{2} \). Therefore, for any particular sequence of outcomes \( (x_1, x_2, x_3) \):

\[
P(X_1 = x_1, X_2 = x_2, X_3 = x_3) = \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{8}
\]

\begin{table}[h!]
\centering
\begin{tabular}{|c|c|c|c|c|c|}
\hline
Outcome & \( X_1 \) & \( X_2 \) & \( X_3 \) & Probability Calculation & Probability \\
\hline
1 & 0 & 0 & 0 & \( \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} \) & \( \frac{1}{8} \) \\
\hline
2 & 0 & 0 & 1 & \( \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} \) & \( \frac{1}{8} \) \\
\hline
3 & 0 & 1 & 0 & \( \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} \) & \( \frac{1}{8} \) \\
\hline
4 & 0 & 1 & 1 & \( \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} \) & \( \frac{1}{8} \) \\
\hline
5 & 1 & 0 & 0 & \( \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} \) & \( \frac{1}{8} \) \\
\hline
6 & 1 & 0 & 1 & \( \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} \) & \( \frac{1}{8} \) \\
\hline
7 & 1 & 1 & 0 & \( \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} \) & \( \frac{1}{8} \) \\
\hline
8 & 1 & 1 & 1 & \( \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} \) & \( \frac{1}{8} \) \\
\hline
\end{tabular}
\caption{Outcomes and Probabilities of Flipping Three Coins}
\end{table}
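The eight equally likely outcomes can be reproduced by brute-force enumeration; a minimal Python sketch using exact fractions:

```python
from itertools import product
from fractions import Fraction

# Enumerate the 8 outcomes of three independent fair coin flips and
# confirm each has probability (1/2)^3 = 1/8, matching the table.
half = Fraction(1, 2)
outcomes = {flips: half ** 3 for flips in product((0, 1), repeat=3)}

assert len(outcomes) == 8
assert all(p == Fraction(1, 8) for p in outcomes.values())
assert sum(outcomes.values()) == 1  # the eight probabilities sum to one
print(outcomes[(1, 0, 1)])  # 1/8
```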

Example 2: Independent Normal Variables

Let \( X \) and \( Y \) be independent random variables with standard normal distributions, \( X \sim N(0, 1) \) and \( Y \sim N(0, 1) \). Their joint density is the product of the marginal densities:

\[
f_{X,Y}(x, y) = f_X(x)\, f_Y(y) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \cdot \frac{1}{\sqrt{2\pi}} e^{-y^2/2} = \frac{1}{2\pi} e^{-(x^2 + y^2)/2}
\]

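As a quick numerical sanity check, the product of two standard normal marginal densities \( \frac{1}{\sqrt{2\pi}} e^{-t^2/2} \) should equal \( \frac{1}{2\pi} e^{-(x^2 + y^2)/2} \) at every point:

```python
import math

# Verify numerically that the product of two standard normal marginal
# densities equals the joint density 1/(2*pi) * exp(-(x^2 + y^2)/2).

def std_normal_pdf(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def joint_pdf(x, y):
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

for x, y in [(0.0, 0.0), (1.0, -0.5), (2.3, 1.7)]:
    assert math.isclose(std_normal_pdf(x) * std_normal_pdf(y), joint_pdf(x, y))
print("marginals factorize the joint density")
```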
Example 3: Rolling Dice

Consider rolling two six-sided dice. Let \( X \) and \( Y \) be the results of the first and second dice, respectively. The outcomes of the dice are independent. The probability of any specific combination of outcomes \( (x, y) \) is:

\[
P(X = x, Y = y) = P(X = x)\, P(Y = y) = \frac{1}{6} \cdot \frac{1}{6} = \frac{1}{36}
\]

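The dice example can likewise be checked by enumerating all ordered pairs with exact arithmetic:

```python
from fractions import Fraction
from itertools import product

# Enumerate all outcomes of two independent fair dice: each of the
# 36 ordered pairs has probability (1/6) * (1/6) = 1/36.
p_single = Fraction(1, 6)
joint = {(x, y): p_single * p_single
         for x, y in product(range(1, 7), repeat=2)}

assert len(joint) == 36
assert all(p == Fraction(1, 36) for p in joint.values())
assert sum(joint.values()) == 1  # probabilities over all pairs sum to one
print(joint[(3, 5)])  # 1/36
```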
Properties

  1. Symmetry: Independence is symmetric, meaning if \( X \) is independent of \( Y \), then \( Y \) is independent of \( X \).
  2. Non-transitivity: Independence is not transitive. If \( X \) and \( Y \) are independent, and \( Y \) and \( Z \) are independent, it does not follow that \( X \) and \( Z \) are independent; for example, take \( Z = X \) with \( Y \) independent of \( X \).
  3. Pairwise Independence: Pairwise independence does not imply mutual independence. Three random variables can be pairwise independent but not mutually independent.
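The classic construction for property 3 takes two independent fair coins \( X \) and \( Y \) and sets \( Z = X \oplus Y \) (XOR): every pair is independent, but knowing any two variables determines the third. A Python sketch verifying this exactly:

```python
from itertools import product
from fractions import Fraction

# X and Y are independent fair coins; Z = X XOR Y. Each pair among
# (X, Y, Z) is independent, but the three are not mutually independent.
quarter = Fraction(1, 4)
# Joint distribution over (x, y, z), with z forced to x ^ y.
joint = {(x, y, x ^ y): quarter for x, y in product((0, 1), repeat=2)}

def prob(event):
    """Probability that an outcome (x, y, z) satisfies `event`."""
    return sum(p for xyz, p in joint.items() if event(xyz))

half = Fraction(1, 2)
# Pairwise independence: every pair of coordinates factors as 1/2 * 1/2.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product((0, 1), repeat=2):
        assert prob(lambda w: w[i] == a and w[j] == b) == half * half

# Mutual independence fails: P(X=0, Y=0, Z=0) is 1/4, not (1/2)^3 = 1/8.
print(prob(lambda w: w == (0, 0, 0)))  # 1/4
```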