Independence is a fundamental concept in probability theory and statistics. When two or more random variables are independent, the value taken by one carries no information about the distribution of the others. The concept extends naturally from two variables to any finite collection of random variables.
Definition
Let $X_1, X_2, \dots, X_n$ be random variables. They are said to be independent if for any sets of real numbers $A_1, A_2, \dots, A_n$, the following holds:

$$P(X_1 \in A_1, X_2 \in A_2, \dots, X_n \in A_n) = \prod_{i=1}^{n} P(X_i \in A_i)$$
In simpler terms, the joint probability of the variables taking on any combination of values is equal to the product of their individual probabilities.
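As a quick numerical companion to this definition, the following Python sketch (the helper name is_independent is illustrative, not from the text) checks whether a discrete joint distribution factors into the product of its marginals:

```python
import numpy as np

def is_independent(joint, tol=1e-12):
    """joint: 2-D array with joint[i, j] = P(X = i, Y = j)."""
    px = joint.sum(axis=1)                     # marginal P(X = i)
    py = joint.sum(axis=0)                     # marginal P(Y = j)
    return np.allclose(joint, np.outer(px, py), atol=tol)

# Two independent fair coins: every cell is (1/2)(1/2) = 1/4.
print(is_independent(np.full((2, 2), 0.25)))   # True

# A dependent pair (Y always equals X): mass sits on the diagonal only.
print(is_independent(np.array([[0.5, 0.0],
                               [0.0, 0.5]])))  # False
```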
Mathematical Explanation
Consider two random variables $X$ and $Y$. They are independent if:

$$P(X \le x, \, Y \le y) = P(X \le x) \, P(Y \le y) \quad \text{for all } x, y$$

For multiple random variables $X_1, X_2, \dots, X_n$, independence means:

$$P(X_1 \le x_1, \dots, X_n \le x_n) = \prod_{i=1}^{n} P(X_i \le x_i) \quad \text{for all } x_1, \dots, x_n$$

In terms of joint and marginal densities, if $X_1, \dots, X_n$ are continuous random variables with joint density $f_{X_1, \dots, X_n}$ and marginal densities $f_{X_1}, \dots, f_{X_n}$, then they are independent if:

$$f_{X_1, \dots, X_n}(x_1, \dots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i)$$
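As an illustrative check of the density criterion (the Exponential(1) choice is just an example, not from the text), one can integrate $y$ out of a product-form joint density and confirm that the marginal density of $X$ is recovered:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Joint density of two independent Exp(1) variables: it factors by independence.
def joint_density(x, y):
    return stats.expon.pdf(x) * stats.expon.pdf(y)

# Integrating out y should recover the Exp(1) marginal density of X.
for x in [0.25, 1.0, 2.5]:
    marginal_x, _ = quad(lambda y: joint_density(x, y), 0, np.inf)
    print(round(marginal_x, 6), round(stats.expon.pdf(x), 6))  # values agree
```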
Examples
Example 1: Coin Flips
Consider flipping three coins. Let $X_1$, $X_2$, and $X_3$ represent the outcomes of each flip, where 1 indicates heads and 0 indicates tails. Each flip is independent of the others. The probability of getting heads on each flip is $\frac{1}{2}$. Therefore:

$$P(X_1 = 1, X_2 = 1, X_3 = 1) = P(X_1 = 1) \, P(X_2 = 1) \, P(X_3 = 1) = \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{8}$$
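A short simulation corroborates this (a minimal sketch; the seed and trial count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate three independent fair coin flips per trial (1 = heads, 0 = tails).
n_trials = 100_000
flips = rng.integers(0, 2, size=(n_trials, 3))

# Empirical P(all three heads) vs. the product of the marginal probabilities.
print(np.mean(flips.sum(axis=1) == 3))   # close to 0.125
print(0.5 ** 3)                          # 0.125 exactly
```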
Example 2: Independent Normal Variables
Let $X$ and $Y$ be independent random variables with standard normal distributions, $X \sim N(0, 1)$ and $Y \sim N(0, 1)$. Their joint density is the product of the marginal densities:

$$f_{X,Y}(x, y) = f_X(x) \, f_Y(y) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \cdot \frac{1}{\sqrt{2\pi}} e^{-y^2/2} = \frac{1}{2\pi} e^{-(x^2 + y^2)/2}$$
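The sketch below spot-checks this factorization numerically at a few points (a minimal illustration, using scipy.stats.norm for the marginals):

```python
import numpy as np
from scipy.stats import norm

# Joint density of two independent standard normals, written in closed form.
def joint_density(x, y):
    return np.exp(-(x**2 + y**2) / 2) / (2 * np.pi)

# At each point, the joint density equals the product of the marginals.
for x, y in [(0.0, 0.0), (1.0, -0.5), (2.0, 1.5)]:
    print(joint_density(x, y), norm.pdf(x) * norm.pdf(y))  # pairs agree
```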
Example 3: Rolling Dice
Consider rolling two six-sided dice. Let $X$ and $Y$ be the results of the first and second dice, respectively. The outcomes of the dice are independent. The probability of any specific combination of outcomes (e.g., $X = 3$ and $Y = 5$) is:

$$P(X = 3, Y = 5) = P(X = 3) \, P(Y = 5) = \frac{1}{6} \cdot \frac{1}{6} = \frac{1}{36}$$
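Because the sample space is small, this can be verified exactly by enumeration (a minimal sketch using exact fractions):

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes of two fair six-sided dice.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(outcomes))                 # each outcome has probability 1/36

p_joint = sum(p for x, y in outcomes if x == 3 and y == 5)
p_x3 = sum(p for x, _ in outcomes if x == 3)   # P(X = 3) = 1/6
p_y5 = sum(p for _, y in outcomes if y == 5)   # P(Y = 5) = 1/6

print(p_joint, p_x3 * p_y5)                    # both are 1/36
```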
Properties
Symmetry: Independence is symmetric, meaning if $X$ is independent of $Y$, then $Y$ is independent of $X$.
Non-transitivity: Independence is not transitive. If $X$ and $Y$ are independent, and $Y$ and $Z$ are independent, it does not follow that $X$ and $Z$ are independent. For example, if $X$ and $Y$ are independent fair coin flips and $Z = X$, then $Y$ is independent of both $X$ and $Z$, yet $X$ and $Z$ are identical and hence completely dependent.
Pairwise Independence: Pairwise independence does not imply mutual independence. Three random variables can be pairwise independent but not mutually independent; a classic example is $Z = X \oplus Y$, the XOR of two independent fair coin flips, demonstrated in the sketch after this list.
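The following sketch works through that classic counterexample exactly: $X$ and $Y$ are independent fair coin flips and $Z = X \oplus Y$. Every pair of variables satisfies the product rule, but the triple does not:

```python
from itertools import product
from fractions import Fraction

# The four equally likely outcomes (x, y, z) with z = x XOR y.
outcomes = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]
p = Fraction(1, len(outcomes))                # each outcome has probability 1/4

def prob(event):
    return sum(p for o in outcomes if event(o))

# Pairwise independence holds: P(X=1, Z=1) = P(X=1) P(Z=1) = 1/4.
print(prob(lambda o: o[0] == 1 and o[2] == 1),
      prob(lambda o: o[0] == 1) * prob(lambda o: o[2] == 1))

# Mutual independence fails: P(X=1, Y=1, Z=1) = 0, but the product is 1/8.
print(prob(lambda o: o[0] == 1 and o[1] == 1 and o[2] == 1),
      prob(lambda o: o[0] == 1) * prob(lambda o: o[1] == 1)
                                * prob(lambda o: o[2] == 1))
```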