Week 2: Statistics 2

Publish Date: June 10, 2024

Independence of Two Random Variables

In probability theory and statistics, two random variables are said to be independent if the value taken by one does not affect the probability distribution of the other. Equivalently, the joint probability distribution of the two variables is the product of their individual (marginal) probability distributions.
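For example, if two fair coins are flipped separately, the probability that both land heads is (1/2)(1/2) = 1/4: the outcome of one flip tells us nothing about the other.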

Joint Probability Mass Function (PMF)

The joint PMF of two discrete random variables X and Y is the function that gives the probability that X = x and Y = y occur simultaneously:

p_{X,Y}(x, y) = P(X = x, Y = y)

When X and Y are independent, this factors as p_{X,Y}(x, y) = p_X(x) p_Y(y), where p_X and p_Y are the marginal PMFs of X and Y, respectively.
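
As a concrete sketch, a joint PMF over small finite ranges can be stored as a 2-D array. The array below is purely illustrative (the specific numbers are assumptions for demonstration, not from the notes):

import numpy as np

# Illustrative joint PMF for X in {0, 1} and Y in {0, 1, 2};
# entry joint[x, y] stores P(X = x, Y = y).
joint = np.array([
    [0.10, 0.20, 0.10],
    [0.15, 0.30, 0.15],
])

# A valid joint PMF is non-negative and sums to 1 over all (x, y) pairs.
assert (joint >= 0).all()
assert np.isclose(joint.sum(), 1.0)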

Marginal Probability Mass Function (PMF)

The marginal PMF of a random variable is obtained by summing (or integrating, in the continuous case) the joint PMF over all possible values of the other random variable:

p_X(x) = \sum_{y} p_{X,Y}(x, y)    and    p_Y(y) = \sum_{x} p_{X,Y}(x, y)
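
In code, the marginals fall out of the joint array as row and column sums. Continuing the illustrative array from the previous sketch:

import numpy as np

# Illustrative joint PMF from above: joint[x, y] = P(X = x, Y = y).
joint = np.array([
    [0.10, 0.20, 0.10],
    [0.15, 0.30, 0.15],
])

# Marginal PMF of X: sum over all values of Y (axis 1, the columns).
p_x = joint.sum(axis=1)   # array([0.4, 0.6])

# Marginal PMF of Y: sum over all values of X (axis 0, the rows).
p_y = joint.sum(axis=0)   # array([0.25, 0.5, 0.25])

print(p_x, p_y)
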
Independence

Two random variables X and Y are independent if and only if their joint PMF is the product of their marginal PMFs:

p_{X,Y}(x, y) = p_X(x) p_Y(y)    for all x and y
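
This definition translates directly into a check: the joint array must equal the outer product of the two marginals. A minimal sketch on the same illustrative array (which happens to factor exactly):

import numpy as np

# Illustrative joint PMF: joint[x, y] = P(X = x, Y = y).
joint = np.array([
    [0.10, 0.20, 0.10],
    [0.15, 0.30, 0.15],
])

p_x = joint.sum(axis=1)   # marginal PMF of X
p_y = joint.sum(axis=0)   # marginal PMF of Y

# Independence: joint[x, y] == p_x[x] * p_y[y] for every pair (x, y),
# i.e. the joint PMF equals the outer product of its marginals.
print(np.allclose(joint, np.outer(p_x, p_y)))   # True for this array
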
Example

Consider two fair six-sided dice. Let X be the number rolled on the first die and Y be the number rolled on the second die.

The joint PMF of X and Y is

p_{X,Y}(x, y) = P(X = x, Y = y) = 1/36    for x, y = 1, 2, ..., 6,

since each of the 36 possible outcome pairs is equally likely.

The marginal PMFs of X and Y are

p_X(x) = \sum_{y=1}^{6} 1/36 = 1/6    and    p_Y(y) = \sum_{x=1}^{6} 1/36 = 1/6.

Since p_{X,Y}(x, y) = 1/36 = (1/6)(1/6) = p_X(x) p_Y(y) for all x and y, X and Y are independent.
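
A quick Monte Carlo sketch of the same conclusion: simulate many pairs of rolls, tabulate the empirical joint frequencies, and check that they approach the product of the empirical marginals (the sample size and seed below are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Roll two fair six-sided dice n times; values are 1..6.
x = rng.integers(1, 7, size=n)
y = rng.integers(1, 7, size=n)

# Empirical joint PMF: fraction of trials landing on each (x, y) pair.
joint = np.zeros((6, 6))
np.add.at(joint, (x - 1, y - 1), 1)
joint /= n

p_x = joint.sum(axis=1)   # each entry is close to 1/6
p_y = joint.sum(axis=0)   # each entry is close to 1/6

# Every cell should be near 1/36 = (1/6)(1/6); the gap shrinks as n grows.
print(np.abs(joint - np.outer(p_x, p_y)).max())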

Joint PMF Array Table (every cell is 1/36):

       y=1    y=2    y=3    y=4    y=5    y=6
x=1    1/36   1/36   1/36   1/36   1/36   1/36
x=2    1/36   1/36   1/36   1/36   1/36   1/36
x=3    1/36   1/36   1/36   1/36   1/36   1/36
x=4    1/36   1/36   1/36   1/36   1/36   1/36
x=5    1/36   1/36   1/36   1/36   1/36   1/36
x=6    1/36   1/36   1/36   1/36   1/36   1/36

Marginal PMF Equations:

p_X(x) = 1/6 for x = 1, 2, ..., 6
p_Y(y) = 1/6 for y = 1, 2, ..., 6