Conditional Probability Mass Function (PMF) of More than Two Discrete Random Variables
Introduction
In probability theory, the Conditional Probability Mass Function (PMF) provides the probability distribution of a discrete random variable given that certain conditions are satisfied by other random variables. When dealing with more than two discrete random variables, the conditional PMF describes the probability of one variable given specific values of the other variables.
Definition
Let $X_1, X_2, \ldots, X_n$ be discrete random variables, and suppose $P(X_2 = x_2, \ldots, X_n = x_n) > 0$. The conditional PMF of $X_1$ given $X_2 = x_2, \ldots, X_n = x_n$ is
$$P_{X_1 \mid X_2, \ldots, X_n}(x_1 \mid x_2, \ldots, x_n) = P(X_1 = x_1 \mid X_2 = x_2, \ldots, X_n = x_n).$$
This can be computed using the joint PMF:
$$P(X_1 = x_1 \mid X_2 = x_2, \ldots, X_n = x_n) = \frac{P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n)}{P(X_2 = x_2, \ldots, X_n = x_n)},$$
where $P(X_2 = x_2, \ldots, X_n = x_n)$ is the marginal PMF of $X_2, \ldots, X_n$, obtained by summing the joint PMF over all values of $x_1$.
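For concreteness, with three variables the definition reads:
$$P(X_1 = x_1 \mid X_2 = x_2, X_3 = x_3) = \frac{P(X_1 = x_1, X_2 = x_2, X_3 = x_3)}{P(X_2 = x_2, X_3 = x_3)},$$
valid whenever $P(X_2 = x_2, X_3 = x_3) > 0$.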
Calculation
To calculate the conditional PMF:
- Determine the Joint PMF: The joint PMF of $X_1, X_2, \ldots, X_n$ is $p(x_1, x_2, \ldots, x_n) = P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n)$.
- Determine the Marginal PMF: The marginal PMF of $X_2, \ldots, X_n$ is obtained by summing the joint PMF over all values of $x_1$: $p(x_2, \ldots, x_n) = \sum_{x_1} p(x_1, x_2, \ldots, x_n)$.
- Apply the Formula: Substitute the joint and marginal PMFs into the conditional PMF formula: $p(x_1 \mid x_2, \ldots, x_n) = \dfrac{p(x_1, x_2, \ldots, x_n)}{p(x_2, \ldots, x_n)}$.
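The steps above can be sketched in code. This is a minimal illustration, not a library API: the function name `conditional_pmf` and the dictionary representation of a joint PMF (outcome tuples mapped to probabilities) are assumptions made for this example.

```python
from fractions import Fraction

def conditional_pmf(joint, given):
    """Conditional PMF of X1 given (X2, ..., Xn) = `given`.

    `joint` maps outcome tuples (x1, ..., xn) to probabilities;
    `given` is the tuple (x2, ..., xn) being conditioned on.
    """
    # Step 2: marginal PMF of (X2, ..., Xn) -- sum the joint PMF over x1.
    marginal = sum(p for outcome, p in joint.items() if outcome[1:] == given)
    if marginal == 0:
        raise ValueError("conditioning event has probability zero")
    # Step 3: divide the joint PMF by the marginal.
    return {outcome[0]: p / marginal
            for outcome, p in joint.items() if outcome[1:] == given}

# Step 1: a small joint PMF for (X1, X2) where the variables are dependent.
joint = {(0, 0): Fraction(1, 2), (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4)}
cond = conditional_pmf(joint, (0,))  # P(X1 = x1 | X2 = 0)
```

Here the marginal is $P(X_2 = 0) = \frac{1}{2} + \frac{1}{4} = \frac{3}{4}$, so the conditional probabilities are $\frac{2}{3}$ and $\frac{1}{3}$.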
Meaning and Importance
Understanding conditional PMFs is crucial for:
- Modeling Dependencies: It helps in understanding the dependency of one variable on the others in a probabilistic sense.
- Bayesian Inference: Conditional probabilities are foundational in Bayesian statistics, allowing updates of beliefs based on new evidence.
- Data Analysis: In multivariate data analysis, conditional PMFs help in understanding relationships between different variables.
Example: Coin Toss
Consider a scenario with three fair coins. Let $X_1$, $X_2$, and $X_3$ represent the outcomes of the three tosses, where each $X_i$ takes the value $1$ for heads and $0$ for tails.
The joint PMF can be defined as follows:
$$P(X_1 = x_1, X_2 = x_2, X_3 = x_3) = \frac{1}{8}$$
for all $(x_1, x_2, x_3) \in \{0, 1\}^3$, because each coin has two possible outcomes, and all eight combinations are equally likely.
To find the conditional PMF of $X_1$ given $X_2 = x_2$ and $X_3 = x_3$:
- Joint PMF: $P(X_1 = x_1, X_2 = x_2, X_3 = x_3) = \frac{1}{8}$.
- Marginal PMF: $P(X_2 = x_2, X_3 = x_3) = \sum_{x_1 \in \{0, 1\}} P(X_1 = x_1, X_2 = x_2, X_3 = x_3) = \frac{1}{8} + \frac{1}{8} = \frac{1}{4}$.
- Conditional PMF: $P(X_1 = x_1 \mid X_2 = x_2, X_3 = x_3) = \dfrac{1/8}{1/4} = \dfrac{1}{2}$.
So, given $X_2 = x_2$ and $X_3 = x_3$, the conditional PMF of $X_1$ assigns probability $\frac{1}{2}$ to each of $x_1 = 0$ and $x_1 = 1$.
This result is expected because the outcomes are independent. Hence, knowing the outcomes of $X_2$ and $X_3$ provides no information about $X_1$: the conditional PMF coincides with the marginal PMF of $X_1$.
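The coin-toss calculation can be checked numerically. This is a sketch using exact fractions; the variable names and the choice of the conditioning pair $(x_2, x_3) = (1, 0)$ are arbitrary illustrative choices.

```python
from fractions import Fraction
from itertools import product

# Joint PMF of three fair coin tosses: every outcome in {0,1}^3 has probability 1/8.
joint = {outcome: Fraction(1, 8) for outcome in product((0, 1), repeat=3)}

# Fix the conditioning values; any pair in {0,1}^2 gives the same result.
x2, x3 = 1, 0

# Marginal PMF of (X2, X3): sum the joint PMF over x1.
marginal = sum(joint[(x1, x2, x3)] for x1 in (0, 1))  # 1/8 + 1/8 = 1/4

# Conditional PMF of X1 given X2 = x2, X3 = x3: joint divided by marginal.
conditional = {x1: joint[(x1, x2, x3)] / marginal for x1 in (0, 1)}
```

Each conditional probability comes out to $\frac{1}{2}$, matching the hand calculation and confirming that conditioning on the other two coins leaves the distribution of $X_1$ unchanged.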