3. Joint and Conditional Distributions#
Errata#
At 10:29 in the lecture, there is an extra \(e\) in the denominator that shouldn't be there. It appears to be a typo, as it does not affect the final result.
Joint Distributions#
A joint distribution describes the behavior of two or more random variables simultaneously. If \(X = (X_1, X_2, ..., X_n)\) is a vector of random variables, then the joint probability density function (PDF) \(f(x_1, x_2, ..., x_n)\) gives the relative likelihood (density) that \(X_1\) takes the value \(x_1\), \(X_2\) takes the value \(x_2\), and so on.
The joint cumulative distribution function (CDF) \(F(x_1, x_2, ..., x_n)\) gives the probability that each of \(X_1, X_2, ..., X_n\) is less than or equal to \(x_1, x_2, ..., x_n\) respectively.
For two dimensions, i.e., when \(X = (X_1, X_2)\), we write these as:
Joint PDF: \(f(x_1, x_2)\)
Joint CDF: \(F(x_1, x_2)\)
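As a concrete sketch (not from the lecture), the snippet below uses SciPy's `multivariate_normal` with an arbitrarily chosen mean and covariance to evaluate a joint PDF \(f(x_1, x_2)\) and joint CDF \(F(x_1, x_2)\) for a bivariate normal at a single point.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Bivariate normal used purely as an example joint distribution;
# the mean vector and covariance matrix here are arbitrary choices.
rv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, 0.5], [0.5, 1.0]])

point = np.array([1.0, 0.5])  # a particular (x1, x2)

# Joint PDF f(x1, x2): a density at the point, not a probability.
print("f(1.0, 0.5) =", rv.pdf(point))

# Joint CDF F(x1, x2) = P(X1 <= x1, X2 <= x2).
print("F(1.0, 0.5) =", rv.cdf(point))
```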
Conditional Distributions#
A conditional distribution describes the behavior of one or more random variables given the values of some other variables. If we want the distribution of \(X_1\) given the value of \(X_2\), written \(f(x_1 | x_2)\), it is defined as the ratio of the joint distribution to the marginal distribution of \(X_2\). The marginal distribution, \(f(x_2)\), is obtained by integrating the joint distribution over all values of \(X_1\):