Multiple Random Variables#

Random variables \( X \) and \( Y \) can either represent two distinct one-dimensional random variables or serve as components of a single two-dimensional random vector. In both cases, their outcomes are considered within the joint sample space, which corresponds to the \( xy \)-plane.

Overview of Joint Distribution Function#

The joint distribution function \( F_{X,Y}(x, y) \) quantifies the probability that the outcome of an experiment results in a sample point located within the quadrant \((- \infty < X \leq x, -\infty < Y \leq y)\) of the joint sample space. This is mathematically defined as:

\[ F_{X,Y}(x, y) = \Pr ( X \leq x, Y \leq y ) \]

This function encapsulates the cumulative behavior of both random variables \( X \) and \( Y \) across the sample space.

If \( F_{X,Y}(x, y) \) is continuous and its second-order partial derivatives exist and are also continuous, the joint probability density function \( f_{X,Y}(x, y) \) is defined as:

\[ f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x \partial y} \]

This density function \( f_{X,Y}(x, y) \) describes how the probability mass is distributed over the \( xy \)-plane.
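As a sketch of this relationship, consider a hypothetical pair of independent Exp(1) random variables (an assumption made here purely for illustration): their joint CDF factors as \( F(x,y) = (1-e^{-x})(1-e^{-y}) \) and the joint density is \( f(x,y) = e^{-(x+y)} \). A finite-difference approximation of the mixed second partial derivative of \( F \) recovers \( f \):

```python
import numpy as np

# Hypothetical example: X, Y independent Exp(1), so
#   F(x, y) = (1 - e^-x)(1 - e^-y)   and   f(x, y) = e^-(x + y).
# We recover f from F via a central finite-difference estimate of
# the mixed second partial derivative d^2 F / (dx dy).

def F(x, y):
    return (1 - np.exp(-x)) * (1 - np.exp(-y))

def f(x, y):
    return np.exp(-(x + y))

def mixed_partial(F, x, y, h=1e-4):
    # Central difference for the mixed second partial derivative.
    return (F(x + h, y + h) - F(x + h, y - h)
            - F(x - h, y + h) + F(x - h, y - h)) / (4 * h**2)

x, y = 0.7, 1.3
print(mixed_partial(F, x, y))  # numerically close to f(0.7, 1.3)
print(f(x, y))
```

The finite-difference value agrees with the analytic density to several decimal places, illustrating that differentiating the joint CDF in both arguments yields the joint PDF.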

Overview of Conditional Probability Density Function#

Conditional Probability Concept#

When analyzing two continuous random variables \( X \) and \( Y \), their joint behavior is described by the joint probability density function \( f_{X,Y}(x, y) \). To focus on the behavior of \( Y \) given that \( X = x \), we use the conditional probability density function \( f_{Y|X}(y|x) \).

Definition of Conditional Density#
The conditional density of \( Y \) given \( X = x \) is expressed as:

\[ f_{Y|X}(y|x) = \frac{f_{X,Y}(x, y)}{f_X(x)} \]

This is valid as long as \( f_X(x) > 0 \), where \( f_X(x) \) represents the marginal probability density function of \( X \).
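To make the formula concrete, here is a small worked example under an assumed joint density (not taken from the text): \( f_{X,Y}(x,y) = x + y \) on the unit square. Integrating out \( y \) gives the marginal \( f_X(x) = x + \tfrac{1}{2} \), so the conditional density is \( f_{Y|X}(y|x) = (x+y)/(x+\tfrac{1}{2}) \):

```python
# Illustrative example (assumed, not from the text): joint density
# f(x, y) = x + y on the unit square [0, 1] x [0, 1].

def f_joint(x, y):
    return x + y          # valid for 0 <= x, y <= 1

def f_X(x):
    # Marginal of X: integral of (x + y) dy over y in [0, 1].
    return x + 0.5

def f_cond(y, x):
    # Conditional density of Y given X = x; requires f_X(x) > 0.
    return f_joint(x, y) / f_X(x)

print(f_cond(0.25, 0.5))  # (0.5 + 0.25) / (0.5 + 0.5) = 0.75
```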

Properties of Conditional Density#

For any fixed \( x \), \( f_{Y|X}(y|x) \) behaves like a regular probability density function:

  • It is non-negative:

    \[ f_{Y|X}(y|x) \geq 0 \]
  • Its integral over all possible values of \( y \) equals 1:

    \[ \int_{-\infty}^{\infty} f_{Y|X}(y|x) \, dy = 1 \]
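Both properties can be checked numerically. Continuing with the hypothetical density \( f(x,y) = x + y \) on the unit square, for a fixed \( x \) the conditional density \( f_{Y|X}(y|x) = (x+y)/(x+\tfrac{1}{2}) \) is non-negative and integrates to 1 over \( y \in [0, 1] \):

```python
import numpy as np

# Sanity check of the two properties, using the hypothetical density
# f(x, y) = x + y on the unit square.

x = 0.3
y = np.linspace(0.0, 1.0, 100_001)
cond = (x + y) / (x + 0.5)           # f_{Y|X}(y | x) on a grid over [0, 1]

assert np.all(cond >= 0)             # non-negativity

# Average value times interval length; exact up to rounding here
# because the integrand is linear in y.
integral = cond.mean() * (1.0 - 0.0)
print(round(integral, 6))            # 1.0
```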

Relationship to Joint and Marginal Densities#

From the definition of conditional probability, we derive the multiplication rule:

\[ f_{X,Y}(x, y) = f_{Y|X}(y|x) f_X(x) \]
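The multiplication rule has a useful sampling interpretation: drawing \( X \) from its marginal and then \( Y \) from the conditional density given that \( X \) produces a pair distributed according to the joint density. A sketch under the same assumed density \( f(x,y) = x + y \) on the unit square, using inverse-CDF sampling for both stages:

```python
import numpy as np

# Sampling view of the multiplication rule (hypothetical example,
# f(x, y) = x + y on the unit square):
#   1) draw X from the marginal f_X(x) = x + 1/2,
#   2) draw Y from the conditional f_{Y|X}(y|x) = (x + y) / (x + 1/2).
# The resulting pair (X, Y) follows the joint density.

rng = np.random.default_rng(0)
n = 200_000
u1 = rng.random(n)
u2 = rng.random(n)

# Inverse CDF of X:  F_X(x) = x^2/2 + x/2  =>  x = (-1 + sqrt(1 + 8u)) / 2
x = (-1 + np.sqrt(1 + 8 * u1)) / 2

# Inverse conditional CDF:  F_{Y|X}(y|x) = (x y + y^2/2) / (x + 1/2)
#   =>  y = -x + sqrt(x^2 + 2 u (x + 1/2))
y = -x + np.sqrt(x**2 + 2 * u2 * (x + 0.5))

# Exact Pr(X <= 1/2, Y <= 1/2) = double integral of (x + y) over
# [0, 1/2]^2 = 1/8; the Monte Carlo estimate should be close.
est = np.mean((x <= 0.5) & (y <= 0.5))
print(est)  # close to 0.125
```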

Discussion on Joint Probability versus Joint Distribution#

In the theorem on the probability of the union of two events, we introduced the concept of joint probability.

The joint distribution of random variables \( X \) and \( Y \) is directly related to that joint probability \( \Pr(A \cap B) \) of events \( A \) and \( B \).

Relationship Between Events and Random Variables#

  • Events as Random Variable Outcomes:
    Events \( A \) and \( B \) can often be expressed in terms of random variables. For example:

    • Let \( A = \{X \leq x\} \), meaning event \( A \) occurs if the random variable \( X \) takes a value less than or equal to \( x \).

    • Similarly, \( B = \{Y \leq y\} \), meaning event \( B \) occurs if \( Y \) takes a value less than or equal to \( y \).

  • Joint Probability of Events:
    The joint probability of \( A \) and \( B \), \( \Pr(A \cap B) \), is equivalent to the joint cumulative distribution function \( F_{X,Y}(x, y) \) of the random variables \( X \) and \( Y \):

    \[ \Pr(A \cap B) = \Pr(X \leq x, Y \leq y) = F_{X,Y}(x, y). \]

    This represents the probability that \( X \leq x \) and \( Y \leq y \) simultaneously.
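A concrete instance (assumed here purely for illustration): if \( X \) and \( Y \) are independent Uniform(0, 1) variables, their joint CDF on the unit square is \( F_{X,Y}(x,y) = xy \), so the event probability \( \Pr(A \cap B) \) with \( A = \{X \leq 0.6\} \) and \( B = \{Y \leq 0.4\} \) is simply \( 0.6 \times 0.4 \):

```python
# Assumed example: X, Y independent Uniform(0, 1).

def F_joint_uniform(x, y):
    # Joint CDF of two independent Uniform(0, 1) variables,
    # valid for (x, y) in the unit square.
    return x * y

# Pr(A ∩ B) = Pr(X <= 0.6, Y <= 0.4) = F(0.6, 0.4)
print(F_joint_uniform(0.6, 0.4))  # approximately 0.24
```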

Calculating Joint Probability from Frequency#

In experimental or empirical settings, joint probabilities are estimated using relative frequencies. The same principle applies to events \( A \) and \( B \) or their corresponding random variable outcomes:

  • \( n_{A,B} \) counts the number of trials where both \( A \) and \( B \) occur, equivalent to the number of times \( X \leq x \) and \( Y \leq y \) are satisfied.

  • \( n \) is the total number of trials.

The joint probability is approximated as:

\[ \Pr(A \cap B) \approx \frac{n_{A,B}}{n}. \]

As \( n \to \infty \), this relative frequency converges to the true probability:

\[ \Pr(A \cap B) = \lim_{n \to \infty} \frac{n_{A,B}}{n}. \]
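The convergence of the relative frequency can be observed in simulation. A minimal sketch, assuming \( X \) and \( Y \) are independent Uniform(0, 1) variables, so the true value \( \Pr(X \leq 0.6, Y \leq 0.4) = 0.24 \) is known in closed form:

```python
import numpy as np

# Relative-frequency estimate of a joint probability (assumed setup:
# X, Y independent Uniform(0, 1), so Pr(X <= 0.6, Y <= 0.4) = 0.24).

rng = np.random.default_rng(42)

for n in (100, 10_000, 1_000_000):
    x = rng.random(n)
    y = rng.random(n)
    # n_ab counts trials where both A = {X <= 0.6} and B = {Y <= 0.4} occur.
    n_ab = np.count_nonzero((x <= 0.6) & (y <= 0.4))
    print(n, n_ab / n)   # estimates approach 0.24 as n grows
```

Running this shows the estimate tightening around 0.24 as the number of trials increases, in line with the limit above.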