14  Jointly Distributed Random Variables

14.1 Joint Cumulative Distribution Function

Suppose that \(X\) and \(Y\) are two random variables defined on the same sample space \(\Omega\). The joint cumulative distribution function of \(X\) and \(Y\) is the function \[F_{XY}(x,y) = P\{X \le x, Y \le y\} \qquad -\infty < x, y < \infty\] The marginal cdfs of \(X\) and \(Y\) can be obtained from the joint distribution function by letting the other variable tend to infinity: \[\begin{align}F_X(a) &= F_{XY}(a, \infty) \\ F_Y(b) &= F_{XY}(\infty, b)\end{align}\]

## Discrete Case

In the case where \(X\) and \(Y\) are both discrete random variables, it is convenient to define the joint probability mass function of \(X\) and \(Y\) by \[p(x,y) = P\{X=x, Y=y\}\] The marginal probability mass functions of \(X\) and \(Y\) may be obtained from \(p(x,y)\) by summing out the other variable: \[\begin{align}p_X(x) &= \sum_{y:\,p(x,y) > 0} p(x,y) \\ p_Y(y) &= \sum_{x:\,p(x,y) > 0} p(x,y)\end{align}\]

## Continuous Case

We say that \(X\) and \(Y\) are jointly continuous if there exists a function \(f(x,y)\), defined for all real \(x\) and \(y\), having the property that for all sets \(A \subset \mathbb R^2\): \[P\{(X,Y)\in A\} = \iint_{A} f(x,y)\,dx\,dy\] The function \(f(x,y)\) is called the joint probability density function of \(X\) and \(Y\). The marginal densities of \(X\) and \(Y\) can be obtained from the joint density by integrating out the other variable: \[\begin{align}f_X(x) &= \int_{-\infty}^\infty f(x,y)\,dy \\ f_Y(y) &= \int_{-\infty}^\infty f(x,y)\,dx\end{align}\]

## Expectation

If \(X\) and \(Y\) are random variables and \(g\) is a function of two variables, then \[\begin{align} \text E[g(X,Y)]&= \sum_{x}\sum_{y} g(x,y)\,p(x,y) \quad &\text{in the discrete case}\\ &= \int_{-\infty}^\infty \int_{-\infty}^\infty g(x,y)\,f(x,y)\,dx\,dy &\text{in the continuous case}\end{align}\]
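As a quick numerical illustration of the discrete formulas above, here is a minimal Python sketch. The joint pmf values are made up for the example; the functions compute the marginal pmfs and an expectation of the form \(\text E[g(X,Y)]\) exactly as the sums above prescribe.

```python
# Made-up joint pmf for (X, Y), stored as a dict mapping (x, y) -> p(x, y).
# The values must be nonnegative and sum to 1.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal_x(pmf):
    """p_X(x) = sum over y of p(x, y)."""
    out = {}
    for (x, y), p in pmf.items():
        out[x] = out.get(x, 0.0) + p
    return out

def marginal_y(pmf):
    """p_Y(y) = sum over x of p(x, y)."""
    out = {}
    for (x, y), p in pmf.items():
        out[y] = out.get(y, 0.0) + p
    return out

def expectation(pmf, g):
    """E[g(X, Y)] = sum over (x, y) of g(x, y) * p(x, y)."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

print(marginal_x(joint_pmf))                       # {0: 0.3, 1: 0.7}
print(marginal_y(joint_pmf))                       # {0: 0.4, 1: 0.6}
print(expectation(joint_pmf, lambda x, y: x * y))  # E[XY] = 0.4
```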

Corollary: For any constants \(a_1, \dots, a_n\), we have: \[\text E\bigg[\sum_{i=1}^n a_iX_i\bigg] =\sum_{i=1}^n a_i \text E[X_i]\] Note that this linearity of expectation holds whether or not the \(X_i\)'s are independent.
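To see the corollary in action, here is a short sketch continuing the hypothetical `joint_pmf` and `expectation` from the previous example, with \(X_1 = X\), \(X_2 = Y\), \(a_1 = 2\), \(a_2 = 3\). Note that \(X\) and \(Y\) in that pmf are not independent of each other, yet the identity still holds.

```python
# E[2X + 3Y] should equal 2 E[X] + 3 E[Y], with no independence assumption.
e_x = expectation(joint_pmf, lambda x, y: x)  # E[X] = 0.7
e_y = expectation(joint_pmf, lambda x, y: y)  # E[Y] = 0.6
lhs = expectation(joint_pmf, lambda x, y: 2 * x + 3 * y)
rhs = 2 * e_x + 3 * e_y
print(lhs, rhs)  # 3.2 3.2
```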