
Joint distributions of random variables

Definition. The joint cumulative distribution function of random variables \(X\) and \(Y\) is defined as \[F_{X,Y}(s,t)=\mathbb P\left(X\leq s, Y\leq t\right).\]

Definition. The random variables \(X\) and \(Y\) are called jointly continuous random variables with joint probability density function \(f_{X,Y}(u,v)\) if the following equality holds for every pair \((s,t)\) \[F_{X,Y}(s,t)=\int_{-\infty}^{t}\int_{-\infty}^{s}f_{X,Y}(u,v)\,dudv.\]
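To see this definition in action numerically, here is a short sketch for the simplest case of two independent standard normal random variables, where the joint CDF factors as \(F_{X,Y}(s,t)=\Phi(s)\Phi(t)\). The grid bounds and the evaluation point \((s,t)=(0.7,-0.3)\) are illustrative choices, not part of the definition:

```python
import numpy as np
from math import erf, sqrt, pi

def phi2(u, v):
    # joint pdf of two independent standard normals
    return np.exp(-(u**2 + v**2) / 2) / (2 * pi)

def Phi(x):
    # standard normal CDF, expressed via the error function
    return 0.5 * (1 + erf(x / sqrt(2)))

s, t = 0.7, -0.3
# numerical double integral of the joint pdf over (-inf, s] x (-inf, t],
# truncated at -8 where the Gaussian tail is negligible
u = np.linspace(-8, s, 2001)
v = np.linspace(-8, t, 2001)
U, V = np.meshgrid(u, v)
du, dv = u[1] - u[0], v[1] - v[0]
F_num = phi2(U, V).sum() * du * dv

F_exact = Phi(s) * Phi(t)  # independence => the joint CDF factors
print(F_num, F_exact)
```

The two printed numbers agree up to the discretization error of the Riemann sum, illustrating that integrating the joint density recovers the joint CDF.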

Problem 1. Define \(f(x,y)\) by \begin{eqnarray*} f(x,y) = \left\{\begin{array}{ll} C e^{-\lambda y} & \mbox{if } 0 \leq x \leq y \\ 0 & \mbox{otherwise} \end{array}\right. \end{eqnarray*} In other words, \(f(x,y) = C e^{-\lambda y} \cdot 1_{A}(x,y)\), where \(A = \{0\leq x \leq y\}\).
  • (a) Determine the constant \(C\) so that \(f\) is a joint pdf of some bivariate random variable \((X,Y)\).
  • (b) Determine the probability density functions \(f_X\) and \(f_Y\) of random variables \(X\) and \(Y\). (These pdfs are called marginal densities.)
  • (c) Find the probability \(\mathbb P\left(Y \leq 2X\right)\).
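If you want to sanity-check your answer to (c), the following Monte Carlo sketch works, taking \(\lambda=1\) (the value of \(\lambda\) is an assumption; the answer to (c) turns out not to depend on it). It uses a fact that falls out of parts (a) and (b): \(X\) is exponential with rate \(\lambda\), and conditionally on \(X=x\), the increment \(Y-x\) is again exponential with rate \(\lambda\):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 1.0
n = 1_000_000

# Sampling scheme (derivable from the joint density once C is found in (a)):
# X ~ Exp(lam), and conditionally on X = x, Y - x ~ Exp(lam).
X = rng.exponential(1 / lam, size=n)
Y = X + rng.exponential(1 / lam, size=n)

est = np.mean(Y <= 2 * X)
print(est)  # Monte Carlo estimate of P(Y <= 2X)
```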

Normal random variables

Recall that if \(W\sim N\left(\mu,\sigma^2\right)\) is a normal random variable, then its moment generating function satisfies \begin{eqnarray*}m_W(t)=\mathbb E\left[e^{tW}\right]=e^{\mu t+\frac12\sigma^2t^2}.\end{eqnarray*} Conversely, if the moment generating function of a random variable \(Q\) satisfies \[m_Q(t)=e^{\mu t+\frac12\sigma^2t^2},\] then \(Q\) is a normal random variable with expectation \(\mu\) and variance \(\sigma^2\).
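A quick numerical check of this identity (the values \(\mu=1\), \(\sigma=2\), \(t=0.5\) and the grid bounds are illustrative assumptions):

```python
import numpy as np
from math import exp, pi, sqrt

mu, sigma, t = 1.0, 2.0, 0.5   # illustrative values

# numerically integrate e^{tw} times the N(mu, sigma^2) density on a wide grid
w = np.linspace(mu - 40, mu + 40, 400_001)
dens = np.exp(-(w - mu) ** 2 / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
mgf_num = np.sum(np.exp(t * w) * dens) * (w[1] - w[0])

mgf_formula = exp(mu * t + 0.5 * sigma**2 * t**2)
print(mgf_num, mgf_formula)
```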

Using the above, we can solve the following problem.

Problem 2. Assume that \(X\) and \(Y\) are independent standard normal random variables. Prove that \(7X+5Y\) is a normal random variable. Calculate its expected value and variance.
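The MGF argument can be checked by simulation. The sketch below compares the empirical mean, variance, and one tail probability of \(7X+5Y\) with what the MGF computation predicts (the sample size is an arbitrary choice):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
n = 1_000_000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)
S = 7 * X + 5 * Y

# independence gives E[e^{tS}] = e^{49 t^2 / 2} * e^{25 t^2 / 2},
# which is the MGF of N(0, 74)
print(S.mean(), S.var())   # should be near 0 and 49 + 25 = 74

# compare one probability with the N(0, 74) prediction
Phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
p_emp = np.mean(S <= sqrt(74))
print(p_emp, Phi(1.0))
```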

Problem 3. Assume that \(X\) and \(Y\) are independent standard normal random variables. The random variables \(U\) and \(V\) are defined as \begin{eqnarray*} U&=&\alpha X+\beta Y\\ V&=&\beta X+ \gamma Y, \end{eqnarray*} where \(\alpha\), \(\beta\), and \(\gamma\) are real numbers that satisfy \(\alpha\gamma-\beta^2 > 0\).
  • (a) Determine the probability density functions \(f_U\) and \(f_V\) of \(U\) and \(V\).
  • (b) Prove that the covariance matrix of the random vector \((U,V)\) is equal to \(\left[\begin{array}{cc}\alpha&\beta\\ \beta&\gamma\end{array}\right]^2\).
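A simulation sketch for part (b): with \(M=\left[\begin{array}{cc}\alpha&\beta\\ \beta&\gamma\end{array}\right]\), the empirical covariance matrix of \((U,V)\) should approach \(M^2\). The concrete choice \(\alpha=2\), \(\beta=1\), \(\gamma=1\) (so that \(\alpha\gamma-\beta^2=1>0\)) is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, g = 2.0, 1.0, 1.0      # alpha, beta, gamma with a*g - b^2 = 1 > 0
M = np.array([[a, b], [b, g]])

n = 1_000_000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)
U = a * X + b * Y
V = b * X + g * Y

# empirical covariance matrix of (U, V) vs. the claimed M @ M
C_emp = np.cov(U, V)
print(C_emp)
print(M @ M)
```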

Problem 4. Assume that \(X\) and \(Y\) are independent standard normal random variables. Find the joint probability density function \(f_{U,V}(s,t)\) of the random variables \(U\) and \(V\) defined as \begin{eqnarray*} U&=&\alpha X+\beta Y\\ V&=&\beta X+ \gamma Y, \end{eqnarray*} where \(\alpha\), \(\beta\), and \(\gamma\) are real numbers that satisfy \(\alpha\gamma-\beta^2 > 0\).

Prove that the probability density function \(f_{U,V}(s,t)\) can be expressed as \[f_{U,V}(s,t)=\frac1{2\pi\cdot \sqrt{\mbox{det}(\Sigma)} } \cdot e^{-\frac12 \left[\begin{array}{cc} s &t \end{array}\right] \cdot \Sigma^{-1}\cdot \left[\begin{array}{c}s \\ t\end{array}\right]},\] where \(\Sigma\) is the matrix defined as \[\Sigma= \left[\begin{array}{cc}\alpha&\beta\\ \beta&\gamma\end{array}\right]^2.\]
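The claimed formula can be verified pointwise against the change-of-variables computation: writing \(M=\left[\begin{array}{cc}\alpha&\beta\\ \beta&\gamma\end{array}\right]\), the linear map \((x,y)\mapsto(s,t)\) has Jacobian \(\det M=\alpha\gamma-\beta^2>0\), so \(f_{U,V}(s,t)=f_{X,Y}\left(M^{-1}(s,t)^{T}\right)/\det M\). A numerical check of the two expressions (the values of \(\alpha,\beta,\gamma\) and the test points are illustrative assumptions):

```python
import numpy as np
from math import pi

a, b, g = 2.0, 1.0, 1.0              # alpha, beta, gamma with a*g - b^2 > 0
M = np.array([[a, b], [b, g]])
Sigma = M @ M

def f_UV(s, t):
    # the claimed bivariate normal density with covariance matrix Sigma = M^2
    v = np.array([s, t])
    quad = v @ np.linalg.inv(Sigma) @ v
    return np.exp(-0.5 * quad) / (2 * pi * np.sqrt(np.linalg.det(Sigma)))

def f_change_of_vars(s, t):
    # direct change of variables: (x, y) = M^{-1} (s, t),
    # Jacobian of the linear map is det(M) = a*g - b^2 > 0
    x, y = np.linalg.solve(M, np.array([s, t]))
    return np.exp(-0.5 * (x**2 + y**2)) / (2 * pi) / np.linalg.det(M)

for s, t in [(0.0, 0.0), (1.0, -0.5), (2.5, 3.0)]:
    print(f_UV(s, t), f_change_of_vars(s, t))
```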

Definition. Random variables \(U\) and \(V\) are said to have bivariate normal distribution with vector of expectations \(\left[\begin{array}{c}\mu\\ \nu\end{array}\right]\) and covariance matrix \(\Sigma\), if their joint probability density function is \[f_{U,V}(s,t)=\frac1{ 2\pi \cdot \sqrt{\mbox{det}(\Sigma)} }\cdot e^{-\frac12 \left[\begin{array}{cc} s-\mu &t-\nu \end{array}\right] \cdot \Sigma^{-1}\cdot \left[\begin{array}{c}s -\mu\\ t-\nu\end{array}\right]}.\]

The random variables \(U\) and \(V\) from Problem 4 have bivariate normal distribution with expectations \(\left[\begin{array}{c}0\\ 0\end{array}\right]\) and covariance matrix \(\Sigma=\left[\begin{array}{cc}\alpha&\beta\\ \beta&\gamma\end{array}\right]^2\).

Reduction of bivariate normal distribution to independent normal random variables

Theorem. If \(U\) and \(V\) have bivariate normal distribution with expectations \(\left[\begin{array}{c}\mu\\ \nu\end{array}\right]\) and covariance matrix \(\Sigma=\left[\begin{array}{cc}\sigma^2 & \rho\\ \rho & \eta^2 \end{array}\right]\) then there exist independent standard normal random variables \(Z\) and \(W\) and scalars \(p\), \(q\), and \(r\) such that \begin{eqnarray*} U&=&\mu+p Z\\ V&=&\nu+q Z+ r W. \end{eqnarray*}
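The scalars \(p\), \(q\), \(r\) in the theorem form the lower-triangular Cholesky factor of \(\Sigma\): if \(L=\left[\begin{array}{cc}p&0\\ q&r\end{array}\right]\) satisfies \(LL^{T}=\Sigma\), then \(p=\sigma\), \(q=\rho/\sigma\), \(r=\sqrt{\eta^2-\rho^2/\sigma^2}\). A sketch borrowing the covariance matrix from Problem 5 below as an illustrative input:

```python
import numpy as np

sigma2, rho, eta2 = 9.0, 2.0, 1.0    # illustrative entries of Sigma
Sigma = np.array([[sigma2, rho], [rho, eta2]])

# lower-triangular Cholesky factor L with L @ L.T == Sigma
L = np.linalg.cholesky(Sigma)
p, q, r = L[0, 0], L[1, 0], L[1, 1]
print(p, q, r)   # p = sigma, q = rho/sigma, r = sqrt(eta^2 - rho^2/sigma^2)

# check: U = mu + p Z, V = nu + q Z + r W reproduces the covariance matrix
rng = np.random.default_rng(3)
Z = rng.standard_normal(1_000_000)
W = rng.standard_normal(1_000_000)
U = p * Z            # taking mu = nu = 0 for the check
V = q * Z + r * W
print(np.cov(U, V))
```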

Problem 5. Assume that \(X\) and \(Y\) have bivariate normal distribution. Assume that their expectations are \(\mathbb E\left[X\right]=2\) and \(\mathbb E\left[Y\right]=1\) and that the covariance matrix of \((X,Y)\) is \(\left[\begin{array}{cc}9&2\\ 2& 1\end{array}\right]\).
  • (a) Find the real numbers \(\alpha\) and \(\beta\) for which there exists a standard normal random variable \(Z\) such that \(X+3Y=\alpha+\beta Z\).
  • (b) Evaluate \(\mathbb P\left(X+3Y\leq 7\right)\).
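A simulation sketch to check your answers against. The mean and variance of \(X+3Y\) follow from bilinearity of covariance, and the tail probability in (b) can be compared with the normal-CDF prediction (the sample size is an arbitrary choice):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)
mean = [2.0, 1.0]
cov = [[9.0, 2.0], [2.0, 1.0]]
XY = rng.multivariate_normal(mean, cov, size=1_000_000)
S = XY[:, 0] + 3 * XY[:, 1]

# E[S] = E[X] + 3 E[Y]; Var(S) = Var(X) + 9 Var(Y) + 6 Cov(X, Y)
print(S.mean(), S.var())

# compare the empirical probability with the normal prediction
Phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))
p_emp = np.mean(S <= 7)
print(p_emp, Phi((7 - S.mean()) / S.std()))
```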

Problem 6. Assume that \(X\) and \(Y\) have bivariate normal distribution, that \(X\sim N(0,1)\) and \(Y\sim N(0,1)\), and that their covariance is \(\frac12\). Calculate \(\mathbb P\left(X\geq 0, Y\geq 0\right)\).
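A Monte Carlo estimate to compare your closed-form answer against (the sample size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(5)
cov = [[1.0, 0.5], [0.5, 1.0]]   # unit variances, covariance 1/2
XY = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)

# Monte Carlo estimate of the positive-quadrant probability
est = np.mean((XY[:, 0] >= 0) & (XY[:, 1] >= 0))
print(est)
```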