Constructing a bivariate normal from three univariate normals
I'm trying to construct correlated bivariate normal random variables from three univariate normal random variables. I realize there is a formula for constructing a bivariate normal random variable from two univariate normal random variables, but I have reasons for wanting to adjust two previously sampled variables by a third in order to give them a correlation $\rho$.
Based on the approaches for constructing bivariate normals from two univariate normals, I came up with the following construction and would like help verifying its correctness.
First, imagine that we have three univariate normal variables. For simplicity here, we just assume they all have $\sigma=1$.
$$ X_0 \sim \mathrm{Normal}(0, 1) $$
$$ Y_0 \sim \mathrm{Normal}(0, 1) $$
$$ Z \sim \mathrm{Normal}(0, 1) $$
Given these three univariate random variables, I construct two new random variables using the following linear combinations:
$$ X = |\rho| \cdot Z + \sqrt{1-\rho^2} \cdot X_0 $$
$$ Y = \rho \cdot Z + \sqrt{1-\rho^2} \cdot Y_0 $$
where $\rho \in [-1, 1]$ is the desired correlation coefficient between the two new variables.
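In code, my construction looks roughly like this (a minimal NumPy sketch; the sample size is arbitrary, and I only check the same $\rho$ values as in the plots below):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_xy(rho, n=100_000):
    """Draw n samples of (X, Y) via the construction above."""
    x0 = rng.standard_normal(n)  # X_0 ~ Normal(0, 1)
    y0 = rng.standard_normal(n)  # Y_0 ~ Normal(0, 1)
    z = rng.standard_normal(n)   # Z   ~ Normal(0, 1)
    x = abs(rho) * z + np.sqrt(1 - rho**2) * x0
    y = rho * z + np.sqrt(1 - rho**2) * y0
    return x, y

for rho in (0.0, 1.0, -1.0):
    x, y = sample_xy(rho)
    print(rho, np.corrcoef(x, y)[0, 1])  # empirical correlation for each case
```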
Can someone help me formally verify that $X$ and $Y$ are now correlated random variables with correlation $\rho$?
I've convinced myself through empirical simulation. Here are plots of values sampled from $X$ and $Y$ for the cases where $\rho=0$, $\rho=1$, and $\rho=-1$.
[Scatter plots of $X$ vs. $Y$ for $\rho = 0$, $\rho = 1$, and $\rho = -1$.]
These plots are exactly as I would expect, but it would be nice to have a formal proof based on my construction. Thanks in advance!
proof-verification random-variables normal-distribution
asked Jan 5 at 22:36 by Chris MacLellan
Just to clarify, based on the solution below: $X_0$, $Y_0$, and $Z$ are independent.
– Chris MacLellan
Jan 5 at 23:24
1 Answer
I presume $X_0$, $Y_0$, and $Z$ are meant to be independent? Well, if we define
$$ \mathbf U := \begin{bmatrix} X_0 \\ Y_0 \\ Z \end{bmatrix}, \qquad \mathbf V := \begin{bmatrix} X \\ Y \end{bmatrix},$$
then
$$ \mathbf V = \mathbf M \mathbf U,$$
where
$$ \mathbf M := \begin{bmatrix} \sqrt{1-\rho^2} & 0 & |\rho| \\ 0 & \sqrt{1-\rho^2} & \rho \end{bmatrix}.$$
Since $X_0$, $Y_0$ and $Z$ are independent with unit variance, the covariance matrix for $\mathbf U$ is the $3 \times 3$ identity matrix:
$$\mathbb E[\mathbf U \mathbf U^T] = \mathbf I.$$
Hence the covariance matrix for $\mathbf V$ is given by
$$ \mathbb E [\mathbf V \mathbf V^T ] = \mathbf M\, \mathbb E[\mathbf U \mathbf U^T ]\, \mathbf M^T = \mathbf M \mathbf M^T = \begin{bmatrix} 1 & \rho |\rho| \\ \rho |\rho| & 1 \end{bmatrix}.$$
Thus the correlation coefficient between $X$ and $Y$ is
$$ \rho_{X,Y} = \frac{\mathbb E[XY]}{\sqrt{\mathbb E[X^2]\, \mathbb E[Y^2]}} = \frac{\rho |\rho|}{\sqrt{1 \times 1}} = \rho |\rho|.$$
[I used the fact that $\mathbb E[X] = \mathbb E[Y] = 0$ here.]
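(If you want to double-check the matrix algebra above, here is a minimal SymPy sketch; the variable names are mine:)

```python
import sympy as sp

rho = sp.symbols('rho', real=True)

# The mixing matrix M from your construction.
M = sp.Matrix([
    [sp.sqrt(1 - rho**2), 0, sp.Abs(rho)],
    [0, sp.sqrt(1 - rho**2), rho],
])

# Cov(V) = M Cov(U) M^T = M M^T, since Cov(U) = I.
print(sp.simplify(M * M.T))  # off-diagonal entries simplify to rho*Abs(rho), not rho
```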
So I'm afraid the correlation coefficient is not $\rho$. It's $\operatorname{sign}(\rho) \times |\rho|^2$.
But that's easily fixed. If you redefine $\mathbf M$ as
$$ \mathbf M := \begin{bmatrix} \sqrt{1-|\rho|} & 0 & \operatorname{sign}(\rho)\sqrt{|\rho|} \\ 0 & \sqrt{1-|\rho|} & \sqrt{|\rho|} \end{bmatrix},$$
then it should work out.
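As a quick numerical sanity check of this corrected construction (a minimal NumPy sketch; the sample size and the test values of $\rho$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_corrected(rho, n=200_000):
    """Sample (X, Y) using the corrected mixing matrix M."""
    x0, y0, z = rng.standard_normal((3, n))
    x = np.sign(rho) * np.sqrt(abs(rho)) * z + np.sqrt(1 - abs(rho)) * x0
    y = np.sqrt(abs(rho)) * z + np.sqrt(1 - abs(rho)) * y0
    return x, y

for rho in (-0.7, -0.3, 0.2, 0.6):
    x, y = sample_corrected(rho)
    print(rho, round(np.corrcoef(x, y)[0, 1], 3))  # empirical correlation should be close to rho
```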
Finally, I'll point out that the $X$ and $Y$ that you've constructed are guaranteed to be Gaussian, since $\mathbf V$ is related to the Gaussian vector $\mathbf U$ by a linear transformation. They also both have unit variance, as you can see from the diagonal elements of $\mathbb E[\mathbf V \mathbf V^T]$.
answered Jan 5 at 23:04 (edited Jan 5 at 23:11) by Kenny Wong