How to find orthogonal eigenvectors if some of the eigenvalues are the same?
I have an example:
$$A=\begin{pmatrix} 2 & 2 & 4 \\ 2 & 5 & 8 \\ 4 & 8 & 17 \end{pmatrix}$$
The eigenvalues I found are $\lambda_1=\lambda_2=1$ and $\lambda_3=22$.
For $\lambda=1$,
$$\begin{pmatrix} x\\ y \\ z \end{pmatrix}=\begin{pmatrix} -2\\ 1 \\ 0 \end{pmatrix}y+\begin{pmatrix} -4\\ 0 \\ 1 \end{pmatrix}z$$
For $\lambda=22$,
$$\begin{pmatrix} x\\ y \\ z \end{pmatrix}=\begin{pmatrix} 1/4\\ 1/2 \\ 1 \end{pmatrix}z$$
However, the eigenvectors I found are not orthogonal to each other. The goal is to find an orthogonal matrix $P$ and a diagonal matrix $Q$ so that $A=PQP^T$.
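For reference, a quick numerical sanity check (assuming NumPy is available; `numpy.linalg.eigh` is the standard routine for symmetric matrices and returns an orthonormal eigenbasis directly):

```python
import numpy as np

# The matrix from the question.
A = np.array([[2.0, 2.0, 4.0],
              [2.0, 5.0, 8.0],
              [4.0, 8.0, 17.0]])

# eigh is specialized to symmetric matrices: the columns of P are
# orthonormal eigenvectors and eigvals holds the eigenvalues, here ~ [1, 1, 22].
eigvals, P = np.linalg.eigh(A)
Q = np.diag(eigvals)

print(np.allclose(P @ Q @ P.T, A))      # True: A = P Q P^T
print(np.allclose(P.T @ P, np.eye(3)))  # True: P is orthogonal
```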
linear-algebra eigenvalues-eigenvectors
Not every matrix is diagonalizable (I'm responding to your last sentence, last paragraph). – stressed out
@stressedout Yes, I do know that. I mean that in this problem I need to find the corresponding $P$ and $Q$ matrices. – Yibei He
@stressedout: This is a real symmetric matrix. Those are always diagonalizable, and we can always choose orthogonal eigenvectors. – jmerry
@jmerry That's right. I didn't check the matrix to see that it's symmetric. – stressed out
Here's a possible solution: $A$ is symmetric and you have two distinct eigenvalues. So, you get two orthogonal eigenvectors. Since your vectors are $3$-dimensional, get the third one using the cross product. – stressed out
3 Answers
One thing we know is that eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal. So, if we find eigenvectors $v_1,v_2,v_3$ for $\lambda_1<\lambda_2<\lambda_3$, we are done. On the other hand, we have eigenvalues $\lambda_1=\lambda_2=1$ and $\lambda_3=22$, so there are not $3$ distinct eigenvalues and the situation becomes somewhat more complicated.
Suppose we found $v_1,v_2\in E(A,\lambda_1)$ which are linearly independent (and hence a basis for the eigenspace). We know that $v_1\perp v_3$ and $v_2\perp v_3$. This means $\langle v_1,v_3\rangle=\langle v_2,v_3\rangle=0$. By bilinearity of the inner product, we get that $\langle av_1+bv_2,v_3\rangle =0$ for all $a,b\in \mathbb{R}$. The upshot is that the entire eigenspace $E(A,\lambda_1)$ is orthogonal to $v_3$. So, we are free to choose any basis of eigenvectors for $E(A,\lambda_1)$ and proceed from there. Well, just apply Gram-Schmidt to $v_1,v_2$. Define
$$ u_1=\frac{v_1}{\lVert v_1\rVert}$$
$$ u_2=\frac{v_2-\langle v_2, u_1\rangle u_1}{\lVert v_2-\langle v_2, u_1\rangle u_1\rVert}.$$
A quick check shows that these two vectors form an orthonormal basis for $E(A,\lambda_1)$. Then, if we take any nonzero $v_3\in E(A,\lambda_3)$ and set
$$ u_3=\frac{v_3}{\lVert v_3\rVert}$$
we can see that $(u_1,u_2,u_3)$ is an orthonormal eigenbasis of $\mathbb{R}^3\cong E(A,\lambda_1)\oplus E(A,\lambda_3)$ with respect to $A$. You've already found the vectors $v_1,v_2,v_3$. Once you compute $u_1,u_2,u_3$, the matrix $P=[u_1,u_2,u_3]$ with these vectors as its columns is orthogonal and
$$
A=P
\begin{bmatrix}
1&0&0\\
0&1&0\\
0&0&22
\end{bmatrix}
P^T.
$$
– Antonios-Alexandros Robotis
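As a concrete sketch of this recipe (NumPy assumed; $v_1,v_2,v_3$ are the vectors already found in the question):

```python
import numpy as np

A = np.array([[2.0, 2.0, 4.0],
              [2.0, 5.0, 8.0],
              [4.0, 8.0, 17.0]])

# Eigenvectors from the question: v1, v2 span E(A, 1); v3 spans E(A, 22).
v1 = np.array([-2.0, 1.0, 0.0])
v2 = np.array([-4.0, 0.0, 1.0])
v3 = np.array([0.25, 0.5, 1.0])

# Gram-Schmidt on v1, v2, exactly as in the formulas above.
u1 = v1 / np.linalg.norm(v1)
w = v2 - (v2 @ u1) * u1            # remove the component along u1
u2 = w / np.linalg.norm(w)
u3 = v3 / np.linalg.norm(v3)

P = np.column_stack([u1, u2, u3])
D = np.diag([1.0, 1.0, 22.0])
print(np.allclose(P @ D @ P.T, A))  # True
```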
We know that the eigenvectors corresponding to different eigenvalues of a symmetric matrix are orthogonal. You have two different eigenvalues, hence you have two orthogonal eigenvectors $v_1$ and $v_2$. Since your matrix is $3\times 3$, the third vector to form $P=[v_1 \mid v_2 \mid v_3]$ has to be $v_3=\pm v_1\times v_2$. After normalizing the three columns, it is easy to see that $PP^T=I$.
Now just take $Q=\mathrm{diag}(\lambda_1,\lambda_2,\lambda_3)$, with the eigenvalues ordered to match the columns of $P$, and you're done: $A=PQP^T$.
– stressed out
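A minimal sketch of this cross-product shortcut (NumPy assumed; the two eigenvectors are the ones from the question, normalized so that $P$ is orthogonal):

```python
import numpy as np

A = np.array([[2.0, 2.0, 4.0],
              [2.0, 5.0, 8.0],
              [4.0, 8.0, 17.0]])

v1 = np.array([-2.0, 1.0, 0.0])  # eigenvalue 1
v2 = np.array([0.25, 0.5, 1.0])  # eigenvalue 22
v3 = np.cross(v1, v2)            # orthogonal to both, so it lies in E(A, 1)

# Normalize each column so that P P^T = I; eigenvalue order matches Q.
P = np.column_stack([v / np.linalg.norm(v) for v in (v1, v3, v2)])
Q = np.diag([1.0, 1.0, 22.0])
print(np.allclose(P @ Q @ P.T, A))  # True
```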
How about Gram-Schmidt? Since the eigenspace for $\lambda=1$ is $2$-dimensional, it certainly contains two orthonormal eigenvectors.
Project and subtract: $(-4,0,1)-\frac85(-2,1,0)= \left(-\frac45,-\frac85,1\right)$.
Now normalize: $\left(-\frac45,-\frac85,1\right)$ is parallel to $(-4,-8,5)$, which has norm $\sqrt{105}$, so take $\frac1{\sqrt{105}}(-4,-8,5):=b_1$. And $\left(-\frac2{\sqrt5},\frac1{\sqrt5},0\right):=b_2$.
Next, normalize the eigenvector for $\lambda =22$:
$\left(\frac14,\frac12,1\right)$ is parallel to $(1,2,4)$, which has norm $\sqrt{21}$, so take $\frac1{\sqrt{21}}(1,2,4):=b_3$. Conveniently, this one is orthogonal to the others by symmetry of the matrix.
(Alternatively, the cross product would have been a good way to do this as well.)
Finally, the matrix $P$ whose columns are the basis vectors $b_1,b_2,b_3$ above will do the trick: $P^TAP=\begin{pmatrix}1&0&0\\0&1&0\\0&0&22\end{pmatrix}$.
– Chris Custer
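A quick numerical check of this basis (NumPy assumed), using the normalized vectors above:

```python
import numpy as np

A = np.array([[2.0, 2.0, 4.0],
              [2.0, 5.0, 8.0],
              [4.0, 8.0, 17.0]])

b1 = np.array([-4.0, -8.0, 5.0]) / np.sqrt(105)
b2 = np.array([-2.0, 1.0, 0.0]) / np.sqrt(5)
b3 = np.array([1.0, 2.0, 4.0]) / np.sqrt(21)

P = np.column_stack([b1, b2, b3])
print(np.round(P.T @ A @ P, 10))  # diag(1, 1, 22)
```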