Trace of symmetric matrix problems

I have the two problems below from a practice exam. I can prove each of them on its own, but I am not exactly sure if/how to show that they hold only for symmetric matrices and, for 3), only for a matrix whose eigenvalues are all non-negative. I know that if the eigenvalues are all positive then both the determinant and the trace are positive, but I can't see how that affects whether 3) is true or not.

3) Show that $\operatorname{Tr}(A^2) \leq \operatorname{Tr}(A)^2$ holds for any symmetric matrix $A$ whose eigenvalues are all non-negative.

4) Show that $\operatorname{Tr}(AB)^2 \le \operatorname{Tr}(A^2)\operatorname{Tr}(B^2)$ holds for any symmetric matrices $A$ and $B$.

Tags: linear-algebra, matrices, trace

asked Oct 28 '15 at 9:03 by dmnte, edited Oct 29 '15 at 7:14 by thanasissdr
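Neither inequality can be proved by a single worked example, but a quick numerical check is a useful sanity test before attempting a proof. A minimal sketch (using NumPy; the matrix size and seed are arbitrary choices of mine) verifying both inequalities on random symmetric matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem 3): A symmetric with non-negative eigenvalues.
# M @ M.T is always symmetric positive semi-definite.
M = rng.standard_normal((4, 4))
A = M @ M.T
ineq3 = np.trace(A @ A) <= np.trace(A) ** 2

# Problem 4): A, B merely symmetric (symmetrize random matrices).
X = rng.standard_normal((4, 4))
Y = rng.standard_normal((4, 4))
S = (X + X.T) / 2
T = (Y + Y.T) / 2
ineq4 = np.trace(S @ T) ** 2 <= np.trace(S @ S) * np.trace(T @ T)

print(ineq3, ineq4)  # True True
```

This only checks one random instance per inequality; the proofs below explain why it holds in general.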
If $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$, what are the eigenvalues of $A^2$? And how is the trace of a matrix related to the eigenvalues?
– thanasissdr, Oct 28 '15 at 9:08

For both questions I just worked through an example, which came out consistent with the claims. I understand that the eigenvalues of $A^2$ are the eigenvalues of $A$ squared, and that the trace equals the sum of the eigenvalues. For the second question, though, since it is stated for all matrices with non-negative eigenvalues, I wasn't sure whether a single worked example really proves it in all cases, so I had to go about it a different way.
– dmnte, Oct 28 '15 at 12:25

For the fourth question I can't really prove it so much as show that it is true through a worked example. I'm guessing this is wrong, but I'm not sure how else to do it.
– dmnte, Oct 28 '15 at 14:49
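The facts hinted at in the first comment can themselves be checked numerically. A small sketch (random symmetric matrix, arbitrary seed) confirming that the eigenvalues of $A^2$ are the squares of those of $A$, and that the trace is the sum of the eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 3))
A = (X + X.T) / 2                  # symmetric, hence real eigenvalues

lam = np.linalg.eigvalsh(A)        # eigenvalues of A (ascending order)
lam2 = np.linalg.eigvalsh(A @ A)   # eigenvalues of A^2 (ascending order)

# Spectrum of A^2 = squares of the spectrum of A (sort before comparing).
same_spectrum = np.allclose(np.sort(lam ** 2), lam2)
# Trace of A = sum of its eigenvalues.
trace_is_sum = np.isclose(np.trace(A), lam.sum())
print(same_spectrum, trace_is_sum)  # True True
```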
2 Answers
About question 4):

Notation: let $C = AB$, and let $K^2_{ii}$ denote the entry at position $(i,i)$ of the matrix $K^2$.

First, by the symmetry of the $n \times n$ matrices $A, B$, it is easy to prove that
$$c_{ii} \le \left(A^2_{ii}\right)^{1/2} \left(B^2_{ii}\right)^{1/2}, \quad i = 1, \ldots, n. \tag{1}$$

Proof of $(1)$:
$$c_{ii} \overset{\star\star}{=} a_{i1} b_{i1} + \cdots + a_{in} b_{in} = \sum_{j=1}^{n} a_{ij} b_{ij} \overset{\star}{\le} \left(\sum_{j=1}^{n} a_{ij}^2\right)^{1/2} \left(\sum_{j=1}^{n} b_{ij}^2\right)^{1/2} = \left(A^2_{ii}\right)^{1/2} \left(B^2_{ii}\right)^{1/2}. \quad \text{QED}$$

Thus we have
$$\big[\operatorname{trace}(AB)\big]^2 = \left(\sum_{i=1}^n c_{ii}\right)^2 \le \left[\sum_{i=1}^n \left(A^2_{ii}\right)^{1/2} \left(B^2_{ii}\right)^{1/2}\right]^2 \overset{\star}{\le} \sum_{i=1}^n A^2_{ii} \cdot \sum_{i=1}^n B^2_{ii} = \operatorname{trace}(A^2) \cdot \operatorname{trace}(B^2).$$

$^\star$ We have applied the Cauchy-Schwarz inequality.

$^{\star\star}$ In general,
$$c_{ii} = a_{i1} b_{1i} + a_{i2} b_{2i} + \cdots + a_{in} b_{ni},$$
but notice that $b_{k\ell} = b_{\ell k}$ by symmetry.

answered Oct 28 '15 at 19:16, edited Oct 28 '15 at 21:59 by thanasissdr
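Both the entrywise bound $(1)$ and the final bound in this answer can be sanity-checked numerically. A hedged sketch (random symmetric matrices; size, seed, and the small floating-point tolerance are my own choices), a check rather than a proof:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
X, Y = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A, B = (X + X.T) / 2, (Y + Y.T) / 2   # symmetric matrices
C = A @ B

# Inequality (1): c_ii <= (A^2_ii)^(1/2) * (B^2_ii)^(1/2) for every i.
# Diagonal of A^2 is sum_j a_ij^2 >= 0, so the square roots are safe.
A2d = np.diag(A @ A)
B2d = np.diag(B @ B)
ineq1 = np.all(np.diag(C) <= np.sqrt(A2d) * np.sqrt(B2d) + 1e-12)

# Final bound: trace(AB)^2 <= trace(A^2) * trace(B^2).
final = np.trace(C) ** 2 <= np.trace(A @ A) * np.trace(B @ B)
print(ineq1, final)  # True True
```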
$begingroup$
Question No.3 is more related to the fact that for given numbers $x_1,dots,x_n$, the following inequality $$x_1^2+dots+x_n^2leq ,(x_1+dots+x_n)^2$$ holds only if $x_i$ are non-negative. Let $A$ be any diagonalizable matrix so that $A=TLambda T^{-1}$ and $A^2=TLambda^2 T^{-1}$. Thus, if $x_1,dots,x_n$ are the eigenvalues, then $mathrm{trace}(A^2)=x_1^2+dots+x_n^2 $ and $mathrm{trace}(A)^2=(x_1+dots+x_n)^2 $. Note that symmetric matrices are readily diagonalizable since they are normal.
Question No.4 is more related to the fact that trace is an inner product in the space of symmetric matrices. In fact, that inequality you have given is Cauchy-Schwartz indeed.
$endgroup$
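A small check of the two points this answer relies on, with a concrete counterexample for a negative eigenvalue (the example matrix $\operatorname{diag}(1, -1)$ is my own illustration):

```python
import numpy as np

# With a negative eigenvalue, Tr(A^2) <= Tr(A)^2 can fail:
A = np.diag([1.0, -1.0])                     # eigenvalues 1 and -1
print(np.trace(A @ A) <= np.trace(A) ** 2)   # False: 2 <= 0 fails

# For any symmetric S, trace(S^2) equals the sum of squared eigenvalues.
rng = np.random.default_rng(3)
X = rng.standard_normal((4, 4))
S = (X + X.T) / 2
lam = np.linalg.eigvalsh(S)
ok = np.isclose(np.trace(S @ S), (lam ** 2).sum())
print(ok)  # True
```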