Suppose $v, w, v+w$ are all eigenvectors of the linear operator $\phi: V \to V$. Prove that $v$, $w$, $v+w$ all have the same eigenvalue.

The Problem:

Just as it is in the title: Suppose $v, w, v+w$ are all eigenvectors of the linear operator $\phi: V \to V$. Prove that $v$, $w$, and $v+w$ all have the same eigenvalue.

My Approach:

Let $\phi(v) = \alpha v$, $\phi(w) = \beta w$, and $\phi(v+w) = \gamma(v+w)$. We then have that
$$ \phi(v) + \phi(w) = \gamma v + \gamma w \implies \alpha v + \beta w = \gamma v + \gamma w \implies (\alpha - \gamma)v = (\gamma - \beta)w. $$
This means that $v$ and $w$ are scalar multiples of one another; say, $w = \lambda v$...

I feel like this is supposed to tell me something. My thought is that, since $v$ and $w$ are linearly dependent, they must occupy the same eigenspace; but I can't seem to prove that...

linear-algebra eigenvalues-eigenvectors
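Before reading the answers, a quick numeric sanity check of the statement (not a proof) may help build intuition. This sketch uses a hypothetical diagonal matrix with a repeated eigenvalue, so that two independent eigenvectors $v, w$ can share an eigenvalue; the helper `eigenvalue_of` is ad hoc, not a library routine.

```python
import numpy as np

def eigenvalue_of(A, x):
    """Return lam if A @ x == lam * x for some scalar lam, else None."""
    y = A @ x
    i = np.flatnonzero(x)[0]          # first nonzero coordinate of x
    lam = y[i] / x[i]                  # candidate eigenvalue
    return lam if np.allclose(y, lam * x) else None

# phi represented by a matrix with eigenvalue 2 (multiplicity 2) and eigenvalue 5
A = np.diag([2.0, 2.0, 5.0])

v = np.array([1.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0])          # same eigenvalue as v
u = np.array([0.0, 0.0, 1.0])          # different eigenvalue

print(eigenvalue_of(A, v), eigenvalue_of(A, w), eigenvalue_of(A, v + w))  # 2.0 2.0 2.0
print(eigenvalue_of(A, v + u))         # None: v + u is not an eigenvector
```

Consistent with the claim: $v + w$ is an eigenvector exactly when the summands share an eigenvalue, and then it has that same eigenvalue.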
asked Jan 22 at 14:22 by thisisourconcerndude
4 Answers
From $(\alpha - \gamma)v = (\gamma - \beta)w$ you cannot necessarily conclude that $v$ and $w$ are scalar multiples of one another. There are two possibilities:

- If $\alpha - \gamma = 0$, then the LHS is the zero vector. Since $w \neq 0$ (it is an eigenvector), it must also be that $\gamma - \beta = 0$. So $\alpha = \beta = \gamma$.
- If $\alpha - \gamma \neq 0$, then the LHS is not zero, so the RHS is not zero either (in particular $\gamma - \beta \neq 0$). So $v = \frac{\gamma - \beta}{\alpha - \gamma} w = cw$, where $c = \frac{\gamma - \beta}{\alpha - \gamma} \neq 0$. Now rewrite the equation $\phi(v) = \alpha v$ using $v = cw$, and similarly rewrite $\phi(v+w) = \gamma(v+w)$. This will show you $\alpha = \beta = \gamma$.

answered Jan 22 at 14:29 by kccu
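The suggested final step of case 2 can be rendered numerically as a sketch: with $v = cw$ and $\phi$ acting on $\operatorname{span}\{w\}$ as multiplication by $\beta$, comparing coefficients of $w$ forces $\alpha = \beta = \gamma$. The loop below just checks those coefficient identities for random scalar values (an illustration, not a proof).

```python
import random

for _ in range(1000):
    beta = random.uniform(-5.0, 5.0)
    c = random.uniform(0.5, 5.0)              # c != 0 and c != -1 (v + w nonzero)
    # phi(v) = phi(c*w) = c*beta*w must equal alpha*v = alpha*c*w  ->  alpha = beta
    alpha = (c * beta) / c
    # phi(v+w) = (c+1)*beta*w must equal gamma*(v+w) = gamma*(c+1)*w  ->  gamma = beta
    gamma = ((c + 1) * beta) / (c + 1)
    assert abs(alpha - beta) < 1e-9 and abs(gamma - beta) < 1e-9
```

Note the side condition $c \ne -1$: if $c = -1$ then $v + w = 0$, contradicting the assumption that $v + w$ is an eigenvector.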
Apply $\phi$ once more to $(\alpha - \gamma)v = (\gamma - \beta)w$ to obtain
$$(\alpha - \gamma)\alpha v = (\gamma - \beta)\beta w.$$
On the other hand, multiplying the first identity by $\alpha$ yields
$$(\alpha - \gamma)\alpha v = (\gamma - \beta)\alpha w,$$
so
$$(\gamma - \beta)\alpha w = (\gamma - \beta)\beta w \implies (\gamma - \beta)(\alpha - \beta)w = 0.$$
Since $w$ is an eigenvector, we have $w \ne 0$, so $(\gamma - \beta)(\alpha - \beta) = 0$.
Hence $\alpha = \beta$ or $\beta = \gamma$. From either of these it easily follows that $\alpha = \beta = \gamma$.

answered Jan 22 at 14:32 by mechanodroid
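The scalar algebra in the subtraction step above is easy to spot-check: subtracting the two expressions for $(\alpha - \gamma)\alpha v$ leaves $\bigl((\gamma-\beta)\alpha - (\gamma-\beta)\beta\bigr)w$, whose scalar factor is exactly $(\gamma-\beta)(\alpha-\beta)$. A brute-force check over random scalars (a sanity check, not a proof):

```python
import random

for _ in range(1000):
    a, b, g = (random.uniform(-5.0, 5.0) for _ in range(3))
    # coefficient of w after subtracting the two identities
    lhs = (g - b) * a - (g - b) * b
    assert abs(lhs - (g - b) * (a - b)) < 1e-9
```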
If $v, w$ are not linearly independent, the result is trivial: $w = cv$, so $\phi(w) = \phi(cv) = c\phi(v) = c\alpha v = \alpha w$, ...

Otherwise, $(\alpha - \gamma)v + (\beta - \gamma)w = 0$; since $v, w$ are linearly independent, $\alpha = \beta = \gamma$.

answered Jan 22 at 14:25 (edited Jan 22 at 14:26) by Tsemo Aristide

You don't know that $v$ and $w$ are linearly independent.
– kccu, Jan 22 at 14:25
Suppose $\phi(v) = \lambda v$, $\phi(w) = \mu w$, and $\phi(v+w) = \kappa(v+w)$, where $\lambda$, $\mu$, and $\kappa$ are all scalars. Then we have
$$ \lambda v + \mu w = \phi(v + w) = \kappa(v + w). $$
We therefore deduce that
$$ (\kappa - \lambda)v + (\kappa - \mu)w = 0. $$
If $v$ and $w$ are linearly independent, then $\kappa = \lambda = \mu$. If not, then $v$ and $w$ are scalar multiples; in this case they are in the same eigenspace, and the result follows.

answered Jan 22 at 14:34 by ncmathsadist
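The dependent case of the dichotomy above can be illustrated concretely (a sketch with a hypothetical matrix): if $w = cv$ and $\phi(v) = \lambda v$, then $w$ and $v + w$ lie in the same eigenspace as $v$.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 3.0]])       # eigenvalue 3 with eigenvector v = (1, 0)
v = np.array([1.0, 0.0])
w = 2.5 * v                       # w is a scalar multiple of v

for x in (v, w, v + w):
    assert np.allclose(A @ x, 3.0 * x)   # all three share eigenvalue 3
```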