The Importance of Minors
Let $A \in K^{m \times n}$ be a matrix. An $r \times r$ minor of $A$ is defined as the determinant of the $r \times r$ matrix formed by choosing $r$ rows and $r$ columns of the original matrix.
In a linear algebra lecture, my instructor told us that the notion of a minor is very vast and could alone be the subject of a whole course. So far we have only seen one application: $\text{rank}(A)$ is the largest $r$ for which $A$ has a nonzero $r \times r$ minor.
What else is there that makes minors cool/interesting?
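(For concreteness, the rank characterization can be checked directly by brute force. The following is a sketch in Python using SymPy, written by me for illustration; it is vastly slower than Gaussian elimination, but it is faithful to the definition.)

```python
import sympy as sp
from itertools import combinations

def rank_via_minors(A):
    """Largest r such that some r x r minor of A is nonzero (0 for the zero matrix)."""
    m, n = A.shape
    for r in range(min(m, n), 0, -1):
        for rows in combinations(range(m), r):
            for cols in combinations(range(n), r):
                # A.extract picks out the submatrix on the chosen rows and columns
                if A.extract(list(rows), list(cols)).det() != 0:
                    return r
    return 0

A = sp.Matrix([[1, 2, 3], [2, 4, 6], [1, 0, 1]])
assert rank_via_minors(A) == A.rank() == 2  # the 3x3 determinant vanishes, some 2x2 minor does not
```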
linear-algebra determinant
– Kezer (Apr 20 '18 at 17:33): Yeah, I know that. Thanks.
asked Apr 20 '18 at 17:28 by Kezer
3 Answers
Several determinantal identities involve the minors of a matrix.
For a first introduction, see Chapter 5 (the chapter on determinants) in my Notes on the combinatorial fundamentals of algebra, specifically Sections 6.12 (Laplace expansion), 6.15 (the adjugate matrix $\operatorname{adj} A$, whose entries are more or less minors of $A$), 6.19 (Cramer's rule, which involves minors of a rectangular matrix), 6.20 (the Desnanot-Jacobi identity, often ascribed to Lewis Carroll since he made it into a technique for computing determinants), 6.21 (the Plücker relation, in one of its forms), 6.22 (Laplace expansion in several rows/columns), 6.23 (the formula for $\det\left(A+B\right)$ as an alternating sum of minors of $A$ times complementary minors of $B$) and 6.25 (which includes the Jacobi complementary minor theorem). (Numbering of sections may change.)
My notes just scratch the surface; many deeper determinant identities have been known since the 1800s. A particularly significant one is Sylvester's identity, which involves a determinant whose entries are themselves minors of a matrix. See, for example, Anna Karapiperi, Michela Redivo-Zaglia, Maria Rosaria Russo, Generalizations of Sylvester's determinantal identity, arXiv:1503.00519v1, and also Adam Berliner and Richard A. Brualdi, A combinatorial proof of the Dodgson/Muir determinantal identity.
Richard Swan's expository paper On the straightening law for minors of a matrix (arXiv:1605.06696) gives another nice identity between minors of a matrix (Theorem 2.6) and uses it to prove the so-called straightening law for letterplace algebras (these are coordinate rings of matrix spaces, i.e., polynomial rings in $mn$ indeterminates $x_{i,j}$ for $i \leq m$ and $j \leq n$). This straightening law is one of the pillars of characteristic-free invariant theory (i.e., invariant theory of classical groups over arbitrary commutative base rings), as exposed (e.g.) in Chapter 13 of Claudio Procesi's Lie Groups.
Various authors have tried to find "the most general determinantal identity"; the answer, of course, depends heavily on how one formalizes the question. Shreeram S. Abhyankar, Enumerative Combinatorics of Young Tableaux, Dekker 1988, is one attempt at such an identity, I believe. For what I think is meant to be an expository introduction, see Sudhir R. Ghorpade, Abhyankar's work on Young tableaux; I admit I have read neither Abhyankar's book nor this introduction.
The irreducible representations of a symmetric group $S_n$ (over, say, $\mathbb{C}$) are the so-called Specht modules. Nowadays, they are usually defined using Young tableaux, but when they were first defined by Specht in 1935, they were constructed as spans of products of certain minors of a generic matrix (in a letterplace algebra, if you wish). See Remark 2.9 in Mark Wildon, Representation theory of the symmetric group. In a sense, this is not surprising: the column antisymmetrizer in the definition of a Young symmetrizer corresponds to the alternating sum over permutations in the definition of a determinant. This allows for translating various results from the language of Young symmetrizers into the language of identities between matrix minors and back. (For example, the Garnir relations in the former language correspond to the Plücker relations in the latter.) I think this is an aspect of Schur-Weyl duality.
answered Apr 26 '18 at 20:34, edited Jan 10 at 1:57 – darij grinberg
– Kezer (Apr 27 '18 at 14:34): Awesome! Would you say that a few of these (the more advanced ones) are rather unknown to the average mathematician? Very interesting, indeed.
An interesting application that has had a great deal of historical importance in mathematical economics:
Theorem (Gale-Nikaido). If all the principal minors of the Jacobian of $F: \mathbb{R}^n \to \mathbb{R}^n$ are everywhere positive, then $F$ is injective.
Source.
answered Apr 22 '18 at 19:43 – Pete Caradonna (edited Apr 26 '18 at 20:12 by darij grinberg)
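(As a toy illustration of the hypothesis being checked, here is a SymPy sketch; the map $F(x,y) = (x + y^3,\, y)$ is my own hypothetical example, not from the source.)

```python
import sympy as sp
from itertools import combinations

x, y = sp.symbols('x y', real=True)
F = sp.Matrix([x + y**3, y])        # a nonlinear map R^2 -> R^2
J = F.jacobian(sp.Matrix([x, y]))   # Jacobian: [[1, 3*y**2], [0, 1]]

# All principal minors of J: submatrices on the same row and column index sets
minors = [J.extract(list(s), list(s)).det()
          for r in (1, 2) for s in combinations(range(2), r)]
# minors == [1, 1, 1]: every principal minor is positive for all (x, y),
# so the theorem above gives that F is injective.
```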
The following come to mind:
- For starters, there are all the applications listed on Wikipedia.
- You can study the so-called cofactor matrix of a square matrix, which gives you a way to express the inverse of an invertible matrix using only the determinant and the cofactors (i.e. the signed $(n-1)\times(n-1)$ minors of an $n\times n$ matrix).
- Sylvester's criterion is a way to check whether a symmetric matrix is positive (semi)definite by studying certain principal minors.
- The $r$-th coefficient of the characteristic polynomial of a square matrix is, up to sign, the sum of all $r\times r$ principal minors.
- You can use the $m\times m$ minors of a generic $m\times n$ matrix as coordinates (the Plücker coordinates) for the $m$-th Grassmannian of $k^n$. This is the space of all linear subspaces of dimension $m$ of $k^n$, so it only really makes sense for $m\le n$. It is a very interesting variety.
- If you start with an endomorphism of $k^n$, then it induces a map on the exterior power $\bigwedge^r k^n$ for every $r$. This induced linear operator is described by the $r\times r$ minors of the original endomorphism; see this great answer to a related question.
answered Apr 24 '18 at 10:24 – Jesko Hüttenhain
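(The characteristic-polynomial item in this list can be verified directly. A SymPy sketch, with an arbitrary matrix of my choosing; the coefficient of $t^{n-r}$ in $\det(tI - A)$ is $(-1)^r$ times the sum of the $r\times r$ principal minors.)

```python
import sympy as sp
from itertools import combinations

A = sp.Matrix([[2, 1, 0], [1, 3, 1], [0, 1, 4]])
n = A.shape[0]
t = sp.symbols('t')
p = (t * sp.eye(n) - A).det().expand()   # characteristic polynomial of A

for r in range(1, n + 1):
    # e_r: sum of all r x r principal minors of A
    e_r = sum(A.extract(list(s), list(s)).det() for s in combinations(range(n), r))
    assert p.coeff(t, n - r) == (-1)**r * e_r
# Here p == t**3 - 9*t**2 + 24*t - 18, matching e_1 = 9, e_2 = 24, e_3 = 18.
```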
– Kezer (Apr 27 '18 at 14:29): Thank you so much, it's very interesting!