The Importance of Minors


























Let $A \in K^{m \times n}$ be a matrix. An $r \times r$ minor is defined as the determinant of the $r \times r$ matrix formed by choosing $r$ rows and $r$ columns of the original matrix.



In a Linear Algebra lecture, my instructor told us that the notion of a minor is very rich and could alone be the subject of a whole course. So far, we have only seen one application: a way to compute $\operatorname{rank}(A)$ using minors.



What else is there that makes minors cool/interesting?
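To make the rank criterion above concrete: $\operatorname{rank}(A)$ is the largest $r$ for which some $r \times r$ minor of $A$ is nonzero. Here is a small stdlib-only Python sketch of that criterion; the helper names `det`, `minor`, and `rank_via_minors` are my own, not from any library.

```python
from itertools import combinations, permutations

def det(M):
    # Leibniz permutation expansion; fine for the small matrices used here
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    sign = -sign
        term = sign
        for i in range(n):
            term *= M[i][p[i]]
        total += term
    return total

def minor(A, rows, cols):
    # determinant of the submatrix picked out by the given rows and columns
    return det([[A[i][j] for j in cols] for i in rows])

def rank_via_minors(A):
    # rank(A) = largest r such that some r x r minor of A is nonzero
    m, n = len(A), len(A[0])
    for r in range(min(m, n), 0, -1):
        if any(minor(A, rows, cols) != 0
               for rows in combinations(range(m), r)
               for cols in combinations(range(n), r)):
            return r
    return 0

A = [[1, 2, 3],
     [2, 4, 6],   # = 2 * first row, so the 3 x 3 determinant vanishes
     [1, 0, 1]]
print(rank_via_minors(A))  # 2
```

This brute force over all $\binom{m}{r}\binom{n}{r}$ submatrices is only meant to illustrate the definition; in practice one computes rank by Gaussian elimination.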































  • Yeah, I know that. Thanks. – Kezer, Apr 20 '18 at 17:33































linear-algebra determinant






asked Apr 20 '18 at 17:28









Kezer
























































3 Answers
































Several determinantal identities involve the minors of a matrix.



For a first introduction, see Chapter 5 (the chapter on determinants) in my Notes on the combinatorial fundamentals of algebra, specifically Sections 6.12 (Laplace expansion), 6.15 (the adjugate matrix $\operatorname{adj} A$, whose entries are more or less minors of $A$), 6.19 (Cramer's rule, which involves minors of a rectangular matrix), 6.20 (the Desnanot-Jacobi identity, often ascribed to Lewis Carroll since he made it into a technique for computing determinants), 6.21 (the Plücker relation, in one of its forms), 6.22 (Laplace expansion in several rows/columns), 6.23 (the formula for $\det\left(A+B\right)$ as an alternating sum of minors of $A$ times complementary minors of $B$), and 6.25 (which includes the Jacobi complementary minor theorem). (Numbering of sections may change.)



My notes just scratch the surface; many deeper determinant identities have been known since the 1800s. A particularly significant one is Sylvester's identity, which involves a determinant whose entries are themselves minors of a matrix. See, for example, Anna Karapiperi, Michela Redivo-Zaglia, Maria Rosaria Russo, Generalizations of Sylvester's determinantal identity, arXiv:1503.00519v1, and also Adam Berliner and Richard A. Brualdi, A combinatorial proof of the Dodgson/Muir determinantal identity.



Richard Swan's expository paper On the straightening law for minors of a matrix (arXiv:1605.06696) gives another nice identity between minors of a matrix (Theorem 2.6) and uses it to prove the so-called straightening law for letterplace algebras (these are coordinate rings of matrix spaces, i.e., polynomial rings in $mn$ indeterminates $x_{i,j}$ for $i \leq m$ and $j \leq n$). This straightening law is one of the pillars of characteristic-free invariant theory (i.e., invariant theory of classical groups over arbitrary commutative base rings), as exposed (e.g.) in Chapter 13 of Claudio Procesi's Lie Groups.



Various authors have tried to find "the most general determinantal identity"; the answer, of course, depends heavily on how one formalizes the question. Shreeram S. Abhyankar, Enumerative Combinatorics of Young tableaux, Dekker 1988 is one attempt at such an identity, I believe. For what I think is meant to be an expository introduction, see Sudhir R. Ghorpade, Abhyankar's work on Young tableaux; I admit I have read neither Abhyankar's book nor this introduction.



The irreducible representations of a symmetric group $S_n$ (over, say, $\mathbb{C}$) are the so-called Specht modules. Nowadays, they are usually defined using Young tableaux, but when they were first defined by Specht in 1935, they were constructed as spans of products of certain minors of a generic matrix (in a letterplace algebra, if you wish). See Remark 2.9 in Mark Wildon, Representation theory of the symmetric group. In a sense, this is not surprising: the column antisymmetrizer in the definition of a Young symmetrizer corresponds to the alternating sum over permutations in the definition of a determinant. This allows for translating various results from the language of Young symmetrizers into the language of identities between matrix minors and back. (For example, the Garnir relations in the former language correspond to the Plücker relations in the latter.) I think this is an aspect of Schur-Weyl duality.
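As a taste of these identities, the Desnanot-Jacobi identity (the engine behind Dodgson condensation) says that $\det A$ times the determinant of the "interior" of $A$ (first and last rows and columns deleted) equals $\det A_1^1 \det A_n^n - \det A_1^n \det A_n^1$, where $A_i^j$ deletes row $i$ and column $j$. A stdlib-only Python spot-check on one $3 \times 3$ example (helper names are mine):

```python
from itertools import permutations

def det(M):
    # Leibniz permutation expansion; fine for the tiny matrices used here
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    sign = -sign
        term = sign
        for i in range(n):
            term *= M[i][p[i]]
        total += term
    return total

def delete_rc(A, rows, cols):
    # submatrix of A with the given row/column index sets removed
    return [[A[i][j] for j in range(len(A)) if j not in cols]
            for i in range(len(A)) if i not in rows]

A = [[2, 1, 5],
     [3, 4, 1],
     [0, 2, 6]]
n = len(A) - 1  # index of the last row/column
lhs = det(A) * det(delete_rc(A, {0, n}, {0, n}))
rhs = (det(delete_rc(A, {0}, {0})) * det(delete_rc(A, {n}, {n}))
       - det(delete_rc(A, {0}, {n})) * det(delete_rc(A, {n}, {0})))
print(lhs == rhs)  # True
```

One example is of course not a proof; the identity holds for every $n \times n$ matrix over a commutative ring.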






























  • Awesome! Would you say that a few of these (the more advanced ones) are rather unknown to the average mathematician? Very interesting, indeed, though. – Kezer, Apr 27 '18 at 14:34
































An interesting application that has had a great deal of historical importance in mathematical economics:




Theorem (Gale-Nikaido): If all the principal minors of the Jacobian of $F: \mathbb{R}^n \to \mathbb{R}^n$ are positive, then $F$ is injective.




Source.
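The hypothesis of the theorem is easy to test numerically for a concrete map. Below is a small stdlib-only Python sketch; the example map $F(x, y) = (e^x,\, x + y)$ is my own choice for illustration, not one from the answer. Its Jacobian has everywhere-positive principal minors, which the code spot-checks on a grid, so Gale-Nikaido guarantees that $F$ is injective.

```python
import math
from itertools import combinations

def jacobian(x, y):
    # Jacobian of F(x, y) = (exp(x), x + y) -- an example map of my own
    # choosing, used only to illustrate the hypothesis of the theorem
    return [[math.exp(x), 0.0],
            [1.0, 1.0]]

def principal_minors(M):
    # all principal minors (same row and column index set) of a 2 x 2 matrix
    n = len(M)
    out = []
    for r in range(1, n + 1):
        for idx in combinations(range(n), r):
            if r == 1:
                out.append(M[idx[0]][idx[0]])
            else:  # r == 2: hard-coded 2 x 2 determinant
                i, j = idx
                out.append(M[i][i] * M[j][j] - M[i][j] * M[j][i])
    return out

# every principal minor is positive at every sampled point; by the
# Gale-Nikaido theorem this (holding everywhere) makes F injective
ok = all(m > 0
         for x in (-2.0, 0.0, 2.0)
         for y in (-2.0, 0.0, 2.0)
         for m in principal_minors(jacobian(x, y)))
print(ok)  # True
```

Sampling a grid is only a sanity check, not a verification of the hypothesis on all of $\mathbb{R}^2$; for this particular map the minors $e^x$, $1$, and $e^x$ are visibly positive everywhere.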



















































    The following come to mind:




    1. There are all the applications on Wikipedia, for starters:


      • You can study the so-called cofactor matrix of a square matrix, which gives you a way to express the inverse of an invertible matrix using only the determinant and the cofactors (i.e. the $(n-1)\times(n-1)$ minors of an $n\times n$ matrix).


      • Sylvester's criterion is a way to check whether a matrix is positive (semi-)definite by studying certain minors.



    2. The $r$-th coefficient of the characteristic polynomial of a square matrix is given, up to sign, by the sum of all principal $r\times r$ minors.

    3. You can use the $m\times m$ minors of a generic $m\times n$ matrix as coordinates for the $m$-th Grassmannian of $k^n$. This is the space of all linear subspaces of dimension $m$ of $k^n$, so it only really makes sense for $m\le n$. It is a very interesting variety.

    4. If you start with an endomorphism of $k^n$, then it induces a map on the exterior power $\bigwedge^r k^n$ for every $r$. This induced linear operator is described by the $r\times r$ minors of the original endomorphism; see this great answer to a related question.
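Items 1 and 2 lend themselves to a quick numeric spot-check. The stdlib-only Python sketch below (helper names `det`, `adjugate`, `e` are mine) verifies $A \cdot \operatorname{adj}(A) = \det(A)\, I$ on one matrix and checks the principal-minor formula $\det(tI - A) = t^3 - e_1 t^2 + e_2 t - e_3$ at one sample value of $t$, where $e_r$ is the sum of the principal $r \times r$ minors.

```python
from itertools import combinations, permutations

def det(M):
    # Leibniz permutation expansion; fine for 3 x 3 matrices
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    sign = -sign
        term = sign
        for i in range(n):
            term *= M[i][p[i]]
        total += term
    return total

def adjugate(A):
    # transpose of the cofactor matrix; satisfies A * adj(A) = det(A) * I
    n = len(A)
    C = [[(-1) ** (i + j) * det([[A[r][c] for c in range(n) if c != j]
                                 for r in range(n) if r != i])
          for j in range(n)] for i in range(n)]
    return [[C[j][i] for j in range(n)] for i in range(n)]

def e(A, r):
    # sum of all principal r x r minors of A
    n = len(A)
    return sum(det([[A[i][j] for j in idx] for i in idx])
               for idx in combinations(range(n), r))

A = [[1, 2, 0],
     [3, 1, 4],
     [0, 2, 2]]
d = det(A)
adj = adjugate(A)
prod = [[sum(A[i][k] * adj[k][j] for k in range(3)) for j in range(3)]
        for i in range(3)]
print(prod == [[d, 0, 0], [0, d, 0], [0, 0, d]])  # True

# det(tI - A) = t^3 - e_1 t^2 + e_2 t - e_3, spot-checked at t = 5
t = 5
char_lhs = det([[t * (i == j) - A[i][j] for j in range(3)] for i in range(3)])
char_rhs = t ** 3 - e(A, 1) * t ** 2 + e(A, 2) * t - e(A, 3)
print(char_lhs == char_rhs)  # True
```

When $\det(A) \neq 0$, the first identity immediately gives the cofactor formula $A^{-1} = \operatorname{adj}(A)/\det(A)$ from item 1.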




























    • Thank you so much, it's very interesting! – Kezer, Apr 27 '18 at 14:29











darij grinberg's answer: answered Apr 26 '18 at 20:34, edited Jan 10 at 1:57























Pete Caradonna's answer: answered Apr 22 '18 at 19:43, edited Apr 26 '18 at 20:12 by darij grinberg























            3












            $begingroup$

            The following come to mind:




            1. There is all the applications on Wikipedia, for starters:


              • You can study the so-called cofactor matrix of a square matrix, which gives you a way to express the inverse of an invertible matrix using only the determinant and the cofactors (i.e. the $(n-1)times(n-1)$ minors of an $ntimes n$ matrix).


              • Sylvester's criterion is a way to check whether a matrix is positive (semi)-definite by studying certain minors.



            2. The $r$-th coefficient of the characteristic polynomial of a square matrix is given by the sum of all $rtimes r$ minors.

            3. You can use the $mtimes m$ Minors of a generic $mtimes n$ matrix as coordinates for the $m$-th Grassmannian of $k^n$. This is the space of all linear subspaces of dimension $m$ of $k^n$, so it only really makes sense for $mle n$. It is a very interesting variety.

            4. If you start with an endomorphism of $k^n$, then it induces a map on the exterior power $bigwedge^r k^n$ for every $r$. This induced linear operator is described by the $rtimes r$ minors of the original endomorphism, see this great answer to a related question.






            share|cite|improve this answer









            $endgroup$













            • $begingroup$
              Thank you so much, it's very interesting!
              $endgroup$
              – Kezer
              Apr 27 '18 at 14:29
















            3












            $begingroup$

            The following come to mind:




            1. There is all the applications on Wikipedia, for starters:


              • You can study the so-called cofactor matrix of a square matrix, which gives you a way to express the inverse of an invertible matrix using only the determinant and the cofactors (i.e. the $(n-1)times(n-1)$ minors of an $ntimes n$ matrix).


              • Sylvester's criterion is a way to check whether a matrix is positive (semi)-definite by studying certain minors.



            2. The $r$-th coefficient of the characteristic polynomial of a square matrix is given by the sum of all $rtimes r$ minors.

            3. You can use the $mtimes m$ Minors of a generic $mtimes n$ matrix as coordinates for the $m$-th Grassmannian of $k^n$. This is the space of all linear subspaces of dimension $m$ of $k^n$, so it only really makes sense for $mle n$. It is a very interesting variety.

            4. If you start with an endomorphism of $k^n$, then it induces a map on the exterior power $bigwedge^r k^n$ for every $r$. This induced linear operator is described by the $rtimes r$ minors of the original endomorphism, see this great answer to a related question.






            share|cite|improve this answer









            $endgroup$













            • $begingroup$
              Thank you so much, it's very interesting!
              $endgroup$
              – Kezer
              Apr 27 '18 at 14:29














            3












            3








            3















            answered Apr 24 '18 at 10:24









            Jesko Hüttenhain

            10.4k12356





























