Similarity of two tridiagonal matrices














I am considering two complex symmetric tridiagonal matrices. First, $A$ is a tridiagonal matrix whose off-diagonal elements are all identical:



A = $\begin{pmatrix} ig_1 & \kappa & 0 & 0 \\ \kappa & -ig_2 & \kappa & 0 \\ 0 & \kappa & -ig_1 & \kappa \\ 0 & 0 & \kappa & i g_2 \end{pmatrix}$



A second matrix $B$ has non-identical off-diagonal elements and is a function of some control parameters $X = [x_1, x_2, x_3, x_4]^T$:



B(X) = $\begin{pmatrix} ix_1 & \kappa_1 & 0 & 0 \\ \kappa_1 & ix_2 & \kappa_2 & 0 \\ 0 & \kappa_2 & ix_3 & \kappa_3 \\ 0 & 0 & \kappa_3 & ix_4 \end{pmatrix}$



The problem I am trying to solve is finding $X$ such that $B(X) \sim A$.



So far, I have looked into the Jordan decomposition. I used $T_A J_A = A T_A$ and $T_{B^T} J_B = B^T T_{B^T}$ (in which $T$ and $J$ denote the transformation and Jordan matrices). If I write $S = T_A T_{B^T}^T$, the problem reduces to finding:



$X \,|\, S\,B(X) - A\,S = 0,$

which I can solve with a numerical code.
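For concreteness, here is a minimal sketch of that numerical route in Python. Instead of forming the Jordan transformation matrices explicitly (which is numerically delicate), it matches the spectrum of $B(X)$ to that of $A$; this is equivalent to similarity only when $A$ is diagonalizable with distinct eigenvalues. All parameter values and names are illustrative assumptions, not from the original post.

    import numpy as np
    from scipy.optimize import least_squares

    # Illustrative values for the fixed parameters (assumptions).
    g1, g2, kappa = 1.0, 0.5, 0.8
    k1, k2, k3 = 0.7, 0.9, 0.6

    A = np.array([[1j*g1,  kappa,  0,      0],
                  [kappa, -1j*g2,  kappa,  0],
                  [0,      kappa, -1j*g1,  kappa],
                  [0,      0,      kappa,  1j*g2]])

    def B(x):
        # B(X) as defined above, with real control parameters x_1..x_4.
        return np.array([[1j*x[0], k1,      0,       0],
                         [k1,      1j*x[1], k2,      0],
                         [0,       k2,      1j*x[2], k3],
                         [0,       0,       k3,      1j*x[3]]])

    target = np.sort_complex(np.linalg.eigvals(A))

    def residual(x):
        # Sorting complex eigenvalues can jump under perturbation; fine for a
        # sketch, but a robust code should use a stable eigenvalue pairing.
        d = np.sort_complex(np.linalg.eigvals(B(x))) - target
        return np.concatenate([d.real, d.imag])

    sol = least_squares(residual, x0=[0.5, -0.5, 0.5, -0.5])
    print(sol.x, np.abs(residual(sol.x)).max())

If the residual is driven to zero and the eigenvalues are distinct, the recovered $X$ satisfies $B(X) \sim A$.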



What I am looking for is another approach, one from which I could derive an analytical formula for $X$ as a function of the different parameters. Does anyone have something to suggest? Maybe a different type of decomposition, like LU?




















Tags: linear-algebra, matrices






      asked Jan 15 at 7:50









nicolas.bachelard






















          1 Answer



















As the matrices are tridiagonal, it would be a shame to lose this property and attack the problem with the full formalism (like LU). Complex symmetric matrices are not even guaranteed to be diagonalizable, but you can still require the characteristic polynomials of both matrices to be equal as a first necessary condition. Before even computing the polynomials, you can require the traces and determinants of both matrices to match:



$$\operatorname{Tr} A=0$$
$$\det A = g_1^2 g_2^2 - g_1 g_2 \kappa^2 + \kappa^4$$



And finally, the full polynomial. For a tridiagonal matrix, the only permutations that contribute to the determinant are those that displace indices by at most one position, so there are not a lot of options: just (1234), (1243), (1324), (2134), (2143). Additionally, the symmetries of the first matrix give its characteristic polynomial a very special form (which also contains the quick conditions above):



$$\det (A-\lambda I)=(\lambda^2+g_1^2)(\lambda^2+g_2^2)-\kappa^2\left(3\lambda^2-i (g_1+g_2)\lambda+g_1 g_2\right)+\kappa^4=0$$
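As a sanity check, this expansion can be verified symbolically; a small sympy sketch (not part of the original answer):

    import sympy as sp

    g1, g2, k, lam = sp.symbols('g_1 g_2 kappa lambda')
    A = sp.Matrix([[sp.I*g1,  k,        0,        0],
                   [k,       -sp.I*g2,  k,        0],
                   [0,        k,       -sp.I*g1,  k],
                   [0,        0,        k,        sp.I*g2]])
    charpoly = sp.expand((A - lam*sp.eye(4)).det())
    claimed = sp.expand((lam**2 + g1**2)*(lam**2 + g2**2)
                        - k**2*(3*lam**2 - sp.I*(g1 + g2)*lam + g1*g2)
                        + k**4)
    print(sp.simplify(charpoly - claimed))  # prints 0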



For the second matrix, equality of the characteristic polynomials gives you 4 equations to fulfil, so unless some of them happen to be trivially true, you have 4 conditions for 4 parameters, which should determine the system exactly or prove that it is impossible. You immediately see the linear equation from the trace condition (or the absence of the $\lambda^3$ term), and the rest follow:



$$x_1+x_2+x_3+x_4=0$$



$$x_1 x_2 x_3 x_4+\kappa_1^2 x_3 x_4+\kappa_2^2 x_1 x_4+\kappa_3^2 x_1 x_2+\kappa_1^2\kappa_3^2=g_1^2 g_2^2-g_1 g_2 \kappa^2+\kappa^4$$



$$x_1 x_2 x_3+x_1 x_2 x_4+x_1 x_3 x_4+x_2 x_3 x_4+\kappa_1^2(x_3+x_4)+\kappa_2^2 (x_1+x_4)+\kappa_3^2(x_1+x_2)=\kappa^2(g_1+g_2)$$



$$x_1 x_2+x_1 x_3+x_2 x_3+x_1 x_4+x_2 x_4+x_3 x_4+\kappa_1^2+\kappa_2^2+\kappa_3^2=3\kappa^2-g_1^2-g_2^2$$



          The first condition is the most helpful, because squaring it simplifies the last condition:



$$x_1 x_2+x_1 x_3+x_2 x_3+x_1 x_4+x_2 x_4+x_3 x_4=-\frac{1}{2}(x_1^2+x_2^2+x_3^2+x_4^2)=-\frac{\|\vec{x}\|^2}{2}$$



$$-\frac{\|\vec{x}\|^2}{2}=3\kappa^2-\kappa_1^2-\kappa_2^2-\kappa_3^2-g_1^2-g_2^2$$



A similar manipulation of the cubic term gives a similar simplification (when $x_1+x_2+x_3+x_4=0$, Newton's identities reduce the elementary symmetric polynomial to a power sum):
$$x_1 x_2 x_3+x_1 x_2 x_4+x_1 x_3 x_4+x_2 x_3 x_4=\frac{1}{3}(x_1^3+x_2^3+x_3^3+x_4^3)$$



$$\frac{1}{3}(x_1^3+x_2^3+x_3^3+x_4^3)+\kappa_1^2(x_3+x_4)+\kappa_2^2 (x_1+x_4)+\kappa_3^2(x_1+x_2)=\kappa^2(g_1+g_2)$$



          This somewhat simplifies the system of equations.
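In practice, the four conditions can be handed to a root finder as they stand; a short sketch (the parameter values are illustrative assumptions):

    import numpy as np
    from scipy.optimize import fsolve

    g1, g2, kappa = 1.0, 0.5, 0.8       # assumed example values
    k1, k2, k3 = 0.7, 0.9, 0.6

    def conditions(x):
        x1, x2, x3, x4 = x
        e2 = x1*x2 + x1*x3 + x1*x4 + x2*x3 + x2*x4 + x3*x4
        e3 = x1*x2*x3 + x1*x2*x4 + x1*x3*x4 + x2*x3*x4
        e4 = x1*x2*x3*x4
        return [
            x1 + x2 + x3 + x4,                                   # trace
            e2 + k1**2 + k2**2 + k3**2
               - (3*kappa**2 - g1**2 - g2**2),                   # lambda^2 term
            e3 + k1**2*(x3 + x4) + k2**2*(x1 + x4) + k3**2*(x1 + x2)
               - kappa**2*(g1 + g2),                             # lambda term
            e4 + k1**2*x3*x4 + k2**2*x1*x4 + k3**2*x1*x2 + k1**2*k3**2
               - (g1**2*g2**2 - g1*g2*kappa**2 + kappa**4),      # determinant
        ]

    x, info, ok, msg = fsolve(conditions, x0=[1.0, -0.5, -1.0, 0.5],
                              full_output=True)
    print(x, ok == 1)   # ok == 1 means a root was found to tolerance

fsolve works over the reals, which fits the control-parameter interpretation of $X$; complex solutions would need a complex-capable solver.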





In the very special cases where eigenvalues are degenerate, you should additionally check that the Jordan forms are the same (I don't know what happens to the conditions then, i.e. whether they remain independent or not).
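For that degenerate case, the Jordan structure can be compared symbolically; a toy sketch of the relevant sympy call (the matrix here is a placeholder, not $A$):

    import sympy as sp

    M = sp.Matrix([[2, 1, 0],
                   [0, 2, 0],
                   [0, 0, 2]])
    P, J = M.jordan_form()   # J holds the Jordan blocks
    print(J)                 # compare block sizes per eigenvalue for A and B(X)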






answered Jan 15 at 9:23 · edited Jan 15 at 9:40 · orion













• Thanks a lot!!! – nicolas.bachelard, Jan 16 at 8:05










