Efficient SQP with more equality constraints than parameters


























My question is both a math and a (computer) programming one, so answers related to either are fine.



Problem Setup



I have the nonlinear programming problem



$$
\begin{aligned}
\min \;\; &f(X) \\
\text{s.t.} \;\; &g(X) \leq 0 \\
&h(X) = 0
\end{aligned}
$$



where $X \in \mathbb{R}^k$, $g \colon \mathbb{R}^k \rightarrow \mathbb{R}^n$ and $h \colon \mathbb{R}^k \rightarrow \mathbb{R}^m$. Also $m > k$, so I have more equality constraints than parameters. I've already confirmed that SQP is well suited for this problem and would like to continue using it, especially because it provides estimates of the KKT multipliers, which are physically meaningful in my problem. Since there are more equality constraints than parameters, most solvers refuse to solve the problem (exit mode 2). MATLAB's fmincon with algorithm=sqp actually works quite well in this situation. It seems to be using a type of active-set method where it only activates a linearly independent subset of the constraints; I suspect this because the KKT/Lagrange multipliers output by fmincon are 0 for the dependent constraints, meaning they are not active.



Computer Program Solution



I've been searching for NLP solvers that are either written directly in the Python ecosystem or are written in another language but have Python interfaces and wrappers. My work is open source, so closed-source and paid solutions aren't suitable. If anybody can bring an NLP solver that can handle this type of problem to my attention, that would work.



Mathematical Solution



Since I've pretty much convinced myself that this problem is primarily due to a gap in the NLP solvers available in Python, I've begun to write my own. At a high level, here's the strategy:




  1. Form the Lagrangian for the problem to introduce the KKT multipliers.

  2. Linearize the problem to create a QP subproblem (sketched below, after this list).

  3. Perform an LU decomposition on the resulting set of linear equality constraints and remove any dependent constraints.

  4. Solve the QP subproblem using a QP solver or some other convex strategy.

  5. Reconstruct the KKT/Lagrange multipliers that were removed in step 3.

  6. Repeat from step 1 using the current KKT/Lagrange estimates until convergence.
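
For concreteness, the QP subproblem formed in step 2 has the standard SQP form (with $d$ the step and $B_k$ an approximation of the Hessian of the Lagrangian at the current iterate $X_k$; this is just the usual textbook statement, e.g. Nocedal and Wright, Chapter 18):

$$
\begin{aligned}
\min_{d} \;\; &\tfrac{1}{2}\, d^{T} B_k\, d + \nabla f(X_k)^{T} d \\
\text{s.t.} \;\; &\nabla g(X_k)\, d + g(X_k) \leq 0 \\
&\nabla h(X_k)\, d + h(X_k) = 0
\end{aligned}
$$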


This strategy actually solves my problem quite well, but it is EXTREMELY SLOW for medium-to-large problems (more than about 30 parameters). I'm using LU decomposition because I'm led to believe it's a fairly efficient algorithm for Gaussian elimination. I can't help but feel there is a more efficient way to handle cases where there are more constraints than parameters, especially since MATLAB's sqp seems to do it well. If anybody has recommendations on how I could change my procedure at a high level, that would be greatly appreciated.
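
One direction I'm considering is replacing the LU step with a rank-revealing QR factorization with column pivoting applied to the transpose of the linearized equality-constraint matrix, keeping only the rows selected as pivots. Below is a minimal sketch using SciPy; the function name, tolerance heuristic, and toy matrix are just illustrative, not my actual implementation:

```python
import numpy as np
from scipy.linalg import qr

def independent_rows(A, tol=None):
    """Return indices of a linearly independent subset of the rows of A.

    Uses QR with column pivoting on A.T (a rank-revealing factorization);
    the pivoted columns of A.T correspond to independent rows of A.
    """
    Q, R, piv = qr(A.T, mode='economic', pivoting=True)
    diag = np.abs(np.diag(R))
    if tol is None:
        # Heuristic tolerance, similar in spirit to numpy.linalg.matrix_rank's default.
        tol = diag.max() * max(A.shape) * np.finfo(A.dtype).eps
    rank = int(np.sum(diag > tol))
    return np.sort(piv[:rank])

# Toy example: 5 equality constraints in 3 parameters, two rows dependent.
A = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [1., 1., 0.],   # row 0 + row 1 (dependent)
              [0., 0., 1.],
              [2., 0., 0.]])  # 2 * row 0 (dependent)
keep = independent_rows(A)
print(keep)  # indices of 3 independent rows; the reduced system is A[keep] d = b[keep]
```

In this form only one factorization of the $m \times k$ Jacobian is needed per SQP iteration, and the multipliers for the dropped rows can simply be set to 0 in step 5, which seems consistent with what fmincon reports for the dependent constraints.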










optimization nonlinear-optimization python programming

asked Jan 20 at 19:29
Michael Sparapany





















  • I’m not sure it would be powerful enough, but have you tried SciPy’s scipy.optimize.minimize() function? Also, for your homebrew method, is the LU the bottleneck in computation time?
    – David M.
    Jan 24 at 4:47










  • Yup! minimize with slsqp uses Algorithm 733, which throws the exit mode 2 error. In my homebrew method, LU is the bottleneck. I'm using SciPy's LU, which I believe calls the high-performance FORTRAN routine. I think I need a completely different method for Gaussian elimination. Though I'm not familiar with the operations involved, separating the original matrix into lower and upper matrices seems too powerful: I end up multiplying the lower by the upper matrix after LU anyways.
    – Michael Sparapany
    Jan 24 at 17:14










  • Hmm, I'm not sure; my reference for this is Chapter 18 of Nocedal and Wright's book "Numerical Optimization". A free PDF is available from the publisher. Chapter 18 covers the finer points of SQP.
    – David M.
    Jan 25 at 21:19












  • Thanks for the source. Nothing is jumping out at me at the moment. Right now I'm doing the LU decomposition in the larger SQP loop, outside of the QP subproblem. Do you think it might be a better idea to pass my dependent constraints INTO the QP subproblem and let the QP solver handle the dependencies? The QP solver I'm using, cvxopt, can't handle this scenario, but in my mind this might actually be a better solution.
    – Michael Sparapany
    Jan 28 at 1:07















