Orthogonal Projection onto the Weighted $ {L}_{2} $ Norm Ball
Projection onto the $\ell_2$ norm ball is well known. Let the norm ball be the set
$C = \left\{ x \in \mathbb{R}^n : \left\| x - c \right\|_2^2 \leq d^2 \right\}$, where $c \in \mathbb{R}^n$ and $d \in \mathbb{R}$. Then the projection onto this set is
\begin{align}
\Pi_{C} \left( y \right) = c + \frac{d}{\max \left\{ \left\| y - c \right\|_2, d \right\}} \left( y - c \right) .
\end{align}
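For reference, this unweighted projection is a one-liner; a minimal NumPy sketch (the function name is illustrative):

```python
import numpy as np

def project_l2_ball(y, c, d):
    """Project y onto {x : ||x - c||_2 <= d} using the formula above."""
    r = y - c
    # max(...) leaves interior points untouched and rescales exterior ones.
    return c + (d / max(np.linalg.norm(r), d)) * r
```

Points already inside the ball are returned unchanged, since then $\max\{\|y-c\|_2, d\} = d$.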




How can one obtain the projection onto the weighted $\ell_2$ norm ball, i.e., onto the set $C = \left\{ x \in \mathbb{R}^n : \left\| x - c \right\|_W^2 \leq d^2 \right\}$, where $W$ is a positive definite matrix?






Note: There is an existing attempt for the weighted $\ell_2$ norm projection, but it does not use the same problem definition (or perhaps I cannot yet see how to adapt that answer).





My partial attempt:



Following Brian's comments, I have made an attempt, but I am stuck and unable to compute the Lagrange multiplier.



The weighted $\ell_2$ projection problem can be expressed as
\begin{align}
\text{minimize}_{x \in \mathbb{R}^n} \quad & \left\| y - x \right\|_2^2 \\
\text{subject to} \quad & \left\| x - c \right\|_W^2 \leq d^2
\end{align}



Forming the Lagrangian:
\begin{align*}
L\left(x, \lambda\right)
&= \left\| y - x \right\|_2^2 + \lambda \left( (x-c)^T W (x-c) - d^2 \right) .
\end{align*}



Then, by the stationarity condition of the KKT conditions,
\begin{align*}
\frac{\partial L}{\partial x} = 0
&\Rightarrow -2(y - x) + 2\lambda W (x-c) = 0 \\
&\Leftrightarrow x = \left( I + \lambda W \right)^{-1} \left( \lambda W c + y \right).
\end{align*}



Now I am stuck and do not know how to obtain $\lambda$ in closed form. I thought about using the complementary slackness condition of the KKT conditions, but then it becomes too involved for me.
Say $\lambda > 0$; then
\begin{align}
(x-c)^T W (x-c) = d^2 .
\end{align}

If I plug $x = \left( I + \lambda W \right)^{-1} \left( \lambda W c + y \right)$ into the above equation, I do not know how to solve for $\lambda$. If we cannot find it in closed form, how can we obtain it iteratively? And if $W$ is a diagonal matrix, can we then obtain $\lambda$ in closed form?
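One workable route (a sketch, not from the original post, assuming $W \succ 0$): along the curve $x(\lambda) = (I + \lambda W)^{-1}(\lambda W c + y)$, the constraint value $(x(\lambda)-c)^T W (x(\lambda)-c)$ decreases monotonically in $\lambda \geq 0$, so the complementary slackness equation can be solved by bisection:

```python
import numpy as np

def project_weighted_ball(y, c, W, d, tol=1e-12, max_iter=200):
    """Project y onto {x : (x - c)^T W (x - c) <= d^2}, W symmetric PD.
    Uses x(lam) = (I + lam W)^{-1}(lam W c + y) and bisects on lam."""
    if (y - c) @ W @ (y - c) <= d**2:
        return y  # already feasible, lambda = 0

    I = np.eye(len(y))

    def gap(lam):  # constraint residual at x(lam)
        x = np.linalg.solve(I + lam * W, lam * (W @ c) + y)
        return (x - c) @ W @ (x - c) - d**2

    lo, hi = 0.0, 1.0
    while gap(hi) > 0:       # grow the bracket until x(hi) is feasible
        hi *= 2.0
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if gap(mid) > 0 else (lo, mid)
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)
    return np.linalg.solve(I + lam * W, lam * (W @ c) + y)
```

Each bisection step costs one linear solve; Newton's method on the same scalar equation would converge faster but needs the derivative.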





Further attempt: If $W$ is a diagonal matrix, then, following the link, the $i$th coordinate of $x$ can be written as
\begin{align}
x_i = \frac{\lambda W_i c_i + y_i}{1 + \lambda W_i} .
\end{align}



Now, plugging into the complementary slackness condition:
\begin{align}
(x-c)^T W (x-c) &= \| W^{1/2} (x-c) \|^2 = d^2 \\
&\Leftrightarrow \sqrt{W_i} (x_i - c_i) = \pm d \\
&\Leftrightarrow \lambda = \frac{\pm 1}{d \sqrt{W_i}} \left( y_i - c_i \right) - \frac{1}{W_i} .
\end{align}

Is this correct in the case where $W$ is a diagonal matrix?



Question: Would $\lambda$ be different for different coordinates of the vector? Should it not be a single constant for all coordinates?
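For what it's worth, a quick numerical check (with made-up data) suggests the per-coordinate equation above cannot hold for every $i$ at once: the resulting $\lambda$ values differ across coordinates. The single $\lambda$ instead solves the summed constraint $\sum_i W_i (y_i - c_i)^2 / (1 + \lambda W_i)^2 = d^2$, e.g. by bisection:

```python
import numpy as np

# Illustrative data: W diagonal, w holds the diagonal entries W_i.
w = np.array([1.0, 4.0])
c = np.zeros(2)
y = np.array([3.0, 2.0])
d = 1.0

# Per-coordinate lambdas from the formula above (taking the + sign):
per_coord = (y - c) / (d * np.sqrt(w)) - 1.0 / w
# per_coord has two different values, so sqrt(W_i)(x_i - c_i) = +/- d
# cannot hold for every coordinate simultaneously.

# The single lambda solves the SUM constraint instead:
#   g(lam) = sum_i W_i (y_i - c_i)^2 / (1 + lam W_i)^2 - d^2 = 0
def g(lam):
    return np.sum(w * (y - c) ** 2 / (1.0 + lam * w) ** 2) - d**2

lo, hi = 0.0, 1.0
while g(hi) > 0:
    hi *= 2.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if g(mid) > 0 else (lo, mid)
lam = 0.5 * (lo + hi)
x = (lam * w * c + y) / (1.0 + lam * w)  # coordinate formula from above
```

After the loop, $x$ lies on the boundary of the ball up to the bisection tolerance.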
  • It's relatively easy to set this up as a constrained optimization problem and apply the Lagrange multiplier method. If your $W$ matrix is diagonal, this simplifies to the case discussed in the question you linked to. Is your $W$ diagonal? If not, the problem is still relatively easy to solve by iterative schemes.
    – Brian Borchers, Jan 20 at 3:26

  • Thank you, Brian. I have made an attempt based on your suggestion (please see above), but I don't know how to obtain $\lambda$ in closed form even if $W$ is a diagonal matrix. Can you help me solve for $\lambda$? Any reference?
    – user550103, Jan 20 at 8:05

  • Have you thought about what the matrix is doing to the space? Specifically to the ball?
    – Royi, Jan 24 at 14:19

  • @Royi my impression is that the matrix $W$ gives different weights to the respective coordinates, so the norm ball may look non-spherical depending on the chosen weights. Please correct me if my impression is wrong.
    – user550103, Jan 24 at 14:38

  • PSD matrices have special shapes in space. I think that might assist you with the solution.
    – Royi, Jan 24 at 14:51
optimization convex-optimization nonlinear-optimization
edited Feb 2 at 20:05
Royi
asked Jan 19 at 14:25
user550103
1 Answer
The optimization problem is given by:



$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} \left\| x - y \right\|_{2}^{2} \\
\text{subject to} \quad & \left( x - c \right)^{T} W \left( x - c \right) \leq d
\end{align*}$$



Case I - Diagonal Matrix



In this case the matrix $ W $ is diagonal with $ w_{ii} \geq 0 $.



Let's assume we know how to solve this.



Remark

I could find a simple iterative method to find $ \lambda $ for this case, but not a closed-form solution, though I'd guess one is attainable.



Case II - Positive Definite Matrix



In this case the matrix $ W $ is a Positive Semi-Definite (PSD) matrix: $ W \succeq 0 $.



Since $ W $ is a PSD matrix, it can be written as (eigendecomposition):

$$ W = {P}^{T} D P $$

where $ P $ is a unitary matrix and $ D $ is a diagonal matrix with $ {d}_{ii} \geq 0 $.



Then one could rewrite the problem as:

$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} \left\| x - y \right\|_{2}^{2} \\
\text{subject to} \quad & \left( x - c \right)^{T} {P}^{T} D P \left( x - c \right) \leq d
\end{align*}$$



Defining $ v = P x $, $ z = P y $, and $ e = P c $, one could rewrite the problem as:

$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} \left\| x - y \right\|_{2}^{2} \\
\text{subject to} \quad & \left( v - e \right)^{T} D \left( v - e \right) \leq d
\end{align*}$$



Now, since $ P $ is a unitary matrix, it preserves the $ {L}_{2} $ norm and is invertible, so $ \frac{1}{2} \left\| x - y \right\|_{2}^{2} = \frac{1}{2} \left\| P \left( x - y \right) \right\|_{2}^{2} = \frac{1}{2} \left\| v - z \right\|_{2}^{2} $, and the whole problem becomes:



$$\begin{align*}
\arg \min_{v} \quad & \frac{1}{2} \left\| v - z \right\|_{2}^{2} \\
\text{subject to} \quad & \left( v - e \right)^{T} D \left( v - e \right) \leq d
\end{align*}$$



where $ D $ is a diagonal matrix, which is exactly Case I, which we assumed we know how to solve.
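The reduction above can be sketched in code; this is an illustrative NumPy translation (not the MATLAB code referenced below), with the Case I subroutine done by bisection on $\lambda$:

```python
import numpy as np

def project_diag_ball(z, e, w, d, iters=200):
    """Case I: project z onto {v : (v - e)^T diag(w) (v - e) <= d} by
    bisection on the Lagrange multiplier lam."""
    if np.sum(w * (z - e) ** 2) <= d:
        return z  # already feasible

    def g(lam):  # constraint residual at v(lam)
        v = (lam * w * e + z) / (1.0 + lam * w)
        return np.sum(w * (v - e) ** 2) - d

    lo, hi = 0.0, 1.0
    while g(hi) > 0:
        hi *= 2.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if g(mid) > 0 else (lo, mid)
    lam = 0.5 * (lo + hi)
    return (lam * w * e + z) / (1.0 + lam * w)

def project_weighted_ball_via_eig(y, c, W, d):
    """Case II: reduce to Case I via the eigendecomposition W = P^T D P."""
    w, V = np.linalg.eigh(W)  # W = V diag(w) V^T, so take P = V^T
    P = V.T
    v = project_diag_ball(P @ y, P @ c, w, d)
    return P.T @ v            # rotate back: x = P^T v
```

Note the constraint here uses $d$ on the right-hand side, matching the formulation above; pass $d^2$ if your radius is squared.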



Code



I created MATLAB code to solve the (general) problem.

The code is available at my StackExchange Mathematics Q3079400 GitHub repository.
  • This looks good, Royi. I agree that we need to know if a closed form exists for the diagonal case. +1
    – user550103, Feb 3 at 12:53

  • As I wrote, it is easy to solve with a few iterations of binary search. The question is whether that is what you want. I hope @BrianBorchers has a closed-form solution for that.
    – Royi, Feb 3 at 16:04











Your Answer





StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");

StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);

StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});

function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});


}
});














draft saved

draft discarded


















StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3079400%2forthogonal-projection-onto-the-weighted-l-2-norm-ball%23new-answer', 'question_page');
}
);

Post as a guest















Required, but never shown

























1 Answer
1






active

oldest

votes








1 Answer
1






active

oldest

votes









active

oldest

votes






active

oldest

votes









1












$begingroup$

The objective function is given by:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( x - c right)}^{T} W left( x - c right) leq d
end{align*}$$



Case I - Diagonal Matrix



In this case the matrix $ W $ is diagonal with $ {w}_{ii} geq 0 $.



Let's assume we know how to solve this.



Remark

I could find a simple iterative method to find $ lambda $ for this case but not a closed form solution. Though I'd guess it is doable.



Case II - Positive Definite Matrix



In this case the matrix $ W $ a Positive Semi Definite (PSD) Matrix - $ W succeq 0 $.



Since $ W $ is a PSD matrix it can be written as (Eigen Decomposition):



$$ W = {P}^{T} D P $$



Where $ P $ is Unitary Matrix and $ D $ is Diagonal Matrix with $ {d}_{ii} geq 0 $.



Then one could rewrite the problem as:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( x - c right)}^{T} {P}^{T} D P left( x - c right) leq d
end{align*}$$



Defining $ v = P x $, $ z = P y $ and $ e = P c $ one could rewrite the problem as:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( v - e right)}^{T} D left( v - e right) leq d
end{align*}$$



Now, since $ P $ is Unitary Matrix which means it preserves the $ {L}_{2} $ norm and is invertible then $ frac{1}{2} {left| x - y right|}_{2}^{2} = frac{1}{2} {left| P left( x - y right) right|}_{2}^{2} = frac{1}{2} {left| v - z right|}_{2}^{2} $ and the whole problem becomes:



$$begin{align*}
arg min_{v} quad & frac{1}{2} {left| v - z right|}_{2}^{2} & text{} \
text{subject to} quad & {left( v - e right)}^{T} D left( v - e right) leq d
end{align*}$$



Where $ D $ is Diagonal Matrix which is exactly Case I we assume we know how to solve.



Code



I created a MATLAB code to solve the problem (The general).

The code is available at my StackExchange Mathematics Q3079400 GitHub Repository.






share|cite|improve this answer











$endgroup$













  • $begingroup$
    This looks good, Royi. I agree that we need to know if a closed form exists for the diagonal case. +1
    $endgroup$
    – user550103
    Feb 3 at 12:53










  • $begingroup$
    As I wrote, it is easy to solve with few iterations of Binary Search. The question is if that what you want? I hope @BrianBorchers have a closed form solution for that.
    $endgroup$
    – Royi
    Feb 3 at 16:04
















1












$begingroup$

The objective function is given by:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( x - c right)}^{T} W left( x - c right) leq d
end{align*}$$



Case I - Diagonal Matrix



In this case the matrix $ W $ is diagonal with $ {w}_{ii} geq 0 $.



Let's assume we know how to solve this.



Remark

I could find a simple iterative method to find $ lambda $ for this case but not a closed form solution. Though I'd guess it is doable.



Case II - Positive Definite Matrix



In this case the matrix $ W $ a Positive Semi Definite (PSD) Matrix - $ W succeq 0 $.



Since $ W $ is a PSD matrix it can be written as (Eigen Decomposition):



$$ W = {P}^{T} D P $$



Where $ P $ is Unitary Matrix and $ D $ is Diagonal Matrix with $ {d}_{ii} geq 0 $.



Then one could rewrite the problem as:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( x - c right)}^{T} {P}^{T} D P left( x - c right) leq d
end{align*}$$



Defining $ v = P x $, $ z = P y $ and $ e = P c $ one could rewrite the problem as:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( v - e right)}^{T} D left( v - e right) leq d
end{align*}$$



Now, since $ P $ is Unitary Matrix which means it preserves the $ {L}_{2} $ norm and is invertible then $ frac{1}{2} {left| x - y right|}_{2}^{2} = frac{1}{2} {left| P left( x - y right) right|}_{2}^{2} = frac{1}{2} {left| v - z right|}_{2}^{2} $ and the whole problem becomes:



$$begin{align*}
arg min_{v} quad & frac{1}{2} {left| v - z right|}_{2}^{2} & text{} \
text{subject to} quad & {left( v - e right)}^{T} D left( v - e right) leq d
end{align*}$$



Where $ D $ is Diagonal Matrix which is exactly Case I we assume we know how to solve.



Code



I created a MATLAB code to solve the problem (The general).

The code is available at my StackExchange Mathematics Q3079400 GitHub Repository.






share|cite|improve this answer











$endgroup$













  • $begingroup$
    This looks good, Royi. I agree that we need to know if a closed form exists for the diagonal case. +1
    $endgroup$
    – user550103
    Feb 3 at 12:53










  • $begingroup$
    As I wrote, it is easy to solve with few iterations of Binary Search. The question is if that what you want? I hope @BrianBorchers have a closed form solution for that.
    $endgroup$
    – Royi
    Feb 3 at 16:04














1












1








1





$begingroup$

The objective function is given by:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( x - c right)}^{T} W left( x - c right) leq d
end{align*}$$



Case I - Diagonal Matrix



In this case the matrix $ W $ is diagonal with $ {w}_{ii} geq 0 $.



Let's assume we know how to solve this.



Remark

I could find a simple iterative method to find $ lambda $ for this case but not a closed form solution. Though I'd guess it is doable.



Case II - Positive Definite Matrix



In this case the matrix $ W $ a Positive Semi Definite (PSD) Matrix - $ W succeq 0 $.



Since $ W $ is a PSD matrix it can be written as (Eigen Decomposition):



$$ W = {P}^{T} D P $$



Where $ P $ is Unitary Matrix and $ D $ is Diagonal Matrix with $ {d}_{ii} geq 0 $.



Then one could rewrite the problem as:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( x - c right)}^{T} {P}^{T} D P left( x - c right) leq d
end{align*}$$



Defining $ v = P x $, $ z = P y $ and $ e = P c $ one could rewrite the problem as:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( v - e right)}^{T} D left( v - e right) leq d
end{align*}$$



Now, since $ P $ is Unitary Matrix which means it preserves the $ {L}_{2} $ norm and is invertible then $ frac{1}{2} {left| x - y right|}_{2}^{2} = frac{1}{2} {left| P left( x - y right) right|}_{2}^{2} = frac{1}{2} {left| v - z right|}_{2}^{2} $ and the whole problem becomes:



$$begin{align*}
arg min_{v} quad & frac{1}{2} {left| v - z right|}_{2}^{2} & text{} \
text{subject to} quad & {left( v - e right)}^{T} D left( v - e right) leq d
end{align*}$$



Where $ D $ is Diagonal Matrix which is exactly Case I we assume we know how to solve.



Code



I created a MATLAB code to solve the problem (The general).

The code is available at my StackExchange Mathematics Q3079400 GitHub Repository.






share|cite|improve this answer











$endgroup$



The objective function is given by:



$$begin{align*}
arg min_{x} quad & frac{1}{2} {left| x - y right|}_{2}^{2} & text{} \
text{subject to} quad & {left( x - c right)}^{T} W left( x - c right) leq d
end{align*}$$



Case I - Diagonal Matrix



In this case the matrix $ W $ is diagonal with $ {w}_{ii} geq 0 $.



Let's assume we know how to solve this.



Remark

I could find a simple iterative method to find $ lambda $ for this case but not a closed form solution. Though I'd guess it is doable.



Case II - Positive Definite Matrix



In this case the matrix $ W $ is a symmetric positive definite matrix: $ W \succ 0 $.



Since $ W $ is symmetric positive definite, it admits an eigendecomposition:



$$ W = {P}^{T} D P $$



where $ P $ is an orthogonal matrix and $ D $ is a diagonal matrix with $ {d}_{ii} > 0 $.



Then one could rewrite the problem as:



$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} \\
\text{subject to} \quad & {\left( x - c \right)}^{T} {P}^{T} D P \left( x - c \right) \leq {d}^{2}
\end{align*}$$



Defining $ v = P x $, $ z = P y $, and $ e = P c $, one can rewrite the problem as:



$$\begin{align*}
\arg \min_{x} \quad & \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} \\
\text{subject to} \quad & {\left( v - e \right)}^{T} D \left( v - e \right) \leq {d}^{2}
\end{align*}$$



Now, since $ P $ is an orthogonal matrix, it preserves the $ {L}_{2} $ norm and is invertible, hence $ \frac{1}{2} {\left\| x - y \right\|}_{2}^{2} = \frac{1}{2} {\left\| P \left( x - y \right) \right\|}_{2}^{2} = \frac{1}{2} {\left\| v - z \right\|}_{2}^{2} $, and the whole problem becomes:



$$\begin{align*}
\arg \min_{v} \quad & \frac{1}{2} {\left\| v - z \right\|}_{2}^{2} \\
\text{subject to} \quad & {\left( v - e \right)}^{T} D \left( v - e \right) \leq {d}^{2}
\end{align*}$$



where $ D $ is a diagonal matrix, so this is exactly Case I, which we assumed we know how to solve. Once its solution $ {v}^{\star} $ is found, map it back by $ {x}^{\star} = {P}^{T} {v}^{\star} $ (since $ P $ is orthogonal, $ {P}^{-1} = {P}^{T} $).
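Putting the two cases together, here is a hedged Python/NumPy sketch of the full pipeline (eigendecompose $ W $, project in the eigenbasis, rotate back). The posted code in the Code section is MATLAB; this version, including the name `project_weighted_ball`, is my own illustration:

```python
import numpy as np

def project_weighted_ball(y, c, W, d):
    """Project y onto {x : (x - c)^T W (x - c) <= d^2}, W symmetric PD."""
    evals, Q = np.linalg.eigh(W)   # W = Q diag(evals) Q^T, Q orthogonal
    u = Q.T @ (y - c)              # coordinates of y - c in the eigenbasis
    if np.sum(evals * u ** 2) <= d ** 2:
        return y.copy()            # already feasible: the projection is y itself
    # Case I in the eigenbasis: bisection on the multiplier lam > 0 solving
    # g(lam) = sum_i evals_i * (u_i / (1 + lam * evals_i))^2 - d^2 = 0.
    def g(lam):
        return np.sum(evals * (u / (1.0 + lam * evals)) ** 2) - d ** 2

    lo, hi = 0.0, 1.0
    while g(hi) > 0.0:             # expand until the root is bracketed
        hi *= 2.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return c + Q @ (u / (1.0 + lam * evals))  # rotate back: x = c + P^T v
```

As a sanity check, with $ W = 2 I $ the set is a Euclidean ball of radius $ d / \sqrt{2} $, and the function reproduces the known Euclidean ball projection.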



Code



I created MATLAB code that solves the general problem.

The code is available at my StackExchange Mathematics Q3079400 GitHub Repository.







edited Feb 3 at 21:04

























answered Feb 3 at 5:47









Royi

  • This looks good, Royi. I agree that we need to know if a closed form exists for the diagonal case. +1 – user550103, Feb 3 at 12:53










  • As I wrote, it is easy to solve with a few iterations of binary search. The question is whether that is what you want. I hope @BrianBorchers has a closed form solution for that. – Royi, Feb 3 at 16:04

















