Does every nonzero polynomial take a nonzero value at one of its multi-indices?
A polynomial $p$ can be specified by its coefficient function, a finitely supported function $c:\mathbb N_0^d\to\mathbb R.$ Here $\mathbb N_0=\{0,1,2,\dots\}$ and $d\in\mathbb N_0.$ The value of $p$ at a point $x\in\mathbb R^d$ is $p(x)=\sum_{\alpha\in\mathbb N_0^d}c(\alpha)x_1^{\alpha_1}\dots x_d^{\alpha_d}$ (the sum makes sense because $c$ has finite support). We call $p$ non-zero if $c(\alpha)\neq 0$ for some $\alpha.$
For every non-zero $p,$ does there exist $\alpha$ such that $c(\alpha)\neq 0$ and $p(\alpha)\neq 0$?
Equivalently: for every finite $A\subset\mathbb N_0^d,$ is the $|A|\times|A|$ matrix defined by $M_{\alpha,\beta}=\alpha_1^{\beta_1}\dots\alpha_d^{\beta_d}$ non-singular? (In one direction, take $A$ to be the support of a counterexample $c,$ which is then in the kernel of $M.$ In the other direction, take $c$ to be a vector in the kernel of a counterexample $M.$) Call $A$ "good" if this holds. I have checked that some randomly generated sets $A$ are good. Also:
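To make the "randomly generated sets" check reproducible, here is a minimal sketch (the helper names `goodness_matrix`, `det`, `is_good` are my own, not from the question) that builds $M$ and tests non-singularity in exact rational arithmetic, so a non-zero determinant is a genuine certificate rather than a floating-point artifact. The random loop sticks to univariate sets, which the Descartes argument below proves are always good; for general multivariate sets such a search is only evidence, never proof.

```python
import random
from fractions import Fraction
from math import prod

def goodness_matrix(A):
    """M[i][j] = alpha_1^beta_1 * ... * alpha_d^beta_d for alpha = A[i],
    beta = A[j].  Python's 0 ** 0 == 1 matches the convention here."""
    return [[prod(a ** b for a, b in zip(alpha, beta)) for beta in A]
            for alpha in A]

def det(M):
    """Exact determinant by Gaussian elimination over Fraction."""
    n = len(M)
    M = [[Fraction(x) for x in row] for row in M]
    sign = 1
    for i in range(n):
        piv = next((r for r in range(i, n) if M[r][i] != 0), None)
        if piv is None:
            return Fraction(0)
        if piv != i:
            M[i], M[piv] = M[piv], M[i]
            sign = -sign
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return sign * prod(M[i][i] for i in range(n))

def is_good(A):
    return det(goodness_matrix(sorted(A))) != 0

# Univariate sets are proved good below (Descartes), so these must pass.
random.seed(0)
for _ in range(50):
    A = {(random.randrange(10),) for _ in range(random.randint(1, 6))}
    assert is_good(A)
assert is_good({(0, 0), (0, 1), (1, 0)})  # a small downward-closed 2-D set
```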
If $A\subset\mathbb N_0^d$ and $B\subset\mathbb N_0^e$ are both non-empty and good, then the Cartesian product $A\times B\subset\mathbb N_0^{d+e}$ is good. In terms of matrices, this is because the Kronecker product of two positive-dimensional square matrices is non-singular iff the two matrices are non-singular.
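The Kronecker structure can be checked directly: ordering $A\times B$ lexicographically (with $B$ cycling fastest) makes the matrix for $A\times B$ exactly $M_A\otimes M_B,$ and the identity $\det(P\otimes Q)=\det(P)^m\det(Q)^n$ then gives the non-singularity claim. A small sketch, with hypothetical helper names:

```python
from math import prod

def goodness_matrix(A):
    """M[i][j] = prod_k A[i][k] ** A[j][k], with 0 ** 0 == 1."""
    return [[prod(a ** b for a, b in zip(al, be)) for be in A] for al in A]

def kron(P, Q):
    """Kronecker product of square matrices given as lists of lists."""
    m = len(Q)
    size = len(P) * m
    return [[P[i // m][j // m] * Q[i % m][j % m] for j in range(size)]
            for i in range(size)]

A = [(0,), (2,)]
B = [(1,), (3,)]
# Lexicographic order on A x B (B cycling fastest) lines up with kron:
AxB = [a + b for a in A for b in B]
assert goodness_matrix(AxB) == kron(goodness_matrix(A), goodness_matrix(B))
```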
Let $A\subset\mathbb N_0^{d+1}.$ If the sets $A_0=\{\alpha\in A\mid\alpha_{d+1}=0\}$ and $A_+=\{\alpha\in A\mid\alpha_{d+1}>0\}$ are good, then $A$ is good. Proof: assume $A_0$ and $A_+$ are good, and consider a polynomial $p$ whose coefficients $c$ are zero outside $A$ and which vanishes on $A.$ Since the $A_+$ coefficients don't contribute to $p(x)$ when $x_{d+1}=0,$ and $A_0$ is good, $p$ vanishing on $A_0$ forces $c(\alpha)=0$ for all $\alpha\in A_0.$ So $c$ is zero outside $A_+;$ since $p$ also vanishes on $A_+$ and $A_+$ is good, $c$ is zero on $A_+$ as well.
If $A\subset\mathbb N_0^1$ then $A$ is good. Proof: by the previous point we can assume $0\notin A.$ By Descartes' rule of signs, a univariate polynomial with at most $|A|$ non-zero coefficients has at most $|A|-1$ positive zeroes, so it cannot vanish at all $|A|$ positive points of $A$ unless it is zero.
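As a sanity check of the Descartes step, a sketch (my own illustration, not part of the proof): the number of positive real roots is bounded by the number of sign changes in the non-zero coefficients, and the bound is attained by a sparse example.

```python
def sign_changes(coeffs):
    """Descartes' bound: the number of positive real roots (counted with
    multiplicity) is at most the number of sign changes in the non-zero
    coefficients, listed in degree order."""
    signs = [1 if c > 0 else -1 for c in coeffs if c != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

# p(x) = 1 - 3x^2 + x^5 has coefficient signs +, -, +: at most 2 positive
# roots, so p cannot vanish at 3 distinct positive points.
assert sign_changes([1, -3, 1]) == 2
p = lambda x: 1 - 3 * x**2 + x**5
# Sign flips at 0, 1, 2 certify at least two positive roots,
# so the bound is attained here.
assert p(0) > 0 > p(1) and p(2) > 0
```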
$A\subset\mathbb N_0^d$ is good if it is downwards-closed, i.e. for all $\beta\in A$ and all $\alpha$ such that $\alpha_i\leq\beta_i$ for all $1\leq i\leq d$ we have $\alpha\in A.$ Proof: apply the forward difference operator $(\Delta p)(x)=p(x_1,\dots,x_{d-1},x_d+1)-p(x).$ By induction, $A'=\{\alpha\in A\mid(\alpha_1,\dots,\alpha_d+1)\in A\}$ is good. If $p$ had zero coefficients outside $A$ and also vanished on $A,$ then $\Delta p$ would have zero coefficients outside $A'$ and vanish on $A',$ which forces $\Delta p$ to be the zero polynomial. This means $p$ has zero coefficients outside $A_0$ (as defined in the previous point), and we can apply induction on the dimension.
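The effect of $\Delta$ on coefficients can be computed explicitly from $(x+1)^k-x^k=\sum_{j<k}\binom kj x^j$; a sketch on coefficient dictionaries (the `delta` helper is my own name), showing both the operator and the support containment the proof relies on:

```python
from math import comb

def delta(c):
    """Forward difference in the last variable on a coefficient dict
    c: multi-index tuple -> coefficient, using
    (x + 1)^k - x^k = sum_{j < k} C(k, j) x^j."""
    out = {}
    for alpha, v in c.items():
        k = alpha[-1]
        for j in range(k):
            beta = alpha[:-1] + (j,)
            out[beta] = out.get(beta, 0) + v * comb(k, j)
    return {b: w for b, w in out.items() if w != 0}

# p(x) = x^2  gives  (Delta p)(x) = (x + 1)^2 - x^2 = 2x + 1:
assert delta({(2,): 1}) == {(1,): 2, (0,): 1}
# Each beta in the support of Delta p has beta_d < alpha_d for some alpha
# in the support of p, so downward closure keeps the support inside A'.
assert delta({(1, 1): 1}) == {(1, 0): 1}
```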
A small variation: if $A\subset\mathbb N_0^d$ is downwards-closed and $\alpha\in\mathbb N_0^d,$ then the shifted set $\alpha+A=\{\alpha+\beta\mid\beta\in A\}$ is good. This follows from the same argument, using the modified forward difference operator defined by $\Delta' p=x^\alpha\Delta(x^{-\alpha}p).$
R. Zippel's "Interpolating polynomials from their values" calls similar questions "zero avoidance problems", but I couldn't find anything answering this question.
matrices algebraic-geometry polynomials multivariate-polynomial
I think I can show this is indeed true for the univariate case. But I think you have this already in your third item.
– quantum, Jan 22 at 13:30
asked Jan 21 at 23:33 by Dap