Quantifying how crowded points are in a $d$-dimensional cube












Let $x_1, \cdots, x_n$ be distinct points in the $d$-dimensional cube $[-1,1]^d$. What is a lower bound on the quantity
$$ \sum_{1 \le j < k \le n} \frac{1}{\|x_k-x_j\|}, $$
where $\| \cdot \|$ is the Euclidean norm?



An asymptotic answer for large $n$ is also fine, and bounds on any similar-looking quantity, such as with the norm replaced by the norm squared, are also welcome.



My motivation is a problem that appeared in the book The Cauchy-Schwarz Master Class, which proved the following:




If $-1 \le x_1 < x_2 < \cdots < x_n \le 1$ then $$ \sum_{1 \le j < k \le n} \frac{1}{x_k-x_j} \ge \frac{n^2 \log n}{8}.$$
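
A quick numerical sanity check (my own addition, not part of the post): the sketch below evaluates the one-dimensional sum for equally spaced points and compares it with the quoted bound $n^2 \log n / 8$. The equal spacing and the use of the natural logarithm are assumptions made here for illustration.

    import numpy as np

    # Sanity check (illustration only): equally spaced points in [-1, 1]
    # versus the claimed lower bound n^2 * log(n) / 8 (natural log assumed).
    for n in (100, 400, 1600):
        x = np.linspace(-1.0, 1.0, n)
        diffs = np.abs(x[:, None] - x[None, :])   # |x_j - x_k| for all pairs
        j, k = np.triu_indices(n, k=1)            # index pairs with j < k
        total = np.sum(1.0 / diffs[j, k])         # sum of 1/(x_k - x_j)
        print(n, total, n**2 * np.log(n) / 8)     # the sum exceeds the bound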











Tags: geometry, inequality, upper-lower-bounds

asked Dec 24 '18 at 1:47, edited Dec 24 '18 at 16:25
– Sandeep Silwal












  • As it stands your question is a bit too broad, because it is simultaneously asking for the best bound, an asymptotic bound, and "bounds on any quantity that looks similar". Could you please edit your question to make it more focused?
    – Lord_Farin
    Dec 24 '18 at 9:13










  • Ok, I edited a little. Since I don't know the difficulty of this question it's bound to be a little vague, but the whole point is to stimulate some discussion and get any useful bounds.
    – Sandeep Silwal
    Dec 24 '18 at 16:25










  • These kinds of "simple" questions very quickly get out of hand...
    – Fabian
    Jan 1 at 4:35


















2 Answers
This is an answer only up to $n = \exp(ad)$ for some $a \in \Theta(1)$, but may be enough to get started: there are as many as $n = \exp(ad)$ points $y_1,\ldots, y_n$ in $\{-1,1\}^d$ that satisfy the following:
$\|y_i-y_j\|_1 \in \Theta(d)$ for each $i \neq j$, where $\| \cdot \|_1$ denotes the Manhattan metric.
[Google error-correcting codes]



Then for each $i \neq j$, the following holds:

$$\frac{1}{\|y_i-y_j\|_2} \leq \frac{1}{c\sqrt{d}}$$

for some $c \in \Omega(1)$, which implies the following:

Inequality 1: $$\sum_{1 \le i < j \le n} \frac{1}{\|y_i-y_j\|_2} \leq \frac{n^2}{2c\sqrt{d}}$$




However, note that for any distinct $x, y \in [-1,1]^d$, the following holds:

Inequality 2: $$\frac{1}{\|x-y\|_2} \ge \frac{1}{2\sqrt{d}}$$




So Inequalities 1 and 2 together imply that for $y_1,\ldots, y_n$ as above,

$\sum_{1 \le i < j \le n} \frac{1}{\|y_i-y_j\|_2}$ is no more than a constant multiple larger than $\sum_{1 \le i < j \le n} \frac{1}{\|x_i-x_j\|_2}$ for any other $x_1,\ldots, x_n \in [-1,1]^d$.
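
A minimal numerical sketch of this (my addition, not from the answer): instead of an explicit error-correcting code, random $\pm 1$ vectors are used as a stand-in, since their pairwise $\ell_1$ distances concentrate near $d$; the pairwise sum then comes out on the order of $n^2/\sqrt{d}$, as claimed. The parameters $d = 200$, $n = 500$ are arbitrary choices.

    import numpy as np

    # Illustration only: random +/-1 vectors stand in for a code with pairwise
    # L1 distance Theta(d); with high probability their Euclidean distances
    # are all about sqrt(2d).
    rng = np.random.default_rng(0)
    d, n = 200, 500
    Y = rng.choice([-1.0, 1.0], size=(n, d))
    G = Y @ Y.T                                   # inner products <y_i, y_j>
    sq = 2.0 * (d - G)                            # ||y_i - y_j||^2, since ||y_i||^2 = d
    i, j = np.triu_indices(n, k=1)
    total = np.sum(1.0 / np.sqrt(sq[i, j]))
    print(total, n**2 / (2 * np.sqrt(d)))         # same order of magnitude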






answered Dec 27 '18 at 19:16
– Mike





























(Update below)

The situation seems to be less interesting asymptotically for fixed $d \geq 2$. We have
$$\frac{1}{\|x-y\|_2} \geq \frac{1}{2\sqrt{d}}$$ for any $x, y \in [-1, 1]^d$, so there's an easy lower bound of
$$\sum_{1 \leq j < k \leq n} \frac{1}{\|x_k - x_j\|} \geq \frac{1}{2\sqrt{d}} \binom{n}{2} = \Omega(n^2).$$ This lower bound is actually achieved (up to a constant) by letting the $x_j$ lie on a grid, say $\Lambda^+ = \{n^{-1/d}, 2n^{-1/d}, \dots, 1\}^d$.



First note that if we set
$$\Lambda = \{-1, \dots, -2n^{-1/d}, -n^{-1/d}, 0, n^{-1/d}, 2n^{-1/d}, \dots, 1\}^d,$$
then, since the difference of any two points of $\Lambda^+$ is a nonzero point of $\Lambda$, we have
$$\sum_{j \neq k} \frac{1}{\|x_k - x_j\|} < \sum_{v \in \Lambda,\, v \neq 0} \frac{1}{\|v\|}$$
for each $k$.



Also, since $d \geq 2$, the integral
$$\int_{[-1, 1]^d} \frac{dx}{\|x\|}$$
actually converges, say to some $I_d < \infty$, meaning in particular that
$$\frac{1}{n} \sum_{v \in \Lambda,\, v \neq 0} \frac{1}{\|v\|} \to I_d$$
(this is just a standard grid approximation, since each grid cell has volume $1/n$).



Putting these together, we get
$$\sum_{1 \leq j < k \leq n} \frac{1}{\|x_k - x_j\|} = \frac{1}{2} \sum_k \sum_{j \neq k} \frac{1}{\|x_k - x_j\|} < \frac{n}{2} \sum_{v \in \Lambda,\, v \neq 0} \frac{1}{\|v\|} \sim \frac{n^2}{2} I_d,$$
so our sum is $O(n^2)$, matching the lower bound.
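
A small numerical check of this $O(n^2)$ behaviour (my addition, not part of the answer): for $d = 2$, place $n = m^2$ points on $\Lambda^+$ and watch the ratio of the sum to $n^2$ stay roughly constant as $n$ grows.

    import numpy as np

    # Illustration only: for d = 2, the pairwise sum on the grid Lambda^+
    # divided by n^2 stays roughly constant as n grows.
    d = 2
    for m in (10, 20, 40):
        n = m**d
        axis = np.arange(1, m + 1) / m                 # {1/m, ..., 1}, spacing n^(-1/d)
        pts = np.array(np.meshgrid(axis, axis)).reshape(d, -1).T
        diff = pts[:, None, :] - pts[None, :, :]
        dist = np.sqrt(np.sum(diff**2, axis=-1))
        j, k = np.triu_indices(n, k=1)
        print(n, np.sum(1.0 / dist[j, k]) / n**2)      # roughly constant in n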



Update:

It's possible to prove more. Fix some positive integer $d$ and real $p > 0$. Then we can show that for $x_1, \dots, x_n \in [-1, 1]^d$, the following tight asymptotic bounds hold: when $p < d$, we have
$$ \sum_{1 \leq j < k \leq n} \frac{1}{\|x_k - x_j\|^p} \geq \Omega(n^2),$$
and when $p > d$,
$$ \sum_{1 \leq j < k \leq n} \frac{1}{\|x_k - x_j\|^p} \geq \Omega(n^{1 + \frac{p}{d}}),$$
and both of these bounds are achieved by taking $\{x_1, \dots, x_n\} = \Lambda^+$. So far I haven't been able to find a tight lower bound for the case $p = d$.



The proof for the case $p < d$ is the same as the proof above, using the fact that the integral
$$\int_{[-1, 1]^d} \frac{dx}{\|x\|^p}$$
converges.



To prove the lower bound for $p > d$: surround each $x_k$ with a cube $C_k = x_k + [-5n^{-1/d}, 5n^{-1/d}]^d$ of side length $10n^{-1/d}$. If at least $n/2$ of these cubes $C_k$ were disjoint from all other $C_j$, then these $n/2$ cubes, all disjoint, would have a total volume of $10^d/2$, but would all be contained in $[-1-5n^{-1/d}, 1+5n^{-1/d}]^d$, which has volume at most $4^d$ for sufficiently large $n$, a contradiction. This means we have at least $n/2$ cubes, each of which intersects some other cube, so we get at least $n/2$ ordered pairs $(x_k, x_j)$, each with $\|x_k - x_j\| < 10 n^{-1/d} \sqrt{d}$, hence at least $n/4$ such pairs with $j < k$, so
$$\sum_{1 \leq j < k \leq n} \frac{1}{\|x_k - x_j\|^p} \geq \frac{n}{4}\left(\frac{n^{p/d}}{10^p d^{p/2}}\right) = \Omega(n^{1 + p/d}).$$



To evaluate at $\{x_1, \dots, x_n\} = \Lambda^+$, note that for any $k$,
$$\sum_{j \neq k} \frac{1}{\|x_k - x_j\|^p} < n^{p/d} \cdot \sum_{v \in \mathbb{Z}^d,\, v \neq 0} \frac{1}{\|v\|^p},$$
but the sum on the RHS converges to some $S_{d, p} < \infty$, since the integral
$$\int_{x \in \mathbb{R}^d,\, \|x\| \geq 1} \frac{dx}{\|x\|^p}$$
converges, so
$$\sum_{1 \leq j < k \leq n} \frac{1}{\|x_k - x_j\|^p} = \frac{1}{2} \sum_k \sum_{j \neq k} \frac{1}{\|x_k - x_j\|^p} < \frac{n}{2} \cdot n^{p/d} S_{d, p} = O(n^{1 + p/d}).$$



In the case $p = d$, evaluating the sum on $\Lambda^+$ gives a result of $\Theta(n^2 \log n)$, but I don't know how to show this is optimal for $d > 1$.
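
An empirical check of these exponents (my addition): for $d = 2$, doubling $m$ (so $n$ quadruples) and reading off the growth rate of the grid sum should give an exponent of roughly $2$ for $p < 2$, slightly above $2$ for $p = 2$ (the $\log n$ factor), and $1 + p/2$ for $p > 2$. The choices of $m$ and $p$ below are arbitrary.

    import numpy as np

    # Rough exponent check (illustration only) for d = 2: estimate how the
    # grid sum with exponent p scales in n by comparing m = 20 and m = 40.
    def grid_sum(m, p, d=2):
        axis = np.arange(1, m + 1) / m
        pts = np.array(np.meshgrid(*([axis] * d))).reshape(d, -1).T
        diff = pts[:, None, :] - pts[None, :, :]
        dist = np.sqrt(np.sum(diff**2, axis=-1))
        j, k = np.triu_indices(len(pts), k=1)
        return np.sum(1.0 / dist[j, k]**p)

    for p in (1, 2, 4):
        s_small, s_big = grid_sum(20, p), grid_sum(40, p)
        # n quadruples, so the exponent is log(ratio) / log(4)
        print(p, np.log(s_big / s_small) / np.log(4.0))   # roughly 2, 2.1, 3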






answered Jan 3 at 13:13 (later edited)
– user125932






















