Sum of $n+1$ power terms is infinite geometric series plus $O(r^{n+1})$














I want to show that for $|r|<1$, $$\sum\limits_{k=0}^n r^k =\frac{1}{1-r}+\mathcal{O}(r^{n+1})$$



But I'm not certain if my approach is actually correct. Here is how I show it:



$$\sum\limits_{k=0}^n r^k = \frac{1-r^{n+1}}{1-r}=\frac{1}{1-r}+\frac{r^{n+1}}{r-1}$$
Now, $$\left|\frac{r^{n+1}}{r-1}\right|\le\frac{1}{1-r}\operatorname{sgn}(r^{n+1})\,r^{n+1}$$



My constant in this case is $\frac{1}{1-r}$, but my bounding function is not exactly $r^{n+1}$; however, I don't think that actually matters. Please let me know what you think.
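For illustration (a minimal Python sketch, not part of the argument above; the specific values of $r$ and $n$ are arbitrary), the truncation error for a fixed $r$ matches $\frac{|r|^{n+1}}{1-r}$ exactly:

```python
# Sanity check: for fixed |r| < 1, the truncation error of the geometric sum
# equals |r|**(n+1) / (1 - r), i.e. O(r**(n+1)) with constant 1/(1-r).
r = -0.7
for n in (5, 10, 20):
    partial = sum(r**k for k in range(n + 1))
    error = abs(partial - 1 / (1 - r))
    bound = abs(r) ** (n + 1) / (1 - r)
    print(n, error, bound)  # error and bound agree up to rounding
```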










      real-analysis proof-verification asymptotics






asked Jan 25 at 3:37 by sequence






















          2 Answers



















First, let us clarify what big-oh notation means: we'll say $f(x)=O(g(x))$ if $|f(x)|\leq C\cdot |g(x)|$ for some fixed $C>0$. To extend it to "equality" we have: $f(x)=g(x)+O(h(x))$ if $|f(x)-g(x)|\leq C\cdot |h(x)|$ for some fixed $C>0$.
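As a minimal sketch of this definition (the helper name and the sampled points below are illustrative assumptions, not from the answer), the condition can be checked pointwise:

```python
def satisfies_big_oh(f, g, h, C, xs):
    """Check the definition |f(x) - g(x)| <= C * |h(x)| at each sampled point x."""
    return all(abs(f(x) - g(x)) <= C * abs(h(x)) for x in xs)

# Example: partial geometric sums vs. 1/(1-r) for fixed r = 0.5, with h(n) = r**(n+1).
r = 0.5
print(satisfies_big_oh(lambda n: sum(r**k for k in range(n + 1)),
                       lambda n: 1 / (1 - r),
                       lambda n: r ** (n + 1),
                       C=1 / (1 - r), xs=range(1, 30)))  # True
```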



Now, assuming $r$ is not fixed, what you're trying to prove is false unless you know $|r|\leq 1-\varepsilon$ for some given $\varepsilon>0$ (really, you only need $r\in[-1,1-\varepsilon]$). The reason is that, as you have already shown, $$\sum_{k=0}^n r^k=\frac{1}{1-r}+\frac{r^{n+1}}{r-1}.$$ That is, $$\left|\frac{1}{1-r}-\sum_{k=0}^n r^k\right|\leq\frac{|r|^{n+1}}{1-r}.$$ There is no way to eliminate the $1-r$ in the denominator, so for any fixed $C>0$, all we need to do to show that the bound fails is take $r$ close enough to $1$ (this is why, if we know $r\leq1-\varepsilon$, we can provide a constant that depends on $\varepsilon$). However, this does show that $$\sum_{k=0}^n r^k = \frac{1}{1-r}+O\left(\frac{r^{n+1}}{1-r}\right).$$



          However, if $r$ is fixed, then $|1-r|$ is a constant, so the big-oh term above becomes $O(r^{n+1})$ because $|1-r|$ can be absorbed into the constant.
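A short numerical sketch of this point (the values of $n$ and $r$ below are arbitrary illustrations): for fixed $n$, the smallest constant that works at a given $r$ is $\frac{1}{1-r}$, which is unbounded as $r\to1$, so no single $C$ covers all of $(-1,1)$, although for any fixed $r$ it is a perfectly good constant:

```python
# The best constant at step n is error / |r|**(n+1) = 1 / (1 - r),
# which blows up as r -> 1, so no fixed C works uniformly in r.
n = 10
for r in (0.5, 0.9, 0.99, 0.999):
    partial = sum(r**k for k in range(n + 1))
    ratio = abs(partial - 1 / (1 - r)) / abs(r) ** (n + 1)
    print(r, ratio)  # prints approximately 2, 10, 100, 1000
```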






answered Jan 25 at 5:54 by Clayton
– sequence (Jan 25 at 16:50): This is the first time I'm seeing a big-oh definition involving the absolute value on the RHS. Even though this does make sense to me, usually there is no absolute value in this definition. This makes sense because if $f(x) = c$ for some negative constant $c$ then $f=O(f)$ only when $c$ is taken in absolute value. Moreover, this definition would work for complex functions as well. But I'm not sure if I can use it nevertheless.

– sequence (Jan 25 at 16:51): So what if $r$ is a negative or complex constant? I think your definition will work, but in my book and in many other places the definition does not involve the absolute value on the RHS. This is confusing.

– Clayton (Jan 25 at 17:30): @sequence: Wikipedia uses the notation this way; the article states that $g(x)$ is real-valued, but the only way that we can have $|f(x)|\leq Mg(x)$ is if $g(x)\geq0$ for all $x\geq x_0$. Even MIT uses the notation this way, explicitly using absolute values on the RHS; in every context I've ever seen big-O notation, it has been implicit that the function inside $O(\cdot)$ is positive.

– Clayton (Jan 25 at 17:32): @sequence: If $r$ is negative or complex, the above estimates hold exactly as stated.

– sequence (Jan 28 at 2:38): In the definitions I've seen it is assumed that $g(x)$ is positive for all $x\ge x_0$, where $x_0$ is some threshold value. So I think in this case it makes sense to use the absolute value. The only possible issue here, however, is that $g(x)$ and $|g(x)|$ are different functions in general. However, if we assume $\tilde{g}(x) = |g(x)|$ then these definitions should be equivalent.






















If you're trying to find $c\in \mathbb{R}_{>0}$ such that



$$\left| \frac{r^{n+1}}{r-1} \right| \le cr^{n+1}$$ for all $n \ge n_0$ for some $n_0$, then you can't for $r < 0$, as the LHS is always at least $0$, while the RHS is less than $0$ infinitely often due to the sign alternation.
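For illustration (the particular $r$ and $c$ below are arbitrary), a quick Python check of the sign issue: whenever $n+1$ is odd, the right-hand side $cr^{n+1}$ is negative while the left-hand side is nonnegative, so the inequality fails:

```python
# For negative r, c * r**(n+1) alternates sign, but |r**(n+1) / (r - 1)| >= 0,
# so the inequality fails for every odd power, no matter how large c is.
r, c = -0.5, 100.0
for n in range(6):
    lhs = abs(r ** (n + 1) / (r - 1))
    rhs = c * r ** (n + 1)
    print(n, lhs, rhs, lhs <= rhs)  # False whenever n + 1 is odd
```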






answered Jan 25 at 4:08 by Metric
– sequence (Jan 25 at 4:27): That's why I'm using the sign function in my proof.

– Metric (Jan 25 at 4:28): Yes, so you're not using a constant, as the sgn applied to $r^{n+1}$ will alternate.

– Metric (Jan 25 at 4:45): Have to sleep now, but in case I wasn't clear in the above, LHS denotes the left-hand side of the inequality, and RHS is abbreviated analogously. The main point of the above is that no matter what positive constant $c$ you choose, the RHS will be $< 0$ infinitely often, while the LHS is always $\ge 0$, so the inequality above can never hold.

– sequence (Jan 25 at 5:25): I see now. Thanks for clarifying. @Metric

– Clayton (Jan 25 at 13:06): This is not correct; big-oh notation always takes absolute values, so you need not worry about sign changes.










