Several ways to prove that $\sum\limits^{\infty}_{n=1}\left(1-\frac1{\sqrt{n}}\right)^n$ converges

I believe there are several ways to prove that $\sum\limits^{\infty}_{n=1}\left(1-\frac{1}{\sqrt{n}}\right)^n$ converges. Can you please post yours so that we can learn from you?



HERE IS ONE



For each $n\in\Bbb{N}$ with $n\ge 2$ (note that the first term is $a_1=0$), let $a_n=\left(1-\frac{1}{\sqrt{n}}\right)^n.$ Then,
\begin{align} a_n&=\left(1-\frac{1}{\sqrt{n}}\right)^n \\&=\exp\ln\left(1-\frac{1}{\sqrt{n}}\right)^n\\&=\exp\left[n\ln\left(1-\frac{1}{\sqrt{n}}\right)\right] \\&=\exp\left[-n\sum^{\infty}_{k=1}\frac{1}{k}\left(\frac{1}{\sqrt{n}}\right)^k\right]\\&=\exp\left[-n\left(\frac{1}{\sqrt{n}}+\frac{1}{2n}+\sum^{\infty}_{k=3}\frac{1}{k}\left(\frac{1}{\sqrt{n}}\right)^k\right)\right]\\&=\exp\left[-\sqrt{n}-\frac{1}{2}-\sum^{\infty}_{k=3}\frac{n}{k}\left(\frac{1}{\sqrt{n}}\right)^k\right]\\&\sim\exp\left(-\sqrt{n}\right)\exp\left(-\frac{1}{2}\right)\quad(n\to\infty).\end{align}
Choose $b_n=\exp\left(-\sqrt{n}\right)$, so that
\begin{align} \dfrac{a_n}{b_n}\to\exp\left(-\frac{1}{2}\right).\end{align}
Since $\sqrt{n}$ grows faster than $2\ln n$, we have $n^2\exp\left(-\sqrt{n}\right)\to 0$, so there exists $N$ such that for all $n\geq N,$
\begin{align} \exp\left(-\sqrt{n}\right)<\dfrac{1}{n^2}.\end{align}
Hence, \begin{align}\sum^{\infty}_{n=N}b_n= \sum^{\infty}_{n=N}\exp\left(-\sqrt{n}\right)\leq \sum^{\infty}_{n=N}\dfrac{1}{n^2}<\infty,\end{align}
and so $\sum^{\infty}_{n=1}b_n<\infty\implies \sum^{\infty}_{n=1}a_n<\infty$ by the limit comparison test.
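
(As a quick numerical sanity check of this comparison, one can run the short Python 3 script below; it uses only the standard-library `math` module, and the helper names `a` and `b` are just ad-hoc labels for the two sequences above.)

```python
import math

def a(n):
    # a_n = (1 - 1/sqrt(n))^n, the terms of the series in question
    return (1 - 1 / math.sqrt(n)) ** n

def b(n):
    # b_n = exp(-sqrt(n)), the comparison sequence
    return math.exp(-math.sqrt(n))

for n in (10, 1_000, 100_000):
    print(n, a(n) / b(n))                        # ratio tends to exp(-1/2) ~ 0.6065

print(sum(a(n) for n in range(1, 200_000)))      # partial sums of a_n stabilise
print(math.exp(-0.5))
```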
Tags: real-analysis, sequences-and-series
Other (?) approaches here math.stackexchange.com/q/1716009/42969 and here math.stackexchange.com/q/1558739/42969.
– Martin R
Jan 3 at 17:12

3 Answers

$$
\frac{\left(\,1-\frac{1}{\sqrt{n}}\,\right)^n}{e^{-\sqrt{n}}}
= \left(\frac{\left(\,1-\frac{1}{\sqrt{n}}\,\right)^{\sqrt{n}}}{e^{-1}}\right)^{\sqrt{n}} \to e^{-1/2},
$$

a finite positive limit (its exact value follows from the expansion in the question; all that matters here is that it is neither $0$ nor $\infty$). So $\sum_{n=1}^\infty \left(\,1-\frac{1}{\sqrt{n}}\,\right)^n$ and $\sum_{n=1}^{\infty} e^{-\sqrt{n}}$ converge or diverge together by the limit comparison test. Given that
$$\int_1^\infty e^{-\sqrt{t}}\;dt
= \left[\, -2e^{-\sqrt{t}}\left(\sqrt{t}+1\right)\,\right]_1^\infty
= \frac{4}{e} < \infty,$$

we conclude that $\sum_{n=1}^{\infty} e^{-\sqrt{n}}$ converges by the integral test, and hence so does the original series.
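
(A small numerical check of the integral value quoted above, in Python 3 with only the standard-library `math` module; the step size `h` and the cut-off at $t=400$ are arbitrary choices for a crude midpoint rule, the tail beyond the cut-off being negligible.)

```python
import math

h = 0.001                                # midpoint-rule step (an arbitrary choice)
steps = 399_000                          # covers t in [1, 400]; the tail beyond is tiny
approx = sum(math.exp(-math.sqrt(1 + (i + 0.5) * h)) * h for i in range(steps))
print(approx, 4 / math.e)                # both ~ 1.4715
```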

  • We can also observe that for $n\ge 2$ we have $\log (1-1/\sqrt n\,)<-1/\sqrt n\,,$ so $(1-1/\sqrt n)^n<e^{-\sqrt n}$.
    – DanielWainfleet
    yesterday

  • That's so true! They both converge or diverge together! (+1)
    – Mike
    yesterday

Here is an elementary approach, which has the advantage of being based, first, on a basic inequality which is so useful that one should keep it in mind anyway, second, on a condensation technique which is so useful that one should keep it in mind anyway, and third, on a standard series which is so useful that one should keep it in mind anyway as well...



A basic inequality: For every $x$,




$$1-x\leqslant e^{-x}\tag{1}$$




(This stems, for example, from the fact that, the exponential function being convex, its graph is above its tangent at $x=0$.)



Now, we massage slightly the basic inequality $(1)$ above: if both sides are nonnegative, the inequality is preserved when raised to any positive power, hence, for every $x\leqslant1$ and every nonnegative $n$, $$(1-x)^n\leqslant e^{-nx}$$ for example, for every positive $n$, $$\left(1-\frac1{\sqrt n}\right)^n\leqslant e^{-\sqrt n}$$ hence the series of interest converges as soon as the series $$\sum_ne^{-\sqrt n}$$ converges.



A condensation technique: In words, we slice our series, the $k$th slice going from $n=k^2$ to $n=(k+1)^2-1$. Then, every term in slice $k$ is at most $e^{-k}$ and slice $k$ has $2k+1$ terms hence




$$\sum_{n=1}^\infty e^{-\sqrt n}\leqslant\sum_{k=1}^\infty (2k+1)e^{-k}\tag{2}$$




and all that remains to be shown is that the series in the RHS of $(2)$ converges.
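
(For readers who like to see numbers, the slice bound can be checked directly in Python 3 with the standard-library `math` module; this is only an illustration of $(2)$, not part of the argument.)

```python
import math

# For each k, the 2k+1 integers n with k^2 <= n <= (k+1)^2 - 1 satisfy
# e^{-sqrt(n)} <= e^{-k}, so the k-th slice sums to at most (2k+1) e^{-k}.
for k in range(1, 8):
    slice_sum = sum(math.exp(-math.sqrt(n)) for n in range(k * k, (k + 1) ** 2))
    print(k, slice_sum, (2 * k + 1) * math.exp(-k))
```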



A standard series: Perhaps the most useful series of all is the geometric series, namely the fact that, for every $|x|<1$,




$$\sum_{k=0}^\infty x^k=\frac1{1-x}\tag{3}$$




(Several simple proofs of $(3)$ exist, perhaps you already know some of them.)



Taking this for granted, note that the RHS of $(2)$ is almost a geometric series. To complete the comparison, we differentiate the geometric series term by term on $|x|<1$ (yes, this is legit), yielding $$\sum_{k=1}^\infty kx^{k-1}=\frac1{(1-x)^2}$$
We shall only keep a small part of this result, namely the fact that the series $$\sum_{k=1}^\infty x^k\qquad\text{and}\qquad\sum_{k=1}^\infty kx^k$$ both converge for every $|x|<1$.



Cauda: In particular, for $x=e^{-1}$, the series in the RHS of $(2)$ converges, qed.
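
(If one wants the actual value of the bounding series, the two facts above give $\sum_{k\geqslant1}(2k+1)x^k=\frac{2x}{(1-x)^2}+\frac{x}{1-x}$ for $|x|<1$; the short Python 3 check below, using only the standard-library `math` module, compares this closed form at $x=e^{-1}$ with a partial sum.)

```python
import math

x = math.exp(-1)
closed_form = 2 * x / (1 - x) ** 2 + x / (1 - x)      # sum of (2k+1) x^k over k >= 1
partial = sum((2 * k + 1) * x ** k for k in range(1, 60))
print(closed_form, partial)                           # both ~ 2.42, so the RHS of (2) is finite
```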

  • This is unusual, (+1)
    – Mike
    yesterday

  • @Mike Thanks. But actually all of this is ultra standard.
    – Did
    yesterday

  • I really thought so! Thanks once again!
    – Mike
    yesterday

As you have mentioned, $$\exp(-\sqrt n)=\left({1\over e}\right)^{\sqrt n}<{1\over n^2}$$ for large enough $n$; also, for any $0<a<1$ we have $$a=\left({1\over e}\right)^{k}$$ where $k=-\ln a>0$, therefore by substitution $$a^{\sqrt n}=\left({1\over e}\right)^{k\sqrt n}=\left({1\over e}\right)^{\sqrt {nk^2}}<{1\over n^2\cdot k^4}$$ for large enough $n$. Based on this and on $$\lim_{n\to\infty}\left(1-{1\over \sqrt n}\right)^{\sqrt n}={1\over e},$$ we can, for small enough $\epsilon>0$ and large enough $n$, write $$0<{1\over e}-\epsilon<\left(1-{1\over \sqrt n}\right)^{\sqrt n}<{1\over e}+\epsilon<1,$$ therefore $$0<\left({1\over e}-\epsilon\right)^{\sqrt n}<\left(1-{1\over \sqrt n}\right)^{n}<\left({1\over e}+\epsilon\right)^{\sqrt n}<1.$$ Since both $\sum_{n=1}^{\infty}\left({1\over e}+\epsilon\right)^{\sqrt n}$ and $\sum_{n=1}^{\infty}\left({1\over e}-\epsilon\right)^{\sqrt n}$ are convergent, so is $$\sum_{n=1}^{\infty}\left(1-{1\over \sqrt n}\right)^{n}.$$
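
(A numerical illustration of the squeeze, in Python 3 with only the standard-library `math` module; the value $\epsilon=0.05$ and the cut-offs are arbitrary choices, just to show that the inequalities kick in quickly and that the upper bounding series has stable partial sums.)

```python
import math

eps = 0.05
lo, hi = 1 / math.e - eps, 1 / math.e + eps
for n in (50, 500, 5_000):
    middle = (1 - 1 / math.sqrt(n)) ** n
    print(n, lo ** math.sqrt(n) < middle < hi ** math.sqrt(n))   # True once n is large

print(sum(hi ** math.sqrt(n) for n in range(1, 200_000)))        # partial sums stabilise
```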

  • I love this one! (+1)
    – Mike
    yesterday

  • That's very kind of you $(+1)$ too :)
    – Mostafa Ayaz
    yesterday

  • You are most welcome!
    – Mike
    yesterday