About $\lim\left(1+\frac{x}{n}\right)^n$
I was wondering if it is possible to get a link to a rigorous proof that
$$\lim_{n\to\infty} \left(1+\frac{x}{n}\right)^n=\exp x$$
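As a quick numeric sanity check of the claimed limit (an editorial sketch, not a proof; the sample points and tolerance are arbitrary):

```python
import math

# Check (1 + x/n)^n against exp(x) for a few sample x at large n.
# The error behaves roughly like exp(x) * x^2 / (2n).
for x in [-2.0, 0.5, 3.0]:
    approx = (1 + x / 1_000_000) ** 1_000_000
    assert abs(approx - math.exp(x)) < 1e-4 * math.exp(x)
print("all sample points agree with exp(x)")
```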
  • Well often this is taken as the definition of exp(x), so I suppose it depends on your definition. – Three, Apr 11 '13 at 22:43
  • @LordSoth Consider $x\mapsto 0$. – Git Gud, Apr 11 '13 at 22:46
  • @LordSoth, actually that's false. $\exp(x)$ was originally discovered by a Bernoulli as the limit of compound interest -- in fact, exactly as the OP has written it. Only later was the calculus studied: en.wikipedia.org/wiki/Exponential_function – Three, Apr 11 '13 at 22:56
  • @Three I suggest you read www-history.mcs.st-and.ac.uk/HistTopics/e.html – Lord Soth, Apr 11 '13 at 22:59
  • How do you define $\exp$? This is really a matter of definition. What tools do you have available? Can you use continuity of $\exp$? Can you use $\log$? Etc. Whenever you ask this kind of question, you must always state what your definitions and available tools are; otherwise we're just guessing what you want. – Pedro Tamaroff, Apr 11 '13 at 23:56
limits exponential-function faq
edited Nov 11 '17 at 14:04 by Jack
asked Apr 11 '13 at 22:40 by Mai09el
9 Answers
From the very definition (one of many, I know):

$$e:=\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^n$$

we can try the following, depending on what you have read so far in this subject:

(1) Deduce that

$$e=\lim_{n\to\infty}\left(1+\frac{1}{f(n)}\right)^{f(n)},\quad\text{as long as}\quad f(n)\xrightarrow[n\to\infty]{}\infty$$

and then from here (for $x\neq0$, but this is only a light technicality)

$$\left(1+\frac{x}{n}\right)^n=\left[\left(1+\frac{1}{n/x}\right)^{n/x}\right]^x\xrightarrow[n\to\infty]{}e^x$$

(2) For $x>0$, substitute $mx=n$. Note that $n\to\infty\implies m\to\infty$, and

$$\left(1+\frac{x}{n}\right)^n=\left(\left(1+\frac{1}{m}\right)^m\right)^x\xrightarrow[n\to\infty\iff m\to\infty]{}e^x$$

I'll leave it to you to work out the case $x<0$ (hint: arithmetic of limits and "going" to denominators).

– DonAntonio, answered Apr 11 '13 at 23:23
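This substitution is easy to watch numerically (an editorial sketch; $x$ and $n$ are arbitrary sample values):

```python
import math

# Approach (1): (1 + x/n)^n = [(1 + 1/(n/x))^(n/x)]^x, and the inner
# bracket tends to e as n/x -> infinity (here f(n) = n/x).
x = 2.5
n = 10_000_000
inner = (1 + 1 / (n / x)) ** (n / x)   # -> e
assert abs(inner - math.e) < 1e-5
assert abs(inner ** x - math.exp(x)) < 1e-4
```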
I would like to cite here an awesome German mathematician, Konrad Königsberger. He writes in his book "Analysis I" as follows:

Fundamental lemma. For every sequence of complex numbers $w_n$ with limit $w$,
$$\lim_{n \to \infty} \Bigl(1 + \frac{w_n}{n}\Bigr)^n = \sum_{k=0}^\infty \frac{w^k}{k!}.$$

Proof. For every $\varepsilon > 0$ and every sufficiently large index $K$ we have the estimates
$$\sum_{k=K}^\infty \frac{(|w|+1)^k}{k!} < \frac{\varepsilon}{3} \quad\text{and}\quad |w_n| \le |w|+1.$$
Therefore, if $n \ge K$, then
$$\left|\Bigl(1 + \frac{w_n}{n}\Bigr)^n - \exp w \right| \le \sum_{k=0}^{K-1} \left|\binom{n}{k}\frac{w_n^k}{n^k} - \frac{w^k}{k!}\right| + \sum_{k=K}^n \binom{n}{k} \frac{|w_n|^k}{n^k} + \sum_{k=K}^\infty \frac{|w|^k}{k!}.$$
The third sum is smaller than $\varepsilon/3$ by our assumptions. We can bound the middle one using
$$\binom{n}{k} \frac{1}{n^k} = \frac{1}{k!} \prod_{i=1}^{k-1} \Bigl(1 - \frac{i}{n}\Bigr) \le \frac{1}{k!}.$$
Combining this with $|w_n| \le |w| + 1$,
$$\sum_{k=K}^n \binom{n}{k} \frac{|w_n|^k}{n^k} < \sum_{k=K}^n \frac{(|w|+1)^k}{k!} < \frac{\varepsilon}{3}.$$
Finally, the first sum converges to $0$ because $w_n \to w$ and $\binom{n}{k} n^{-k} \to \frac{1}{k!}$, so we can choose $N > K$ such that it is smaller than $\varepsilon/3$ whenever $n > N$.

Really brilliant.
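The lemma's extra strength — a varying sequence $w_n$, complex values allowed — can be probed numerically (an editorial sketch; the sequence $w_n = w + 1/n$ is an arbitrary choice):

```python
import cmath

# Fundamental-lemma check: w_n = w + 1/n converges to w = 1 + 2i,
# and (1 + w_n/n)^n approaches exp(w) even though w_n varies with n.
w = 1 + 2j
for n in (10**3, 10**6):
    w_n = w + 1 / n
    value = (1 + w_n / n) ** n
    print(n, abs(value - cmath.exp(w)))
```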
    • I examined the proof technique for $w_n = w$ (no sequence), then went full bore. Agree - brilliant! I can use it to show that the exp power series takes addition to multiplication, without getting into the weeds with rearrangements, absolute convergence/commutativity, etc. – CopyPasteIt, Jul 9 '17 at 23:10
    • +1. This appears to be the only complete and rigorous answer (so far) that considers all complex $w$, and it provides a reference (as the proposer requested). It actually has more than what was asked: a masterful exposition by K.K. – DanielWainfleet, Aug 30 '18 at 15:49
Firstly, let us give a definition of the exponential function, so that we know it has various properties:

$$\exp(x) := \sum_{n=0}^{\infty} \frac{x^n}{n!}$$

From this (as $\exp$ is a power series) we can prove that:

  • The exponential function has radius of convergence $\infty$, and is thus defined on all of $\mathbb{R}$.
  • As a power series is infinitely differentiable inside its circle of convergence, the exponential function is infinitely differentiable on all of $\mathbb{R}$.
  • The function is strictly increasing, and thus by the inverse function theorem (http://en.wikipedia.org/wiki/Inverse_function_theorem) we can define what we know as the "log" function.

Knowing all of this, here is hopefully a sufficiently rigorous proof (at least for positive $a$):

As $\log(x)$ is continuous and differentiable on $(0,\infty)$, the function $f(x)=\log(1+x)$ is continuous and differentiable on $[0,\frac{a}{n}]$, so by the mean value theorem there exists a $c \in [0,\frac{a}{n}]$ with

$$f'(c) = \frac{\log(1+\frac{a}{n}) - \log(1)}{\frac{a}{n} - 0}$$
$$\Longrightarrow \log\left[\left(1+\frac{a}{n}\right)^n\right] = \frac{a}{1+c}$$
$$\Longrightarrow \left(1+\frac{a}{n}\right)^n = \exp\left(\frac{a}{1+c}\right)$$

for some $c \in [0,\frac{a}{n}]$. Taking the limit as $n \to \infty$:

  • As $c \in [0,\frac{a}{n}]$ and $\frac{a}{n} \to 0$ as $n \to \infty$, the squeeze theorem gives $c \to 0$.
  • As $c \to 0$, $\frac{a}{1+c} \to a$ as $n \to \infty$.
  • As the exponential function is continuous on $\mathbb{R}$, the limit can pass inside the function, so as $\frac{a}{1+c} \to a$,

$$\exp\left(\frac{a}{1+c}\right) \to \exp(a)$$

as $n \to \infty$. Thus we can conclude that

$$\lim_{n \to \infty} \left(1+\frac{a}{n}\right)^n = e^a$$

(Of course, this ignores that one needs to prove $\exp(a)=e^a$, but that is hardly vital for this question.)
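The mean-value point $c$ can even be recovered numerically from the identity $n\log(1+a/n)=a/(1+c)$ and watched shrinking (an editorial sketch; $a$ is an arbitrary sample value):

```python
import math

# Solve a/(1 + c) = n * log(1 + a/n) for c; c should stay in [0, a/n]
# and shrink roughly like a/(2n).
a = 2.0
for n in (10, 1_000, 100_000):
    c = a / (n * math.log(1 + a / n)) - 1
    assert 0 <= c <= a / n
    print(n, c)
```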
    • If we're just about to define the exponential function (or at least show that it equals something), it seems to me the assumption of its continuity is highly suspicious... – DonAntonio, Apr 11 '13 at 23:36
    • True - although I can't see how this proof is anything more than showing that the various definitions of the exponential function are equivalent, and thus I would presume continuity would have been proved before trying to prove statements such as this one (for example, in our lectures we defined it in terms of a power series, from which continuity follows fairly straightforwardly). – Andrew D, Apr 11 '13 at 23:39
    • I agree with that, @AndrewD, but then perhaps mention some other definition from which continuity follows and use that... perhaps too long a detour for a beginner, but absolutely possible indeed. – DonAntonio, Apr 11 '13 at 23:42
    • @DonAntonio The log's continuity assumption is just fine, though. Since $\exp$ is its inverse, it is continuous. – Pedro Tamaroff, Apr 11 '13 at 23:50
    • Yeah, thankfully that is covered by the inverse function theorem (which I've now linked/discussed above, along with some other things). – Andrew D, Apr 11 '13 at 23:51
Aaah... The sweet sound of silent revenge downvotes... Always a pleasure!

Consider the functions $u$ and $v$ defined for every $|t|<\frac12$ by
$$u(t)=t-\log(1+t),\qquad v(t)=t-t^2-\log(1+t).$$
The derivative of $u$ is $u'(t)=\frac{t}{1+t}$, which has the sign of $t$, hence $u(t)\geqslant0$. The derivative of $v$ is $v'(t)=1-2t-\frac{1}{1+t}$, which has the sign of $(1+t)(1-2t)-1=-t(1+2t)$, which has the sign of $-t$ on the domain $|t|<\frac12$, hence $v(t)\leqslant0$. Thus:

For every $|t|<\frac12$,
$$t-t^2\leqslant\log(1+t)\leqslant t.$$

The function $z\mapsto\exp(nz)$ is nondecreasing, hence
$$\exp\left(nt-nt^2\right)\leqslant(1+t)^n\leqslant\exp\left(nt\right).$$
In particular, using this for $t=x/n$, one gets:

For every $|x|<\frac12 n$,
$$\exp\left(x-\frac{x^2}{n}\right)\leqslant\left(1+\frac{x}{n}\right)^n\leqslant \mathrm{e}^x.$$

Finally, $x^2/n\to 0$ when $n\to\infty$ and the exponential is continuous at $0$, hence we are done.

Facts/Definitions used:

  • The logarithm has derivative $t\mapsto 1/t$.
  • The exponential is the inverse of the logarithm.
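The two-sided bound is easy to probe numerically (an editorial sketch; the sample grid is arbitrary, restricted to $|x|<n/2$):

```python
import math

# Check exp(x - x^2/n) <= (1 + x/n)^n <= exp(x) on a small grid.
for x in (-1.5, 0.3, 4.0):
    for n in (10, 100, 1_000):
        if abs(x) < n / 2:
            middle = (1 + x / n) ** n
            assert math.exp(x - x * x / n) <= middle <= math.exp(x)
```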
    • We need to evangelize the use of $\leqslant$ and $\geqslant$ on MSE. – Pedro Tamaroff, Aug 10 '13 at 4:11
    • I used this in an application to lower bound $(1+x/n)^n$, thank you. – JP McCarthy, Aug 16 '16 at 11:55
    • Didier, I like this approach, so (+1). I was wondering if you've seen a way to establish the same lower bound for $\left(1+\frac{x}{n}\right)^n$ by using the limit definition of the exponential function and without appealing to calculus. The upper bound is trivial for $x>-n$. – Mark Viola, Jan 5 '17 at 19:39
    • @Dr.MV This reduces to showing $1+t\geqslant \exp(t-t^2)$, that is, $\frac{1}{1+t}\leqslant\exp(-t+t^2)$. What you call the trivial upper bound yields $\frac{1}{1+t}=1-\frac{t}{1+t}\leqslant\exp\left(-\frac{t}{1+t}\right)$, hence if $\frac{t}{1+t}\geqslant t-t^2$ we are done. This asks that $t\geqslant (t-t^2)(1+t)=t(1-t^2)$, hence indeed we are done for every $t$ in $(0,1)$. This fails for negative $t$, but similar arguments might work. – Did, Jan 8 '17 at 9:17
    • Upvote for the revenge thing. – Math_QED, Nov 21 '18 at 22:02
Another answer, assuming $x>0$:

Let $f(x)=\ln(x)$. Then we know that $f'(x)=1/x$. Also, by the definition of the derivative, we can write
$$
\begin{align}
f'(x)&=\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}\\
&=\lim_{h\to 0}\frac{\ln(x+h)-\ln(x)}{h}\\
&=\lim_{h\to 0}\frac{1}{h}\ln\frac{x+h}{x}\\
&=\lim_{h \to 0}\ln\left(\frac{x+h}{x}\right)^{\frac{1}{h}}\\
&=\lim_{h\to 0}\ln\left(1+\frac{h}{x}\right)^{\frac{1}{h}}
\end{align}
$$

Then, using the fact that $\ln(x)$ is a continuous function for all $x$ in its domain, we can exchange the $\lim$ and $\ln$:
$$
f'(x)=\ln\lim_{h\to 0}\left(1+\frac{h}{x}\right)^{\frac{1}{h}}
$$

Now, let $m=1/h$. Then $m\to\infty$ as $h\to 0^+$, and
$$
f'(x)=\ln\lim_{m\to\infty}\left(1+\frac{1}{mx}\right)^m
$$

Now, assuming $x>0$, define $n=mx^2$, so that $n\to\infty$ as $m\to\infty$. Then we can write
$$
f'(x)=\ln\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}
$$

From before we still have $f'(x)=1/x$, so
$$
\ln\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}=\frac{1}{x}
$$

Exponentiating both sides, we find
$$
\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}=e^{1/x}
$$

Finally, raising both sides to the power $x^2$, we find
$$
\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n=e^x
$$

EDIT: This idea actually works for all reals: if we use $f(x)=\ln|x|$ instead, then we eventually get
$$
e^x=\lim_{n\to\infty}\left|1+\frac{x}{n}\right|^{n}=\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n,
$$
where the last equality comes from the fact that $n$ eventually dominates $x$, so the absolute value becomes redundant.

This leaves the case $x=0$, but that is a trivial matter.
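The closing remark — that $n$ eventually dominates $x$, making the absolute value redundant — can be sanity-checked for a negative $x$ (an editorial sketch with arbitrary values):

```python
import math

# For x = -3, 1 + x/n is positive once n > 3, and (1 + x/n)^n -> e^x.
x = -3.0
n = 1_000_000
assert 1 + x / n > 0          # n dominates x here
assert abs((1 + x / n) ** n - math.exp(x)) < 1e-4
```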
$$(1+x/n)^n = \sum_{k=0}^n \binom{n}{k}\frac{x^k}{n^k}$$

Now just prove that $\binom{n}{k}\frac{x^k}{n^k}$ approaches $\frac{x^k}{k!}$ as $n$ approaches infinity, and you will have proven that your limit matches the Taylor series for $\exp(x)$.
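The termwise limit $\binom{n}{k}x^k/n^k \to x^k/k!$ is quick to check for one fixed $k$ (an editorial sketch with arbitrary values; justifying the swap of the two limits is a separate matter):

```python
import math

# binom(n, k) * x^k / n^k -> x^k / k! as n grows, for fixed k.
x, k = 2.0, 5
n = 10**6
term = math.comb(n, k) * x**k / n**k
assert abs(term - x**k / math.factorial(k)) < 1e-3
```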
      • This is not enough; there are infinitely many terms, so you need to show that you can swap two limits here. – Qiaochu Yuan, Apr 11 '13 at 23:17
      • What you want to do is work with $\limsup$ and $\liminf$ here, and show $e^x\leq\liminf$ and $e^x\geq\limsup$. – Pedro Tamaroff, Apr 11 '13 at 23:53
      • How would you show that you can swap the two limits? – amarney, Mar 26 '17 at 22:54
For any fixed value of $x$, define

$$f(u)= \frac{\ln(1+ux)}{u}$$

By L'Hopital's Rule,

$$\lim_{u\rightarrow0^+}f(u)=\lim_{u\rightarrow0^+}\frac{x/(1+ux)}{1}=x$$

Now exponentiate $f$:

$$e^{f(u)}=(1+ux)^{1/u}$$

By continuity of the exponential function, we have

$$\lim_{u\rightarrow0^+}(1+ux)^{1/u}=\lim_{u\rightarrow0^+}e^{f(u)}=e^{\lim_{u\rightarrow0^+}f(u)}=e^x$$

All these limits have been shown to exist for the (positive) real variable $u$ tending to $0$, hence they must exist, and be the same, for the sequence of reciprocals of integers, $u=1/n$, as $n$ tends to infinity, and the result follows:

$$\lim_{n\rightarrow\infty}\left(1+\frac{x}{n}\right)^n = e^x$$
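Watching $f(u)=\ln(1+ux)/u$ approach $x$ numerically (an editorial sketch; sample values arbitrary):

```python
import math

# f(u) = ln(1 + u*x) / u should approach x as u -> 0+.
x = 1.7
for u in (1e-2, 1e-5, 1e-8):
    f = math.log(1 + u * x) / u
    print(u, f)
```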
This is one of the ways in which it is defined; the equivalence of the definitions can be proved easily. If, for example, you take the exponential function to be the inverse of the logarithm:

$$\log\left(\lim_n\left(1 + \frac{x}{n}\right)^n\right) = \lim_n n \log\left(1 + \frac{x}{n}\right) = \lim_n n \cdot\left[\frac{x}{n} - \frac{x^2}{2n^2} + \dots\right] = x$$

EDIT: The logarithm is defined as usual: $\log x = \int_1^x \frac{dt}{t}$. The first identity follows from the continuity of the logarithm, the second is just an application of a property of the logarithm ($\log a^b = b \log a$), and for the third it suffices to have the Taylor expansion of $\log(1+x)$.
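The key step, $n\log(1+x/n)\to x$, is easy to watch numerically (an editorial sketch; sample values arbitrary):

```python
import math

# n * log(1 + x/n) -> x, which is the heart of this argument.
x = 0.8
approx = 0.0
for n in (10, 10_000, 10_000_000):
    approx = n * math.log(1 + x / n)
    print(n, approx)
```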
        • The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher, as we need power series and then also, apparently, differentiation. – DonAntonio, Apr 11 '13 at 23:40
        • The logarithm is defined as $\int_1^x \frac{dt}{t}$; therefore, if we have integration we can also have continuity and differentiation, I suppose. – user67133, Apr 11 '13 at 23:45
        • Perhaps so, and also perhaps mentioning this could clear things up a little, since we don't know, apparently, what the OP's background is. – DonAntonio, Apr 11 '13 at 23:47
        • I cannot but totally agree. Thank you for your suggestions; I am going to edit the post to make it clearer! – user67133, Apr 12 '13 at 0:07
There is at most one function $g$ on $\mathbb{R}$ such that
$$g'(x)=g(x)\ \text{for all}\ x\ \text{in}\ \mathbb{R}\quad\text{and}\quad g(0)=1\,.$$
If you let $f_n(x)=(1+x/n)^n$ and you can demonstrate that it converges compactly to some function $f$, you can demonstrate that $f'(x)=f(x)$ and $f(0)=1$. Likewise, if you take $f_n(x)=\sum_{k=0}^n x^k/k!$ and demonstrate that this sequence converges compactly, you can show that its limit satisfies the same conditions. Thus it doesn't matter which definition you use. The uniqueness criterion is what you should probably have in mind when you think of "the exponential".
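A crude numeric look at the uniqueness criterion: for large $n$, $f_n(x)=(1+x/n)^n$ approximately satisfies $g'=g$ and $g(0)=1$ (an editorial sketch using a central difference; the step size and sample point are arbitrary):

```python
# For large n, f_n(x) = (1 + x/n)^n roughly satisfies f' = f, f(0) = 1.
def f(x, n=1_000_000):
    return (1 + x / n) ** n

h = 1e-5
x0 = 0.7
deriv = (f(x0 + h) - f(x0 - h)) / (2 * h)   # central-difference derivative
assert f(0.0) == 1.0
assert abs(deriv - f(x0)) < 1e-3
```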






        share|cite|improve this answer









        $endgroup$












          protected by user99914 Nov 11 '17 at 4:03



          Thank you for your interest in this question.
          Because it has attracted low-quality or spam answers that had to be removed, posting an answer now requires 10 reputation on this site (the association bonus does not count).



          Would you like to answer one of these unanswered questions instead?














          9 Answers
          9






          active

          oldest

          votes








          9 Answers
          9






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          21












          $begingroup$

          From the very definition (one of many, I know):



          $$e:=lim_{ntoinfty}left(1+frac{1}{n}right)^n$$



          we can try the following, depending on what you have read so far in this subject:



          (1) Deduce that



          $$e=lim_{ntoinfty}left(1+frac{1}{f(n)}right)^{f(n)};,;;text{as long as};;f(n)xrightarrow[ntoinfty]{}infty$$



          and then from here ($,xneq0,$ , but this is only a light technicality)



          $$left(1+frac{x}{n}right)^n=left[;left(1+frac{1}{frac{n}{x}}right)^frac{n}{x};right]^xxrightarrow[ntoinfty]{}e^x$$



          2) For $,x>0,$ , substitute $,mx=n,$ . Note that $,ntoinftyimplies mtoinfty,$ , and



          $$left(1+frac{x}{n}right)^n=left(left(1+frac{1}{m}right)^mright)^xxrightarrow[ntoinftyiff mtoinfty]{}e^x$$



          I'll leave it to you to work out the case $,x<0,$ (hint: arithmetic of limits and "going" to denominators)






          share|cite|improve this answer









          $endgroup$


















            21












            $begingroup$

            From the very definition (one of many, I know):



            $$e:=lim_{ntoinfty}left(1+frac{1}{n}right)^n$$



            we can try the following, depending on what you have read so far in this subject:



            (1) Deduce that



            $$e=lim_{ntoinfty}left(1+frac{1}{f(n)}right)^{f(n)};,;;text{as long as};;f(n)xrightarrow[ntoinfty]{}infty$$



            and then from here ($,xneq0,$ , but this is only a light technicality)



            $$left(1+frac{x}{n}right)^n=left[;left(1+frac{1}{frac{n}{x}}right)^frac{n}{x};right]^xxrightarrow[ntoinfty]{}e^x$$



            2) For $,x>0,$ , substitute $,mx=n,$ . Note that $,ntoinftyimplies mtoinfty,$ , and



            $$left(1+frac{x}{n}right)^n=left(left(1+frac{1}{m}right)^mright)^xxrightarrow[ntoinftyiff mtoinfty]{}e^x$$



            I'll leave it to you to work out the case $,x<0,$ (hint: arithmetic of limits and "going" to denominators)






            share|cite|improve this answer









            $endgroup$
















              21












              21








              21





              $begingroup$

              From the very definition (one of many, I know):



              $$e:=lim_{ntoinfty}left(1+frac{1}{n}right)^n$$



              we can try the following, depending on what you have read so far in this subject:



              (1) Deduce that



              $$e=lim_{ntoinfty}left(1+frac{1}{f(n)}right)^{f(n)};,;;text{as long as};;f(n)xrightarrow[ntoinfty]{}infty$$



              and then from here ($,xneq0,$ , but this is only a light technicality)



              $$left(1+frac{x}{n}right)^n=left[;left(1+frac{1}{frac{n}{x}}right)^frac{n}{x};right]^xxrightarrow[ntoinfty]{}e^x$$



              2) For $,x>0,$ , substitute $,mx=n,$ . Note that $,ntoinftyimplies mtoinfty,$ , and



              $$left(1+frac{x}{n}right)^n=left(left(1+frac{1}{m}right)^mright)^xxrightarrow[ntoinftyiff mtoinfty]{}e^x$$



              I'll leave it to you to work out the case $,x<0,$ (hint: arithmetic of limits and "going" to denominators)






              share|cite|improve this answer









              $endgroup$



              From the very definition (one of many, I know):



              $$e:=lim_{ntoinfty}left(1+frac{1}{n}right)^n$$



              we can try the following, depending on what you have read so far in this subject:



              (1) Deduce that



              $$e=lim_{ntoinfty}left(1+frac{1}{f(n)}right)^{f(n)};,;;text{as long as};;f(n)xrightarrow[ntoinfty]{}infty$$



              and then from here ($,xneq0,$ , but this is only a light technicality)



              $$left(1+frac{x}{n}right)^n=left[;left(1+frac{1}{frac{n}{x}}right)^frac{n}{x};right]^xxrightarrow[ntoinfty]{}e^x$$



              2) For $,x>0,$ , substitute $,mx=n,$ . Note that $,ntoinftyimplies mtoinfty,$ , and



              $$left(1+frac{x}{n}right)^n=left(left(1+frac{1}{m}right)^mright)^xxrightarrow[ntoinftyiff mtoinfty]{}e^x$$



              I'll leave it to you to work out the case $,x<0,$ (hint: arithmetic of limits and "going" to denominators)







              share|cite|improve this answer












              share|cite|improve this answer



              share|cite|improve this answer










              answered Apr 11 '13 at 23:23









              DonAntonioDonAntonio

              177k1492225




              177k1492225























                  16












                  $begingroup$

                  I would like to cite here an awesome German mathematician, Konrad Königsberger. He writes in his book ,,Analysis I'' as follows:




                  16












                  $begingroup$

I would like to cite here an awesome German mathematician, Konrad Königsberger. He writes in his book "Analysis I" as follows:




Fundamentallemma. For every sequence of complex numbers $w_n$ with a limit $w$ it is true that $$\lim_{n \to \infty} \Bigl(1 + \frac{w_n}{n}\Bigr)^n = \sum_{k=0}^\infty \frac{w^k}{k!}.$$ Proof. For every $\varepsilon > 0$ and sufficiently large index $K$ we have the following estimates: $$\sum_{k=K}^\infty \frac{(|w|+1)^k}{k!} < \frac{\varepsilon}{3} \quad\mbox{and}\quad |w_n| \le |w|+1.$$ Therefore if $n \ge K$ then $$\left|\Bigl(1 + \frac{w_n}{n}\Bigr)^n - \exp w \right| \le \sum_{k=0}^{K-1} \left|{n \choose k}\frac{w_n^k}{n^k} - \frac{w^k}{k!}\right| + \sum_{k=K}^n {n \choose k} \frac{|w_n|^k}{n^k} + \sum_{k=K}^\infty \frac{|w|^k}{k!}.$$ The third sum is smaller than $\varepsilon/3$ by our assumptions. We can bound the middle one using $${n \choose k} \frac{1}{n^k} = \frac{1}{k!} \prod_{i=1}^{k-1} \Bigl(1 - \frac{i}{n}\Bigr) \le \frac{1}{k!}.$$ Combining this with $|w_n| \le |w|+1$, $$\sum_{k=K}^n {n \choose k} \frac{|w_n|^k}{n^k} < \sum_{k=K}^n \frac{(|w|+1)^k}{k!} < \frac{\varepsilon}{3}.$$ Finally, the first sum converges to $0$ because $w_n \to w$ and ${n \choose k} n^{-k} \to \frac{1}{k!}$, so we can choose $N > K$ such that it is smaller than $\varepsilon/3$ whenever $n > N$.




                  Really brilliant.
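The lemma is easy to sanity-check numerically. The sketch below is not part of Königsberger's text; the sequence $w_n = w + (1-i)/n$ is an arbitrary illustrative choice of a sequence converging to $w$. It compares $(1+w_n/n)^n$ with $\exp w$ for a complex $w$:

```python
import cmath

# Sanity check of the Fundamentallemma: pick a sequence w_n -> w
# (the choice below is arbitrary) and compare (1 + w_n/n)^n with exp(w).
w = 1 + 2j

def w_seq(n):
    # a sequence converging to w; the perturbation (1 - 1j)/n is illustrative
    return w + (1 - 1j) / n

for n in (10, 100, 10_000):
    approx = (1 + w_seq(n) / n) ** n
    print(n, abs(approx - cmath.exp(w)))  # gap shrinks as n grows
```

The printed gap shrinks roughly like $1/n$, consistent with the three-way split in the proof.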






                  share|cite|improve this answer











                  $endgroup$









                  • 2




                    $begingroup$
                    I examined proof technique for $w_n = w$ (no sequence) then went full bore. Agree - brilliant! I can use it to show the exp power series takes addition to multiplication. Did not have to get in the weeds with rearrangements, absolute convergence/ commutativity, etc.
                    $endgroup$
                    – CopyPasteIt
                    Jul 9 '17 at 23:10








                  • 1




                    $begingroup$
                    +1.... This appears to be the only complete and rigorous answer ( so far) that considers all complex $w$. And provides a reference (as the proposer requested). And actually has more than what was asked... A masterful exposition by K.K.
                    $endgroup$
                    – DanielWainfleet
                    Aug 30 '18 at 15:49
















                  edited Nov 26 '18 at 8:10









                  Aaron Tyrrell

                  155














                  answered Aug 20 '16 at 20:07









                  Santiago

                  1,027519















                  10












                  $begingroup$

Firstly, let us give a definition of the exponential function, so that we know it has various properties:

$$ \exp(x) := \sum_{n=0}^{\infty} \frac{x^n}{n!}$$

so that we can prove (as $\exp$ is a power series) that:

• The exponential function has radius of convergence $\infty$, and is thus defined on all of $\mathbb R$

• As a power series is infinitely differentiable inside its circle of convergence, the exponential function is infinitely differentiable on all of $\mathbb R$

• We can then prove that the function is strictly increasing, and thus by the inverse function theorem (http://en.wikipedia.org/wiki/Inverse_function_theorem) we can define what we know as the "log" function

Knowing all of this, here is hopefully a sufficiently rigorous proof (at least for positive $a$):

As $\log(x)$ is continuous and differentiable on $(0,\infty)$, $f(x) = \log(1+x)$ is continuous and differentiable on $[0,\frac{a}{n}]$, so by the mean value theorem there exists a $c \in [0,\frac{a}{n}]$ with

$$f'(c) = \frac {\log(1+ \frac{a}{n} ) - \log(1)} {\frac {a}{n} - 0 } $$
$$ \Longrightarrow \log\bigl[(1+\tfrac{a}{n})^n\bigr] = \frac{a}{1+c}$$
$$ \Longrightarrow (1+\tfrac{a}{n})^n = \exp\Bigl(\frac{a}{1+c}\Bigr)$$

for some $c \in [0,\frac{a}{n}]$. Taking the limit as $n \rightarrow \infty$, we get that:

• As $c \in [0,\frac{a}{n}]$ and $\frac{a}{n} \rightarrow 0$ as $n \rightarrow \infty$, by the squeeze theorem $c \rightarrow 0$ as $n \rightarrow \infty$

• As $c \rightarrow 0$ as $n \rightarrow \infty$, $\frac{a}{1+c} \rightarrow a$ as $n \rightarrow \infty$

• As the exponential function is continuous on $\mathbb R$, the limit can pass inside the function, so since $\frac{a}{1+c} \rightarrow a$ as $n \rightarrow \infty$,

$$ \exp\Bigl(\frac{a}{1+c}\Bigr) \rightarrow \exp(a) $$
as $n \rightarrow \infty$. Thus we can conclude that

$$ \lim_{n \to \infty} \Bigl(1+\frac{a}{n}\Bigr)^n = e^a$$

(Of course, this ignores that one needs to prove that $\exp(a)=e^a$, but that is hardly vital for this question.)
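The mean-value step above can be checked numerically: solving $n\log(1+a/n) = a/(1+c)$ for $c$ must give a point inside $[0, a/n]$, and plugging it back must reproduce $(1+a/n)^n$. A minimal sketch, with illustrative values of $a$ and $n$:

```python
import math

# Check the mean-value step: n*log(1 + a/n) = a/(1 + c) pins down the
# intermediate point c, which must lie in [0, a/n], and exp(a/(1+c))
# must reproduce (1 + a/n)^n up to floating-point error.
# (a and n are illustrative values.)
a, n = 2.0, 50
c = a / (n * math.log(1 + a / n)) - 1  # solve n*log(1+a/n) = a/(1+c) for c
assert 0 <= c <= a / n                 # c lies in the MVT interval
lhs = (1 + a / n) ** n
rhs = math.exp(a / (1 + c))
print(c, lhs, rhs)                     # lhs and rhs agree
```

As $n$ grows, $c$ is squeezed toward $0$ and both sides approach $e^a$, exactly as the proof argues.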






                  share|cite|improve this answer











                  $endgroup$













                  • $begingroup$
                    If we're just about to define the exponential function (or at least show that it equals something), it seems to me the assumption of its continuity is highly suspicious...
                    $endgroup$
                    – DonAntonio
                    Apr 11 '13 at 23:36










                  • $begingroup$
                    This is true - although I can't see how this proof is anything more than showing that the various definitions of the exponential function are equivalent, and thus I would presume continuity would have been proved before trying to prove statements such as this one (for example, in our lectures we defined it in terms of a power series, which means we can prove it is continuous fairly straightforwardly)
                    $endgroup$
                    – Andrew D
                    Apr 11 '13 at 23:39










                  • $begingroup$
                    I agree with that, @Andrew D, but then perhaps mentioning some other definition from which continuity follows and then use that in it...perhaps too long a detour for a beginner, but absolutely possible indeed.
                    $endgroup$
                    – DonAntonio
                    Apr 11 '13 at 23:42












                  • $begingroup$
                    @DonAntonio The log's continuity assumption is just fine, though. Since $\exp$ is its inverse, it is continuous.
                    $endgroup$
                    – Pedro Tamaroff
                    Apr 11 '13 at 23:50










                  • $begingroup$
                    Yeah, thankfully that is covered by the inverse function theorem (which I've now linked/discussed above, along with some other things)
                    $endgroup$
                    – Andrew D
                    Apr 11 '13 at 23:51
















                  edited Apr 11 '13 at 23:56

























                  answered Apr 11 '13 at 23:29









                  Andrew D

                  1,776931





















                  5












                  $begingroup$


                  Aaah... The sweet sound of silent revenge downvotes... Always a pleasure!




                  Consider the functions $u$ and $v$ defined for every $|t|\lt\frac12$ by
                  $$
                  u(t)=t-\log(1+t),\qquad v(t)=t-t^2-\log(1+t).
                  $$
                  The derivative of $u$ is $u'(t)=\frac{t}{1+t}$, which has the sign of $t$, hence $u(t)\geqslant0$. The derivative of $v$ is $v'(t)=1-2t-\frac{1}{1+t}$, which has the sign of $(1+t)(1-2t)-1=-t(1+2t)$, which has the sign of $-t$ on the domain $|t|\lt\frac12$, hence $v(t)\leqslant0$.
                  Thus:

                  For every $|t|\lt\frac12$,
                  $$
                  t-t^2\leqslant\log (1+t)\leqslant t.
                  $$

                  The function $z\mapsto\exp(nz)$ is nondecreasing on the same domain hence
                  $$
                  \exp\left(nt-nt^2\right)\leqslant(1+t)^n\leqslant\exp\left(nt\right).
                  $$
                  In particular, using this for $t=x/n$, one gets:

                  For every $|x|<\frac12n$,
                  $$
                  \exp\left(x-\frac{x^2}{n}\right)\leqslant\left(1+\frac{x}n\right)^n\leqslant\mathrm e^x.
                  $$

                  Finally, $x^2/n\to 0$ when $n\to\infty$ and the exponential is continuous at $0$, hence we are done.

                  Facts/Definitions used:

                  • The logarithm has derivative $t\mapsto1/t$.

                  • The exponential is the inverse of the logarithm.






                  share|cite|improve this answer











                  $endgroup$
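The two-sided bound above is easy to sanity-check numerically. A minimal sketch in plain Python (the helper name `bounds` is mine, not from the answer):

```python
import math

# Check the answer's squeeze for |x| < n/2:
#   exp(x - x^2/n) <= (1 + x/n)^n <= exp(x),
# and that the middle term approaches exp(x) as n grows.
def bounds(x, n):
    return math.exp(x - x**2 / n), (1 + x / n) ** n, math.exp(x)

for x in (-2.0, 0.5, 3.0):
    for n in (10, 100, 10_000):
        lo, mid, hi = bounds(x, n)
        assert lo <= mid <= hi, (x, n)
    # with n large, (1 + x/n)^n is close to e^x
    assert abs(bounds(x, 10**6)[1] - math.exp(x)) < 1e-3
```

This checks only a handful of points, of course; the proof itself is the inequality argument above.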













                  • We need to evangelize the use of $\leqslant$ and $\geqslant$ in MSE.
                    – Pedro Tamaroff
                    Aug 10 '13 at 4:11

                  • I used this in an application to lower bound $(1+x/n)^n$, thank you.
                    – JP McCarthy
                    Aug 16 '16 at 11:55

                  • Didier, I like this approach. So (+1). I was wondering if you've seen a way to establish the same lower bound for $\left(1+\frac{x}{n}\right)^n$ by using the limit definition of the exponential function and without appealing to calculus. The upper bound is trivial for $x>-n$.
                    – Mark Viola
                    Jan 5 '17 at 19:39

                  • @Dr.MV This reduces to showing $1+t\geqslant\exp(t-t^2)$, that is, $\frac{1}{1+t}\leqslant\exp(-t+t^2)$. What you call the trivial upper bound yields $\frac{1}{1+t}=1-\frac{t}{1+t}\leqslant\exp\left(-\frac{t}{1+t}\right)$, hence if $\frac{t}{1+t}\geqslant t-t^2$, we are done. This is asking that $t\geqslant(t-t^2)(1+t)=t(1-t^2)$, hence indeed, we are done for every $t$ in $(0,1)$. This fails for $t$ negative but similar arguments might work.
                    – Did
                    Jan 8 '17 at 9:17

                  • Upvote for the revenge thing.
                    – Math_QED
                    Nov 21 '18 at 22:02
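The inequality $1+t\geqslant\exp(t-t^2)$ for $t$ in $(0,1)$, claimed in the comment above, can be spot-checked numerically (illustration only, not a proof; the helper name `claim_holds` is mine):

```python
import math

# Spot-check the comment's claim: 1 + t >= exp(t - t^2) for t in (0, 1).
def claim_holds(t):
    return 1 + t >= math.exp(t - t**2)

assert all(claim_holds(i / 100) for i in range(1, 100))
```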
















                  answered Apr 12 '13 at 13:48 by Did (edited Aug 30 '18 at 7:11)












                  Another answer, assuming $x>0$:

                  Let $f(x)=\ln(x)$. Then we know that $f'(x)=1/x$. Also, by the definition of the derivative, we can write
                  $$
                  \begin{align}
                  f'(x)&=\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}\\
                  &=\lim_{h\to 0}\frac{\ln(x+h)-\ln(x)}{h}\\
                  &=\lim_{h\to 0}\frac{1}{h}\ln\frac{x+h}{x}\\
                  &=\lim_{h\to 0}\ln\left(\frac{x+h}{x}\right)^{\frac{1}{h}}\\
                  &=\lim_{h\to 0}\ln\left(1+\frac{h}{x}\right)^{\frac{1}{h}}
                  \end{align}
                  $$

                  Then, using the fact that $\ln(x)$ is continuous for all $x$ in its domain, we can exchange the $\lim$ and $\ln$:
                  $$
                  f'(x)=\ln\lim_{h\to 0}\left(1+\frac{h}{x}\right)^{\frac{1}{h}}
                  $$

                  Now, let $m=1/h$. Then $m\to\infty$ as $h\to 0^+$, and
                  $$
                  f'(x)=\ln\lim_{m\to\infty}\left(1+\frac{1}{mx}\right)^m
                  $$

                  Now, assuming $x>0$, define $n=mx^2$, so that $n\to\infty$ as $m\to\infty$. Then we can write
                  $$
                  f'(x)=\ln\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}
                  $$

                  and from before, we still have $f'(x)=1/x$, so
                  $$
                  \ln\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}=\frac{1}{x}
                  $$

                  Exponentiating both sides, we find
                  $$
                  \lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}=e^{1/x}
                  $$

                  Finally, raising both sides to the power $x^2$, we find
                  $$
                  \lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n=e^x
                  $$

                  EDIT: This idea actually works for all reals — if we use $f(x)=\ln|x|$ instead, then we eventually get
                  $$
                  e^x=\lim_{n\to\infty}\left|1+\frac{x}{n}\right|^{n}=\lim_{n\to\infty}\left(1+\frac{x}{n}\right)^n,
                  $$
                  where the last equality comes from the fact that $n$ eventually dominates $x$, so the absolute value becomes redundant.

                  This leaves the case $x=0$, but that is trivial.
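The intermediate identity $\lim_{n\to\infty}\left[\left(1+\frac{x}{n}\right)^n\right]^{1/x^2}=e^{1/x}$ for $x>0$, which the answer then raises to the power $x^2$, can be illustrated numerically (a sketch with plain floats; the helper name `intermediate` is mine):

```python
import math

# Numerically approximate [(1 + x/n)^n]^(1/x^2) for large n and compare
# it with e^(1/x), as in the intermediate step of the answer (x > 0).
def intermediate(x, n):
    return ((1 + x / n) ** n) ** (1 / x**2)

for x in (0.5, 1.0, 2.0):
    assert abs(intermediate(x, 10**7) - math.exp(1 / x)) < 1e-3
```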


















                      answered Sep 25 '15 at 1:22 by Mike Bell (edited Jan 6 at 21:15 by Alexander Sanchez)



































                          $$(1+x/n)^n = \sum_{k=0}^n \binom{n}{k}\frac{x^k}{n^k}$$

                          Now just prove that $\binom{n}{k}\frac{x^k}{n^k}$ approaches $\frac{x^k}{k!}$ as $n$ approaches infinity, and you will have proven that your limit matches the Taylor series for $\exp(x)$.
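The term-by-term convergence can be spot-checked numerically (a sketch; the helper name `binom_term` is mine, and `math.comb` requires Python 3.8+):

```python
import math

# The k-th binomial term C(n,k) * x^k / n^k should tend to x^k / k!
# as n grows, matching the Taylor series of exp(x) term by term.
def binom_term(n, k, x):
    return math.comb(n, k) * x**k / n**k

x = 1.5
for k in range(6):
    taylor = x**k / math.factorial(k)
    assert abs(binom_term(10**6, k, x) - taylor) < 1e-4 * max(1.0, taylor)
```

As the comments below note, this per-term convergence alone does not finish the proof: one still has to justify swapping the two limits.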









                          • This is not enough; there are infinitely many terms, so you need to show that you can swap two limits here.
                            – Qiaochu Yuan
                            Apr 11 '13 at 23:17

                          • What you want to do is work with $\limsup$ and $\liminf$ here, and show $e^x\leq\liminf$ and $e^x\geq\limsup$.
                            – Pedro Tamaroff
                            Apr 11 '13 at 23:53

                          • How would you show that you can swap the two limits?
                            – amarney
                            Mar 26 '17 at 22:54
















                          answered Apr 11 '13 at 23:07 by Three (edited Apr 12 '13 at 0:12)








                          • 5




                            $begingroup$
                            This is not enough; there are infinitely many terms, so you need to show that you can swap two limits here.
                            $endgroup$
                            – Qiaochu Yuan
                            Apr 11 '13 at 23:17






                          • 1




                            $begingroup$
                            What you want to do is work with $limsup$ and $liminf$ here, and show $e^xleqliminf $ and $e^xgeq limsup$
                            $endgroup$
                            – Pedro Tamaroff
                            Apr 11 '13 at 23:53










                          • $begingroup$
                            How would you show that you can swap the two limits?
                            $endgroup$
                            – amarney
                            Mar 26 '17 at 22:54

























                          1












                          $begingroup$

For any fixed value of $x$, define

$$f(u)= {\ln(1+ux)\over u}$$

By L'Hopital's Rule,

$$\lim_{u\rightarrow0^+}f(u)=\lim_{u\rightarrow0^+}{x/(1+ux)\over1}=x$$

Now exponentiate $f$:

$$e^{f(u)}=(1+ux)^{1/u}$$

By continuity of the exponential function, we have

$$\lim_{u\rightarrow0^+}(1+ux)^{1/u}=\lim_{u\rightarrow0^+}e^{f(u)}=e^{\lim_{u\rightarrow0^+}f(u)}=e^x$$

All these limits have been shown to exist for the (positive) real variable $u$ tending to $0$, hence they must exist, and be the same, for the sequence of reciprocals of integers, $u=1/n$, as $n$ tends to infinity, and the result follows:

$$\lim_{n\rightarrow\infty}\left(1+{x\over n}\right)^n = e^x$$
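The continuous limit above can be checked numerically (a quick sketch; $x = 2$ is chosen arbitrarily): $(1+ux)^{1/u}$ should approach $e^x$ as $u\to 0^+$.

```python
from math import exp

# Check that (1 + u*x)^(1/u) approaches e^x as u -> 0+.
x = 2.0
for u in (1e-1, 1e-3, 1e-6):
    val = (1 + u * x) ** (1 / u)
    print(u, val, exp(x))
```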






                          share|cite|improve this answer









                          $endgroup$




























answered Aug 10 '13 at 3:23 – Barry Cipra























                                  0












                                  $begingroup$

This is one of the ways in which it is defined. The equivalence of the definitions can be proved easily, I guess.
If for example you take the exponential function to be the inverse of the logarithm:

$\log(\lim_n(1 + \frac{x}{n})^n) = \lim_n n \log(1 + \frac{x}{n}) = \lim_n n \cdot[\frac{x}{n} - \frac{x^2}{2n^2} + \dots] = x$

EDIT: The logarithm is defined as usual: $\log x = \int_1^x \frac{dt}{t}$. The first identity follows from the continuity of the logarithm, the second is just an application of one of the properties of the logarithm ($\log a^b = b \log a$), while to obtain the third it suffices to have the Taylor expansion of $\log(1+x)$.
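A quick numerical sketch of the key identity in the chain above ($x = 2$ chosen arbitrarily): $n\log(1 + x/n)$ should approach $x$.

```python
from math import log

# Check that n * log(1 + x/n) approaches x as n grows.
x = 2.0
for n in (10, 1_000, 100_000):
    approx = n * log(1 + x / n)
    print(n, approx)
```

The leading error is $x^2/(2n)$, matching the Taylor expansion cited in the answer.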






                                  share|cite|improve this answer











                                  $endgroup$













                                  • $begingroup$
                                    The very first equality requires, me believes, a justification that I cannot see as very easy unless we already assume quite a bit (say, continuity...). After that things get even tougher as we need power series and then also, apparently, differentiation.
                                    $endgroup$
                                    – DonAntonio
                                    Apr 11 '13 at 23:40






                                  • 2




                                    $begingroup$
The logarithm is defined as $\int_1^x \frac{dt}{t}$, therefore, if we have integration we can also have continuity and differentiation, I suppose.
                                    $endgroup$
                                    – user67133
                                    Apr 11 '13 at 23:45












                                  • $begingroup$
                                    Perhaps so and also perhaps mentioning this could clear things out a little, since we don't know, apparently, what the OP's background is.
                                    $endgroup$
                                    – DonAntonio
                                    Apr 11 '13 at 23:47






                                  • 1




                                    $begingroup$
                                    I cannot but totally agree. Thank you for your suggestions, I am going to edit the post to make it clearer!
                                    $endgroup$
                                    – user67133
                                    Apr 12 '13 at 0:07
























edited Apr 12 '13 at 0:16, answered Apr 11 '13 at 23:08 – user67133






























                                  0












                                  $begingroup$

There is at most one function $g$ on $\mathbb{R}$ such that
$$g'(x)=g(x)\text{ for all } x\text{ in }\mathbb{R}\quad\text{and}\quad g(0)=1\,.$$
If you let $f_n(x)=(1+x/n)^n$ and you can demonstrate that it converges compactly to some function $f$, you can demonstrate that $f'(x)=f(x)$ and $f(0)=1$. Likewise, if you take $f_n(x)=\sum_{k=0}^n x^k/k!$ and demonstrate this sequence converges compactly, you can show that this limit satisfies the same conditions. Thus it doesn't matter what your definition is. The uniqueness criterion is what you should probably have in mind when you think of "the exponential".
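As an illustrative sketch (the value of $x$ and the truncation points are chosen arbitrarily), the two candidate sequences can be compared numerically against each other:

```python
from math import exp, factorial

x = 1.5
# Compound-interest sequence at a large n...
compound = (1 + x / 100_000) ** 100_000
# ...and a partial Taylor sum with enough terms to have converged.
taylor = sum(x**k / factorial(k) for k in range(30))
print(compound, taylor, exp(x))
```

Both sequences approach the same value, as the uniqueness argument predicts.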






                                  share|cite|improve this answer









                                  $endgroup$




























answered Dec 15 '17 at 15:21 – Robert Wolfe

















                                          protected by user99914 Nov 11 '17 at 4:03


