Geometric distribution of independent t random variables and their limit

Let $X_1,X_2,\dots,X_t$ be independent random variables with $X_i \sim \mathrm{Geo}\!\left(\tfrac{1}{2}\right)$ for every $1 \le i \le t$. Show that there is a constant $c > 0$ such that for every $a > 3$ and every $t \ge 1$:



$$ P(X_1 + X_2 + \dots + X_t \ge a \cdot t) \le 2^{-c \cdot a \cdot t} $$



State the constant explicitly and prove that it works.



So far I have tried using Chernoff bounds, but I am unable to complete the calculation because of the $\delta$. The expected value can be found from the geometric distribution of the random variables, and since they are independent we can also compute the variance, but I don't see a use for it at the moment.
I have also tried reducing the expression to the central limit theorem, but that was a dead end as well. Any help would be much appreciated.
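
As a quick sanity check (not a proof), I also ran a short simulation. The constant $c = 0.15$ below is only a guess, and I am assuming the convention that $\mathrm{Geo}(\tfrac12)$ counts failures before the first success, i.e. has support $\{0,1,2,\dots\}$:

    import random

    def geo_half():
        # number of failures before the first success with p = 1/2 (support 0, 1, 2, ...)
        k = 0
        while random.random() >= 0.5:
            k += 1
        return k

    def tail_estimate(t, a, trials=200000):
        # Monte Carlo estimate of P(X_1 + ... + X_t >= a*t)
        hits = sum(1 for _ in range(trials)
                   if sum(geo_half() for _ in range(t)) >= a * t)
        return hits / trials

    c = 0.15  # guessed constant, for comparison only
    for t in (1, 2, 5):
        for a in (3.5, 4.0):
            print(t, a, tail_estimate(t, a), 2 ** (-c * a * t))

The estimated tail probabilities stay below $2^{-c\cdot a\cdot t}$ in these runs, so the bound looks plausible, but I cannot prove it.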

Tags: probability






asked 2 days ago by LonelyStudent, edited 2 days ago


  • Are you sure you don't mean $2^{-cat}$? $2^{cat}$ is bigger than $1$, so the probability is trivially less than $2^{cat}$.
    – Mike Earnest
    2 days ago










  • Yes, you are right; I edited.
    – LonelyStudent
    2 days ago

1 Answer

$E[e^{sX_1}]=\sum_k (1/2)^{k+1}e^{sk}=1/(2-e^s).$ Therefore, the mgf for $X_1+\dots+X_t$ is $(2-e^{s})^{-t}$, valid for all $s<\ln 2$.
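
(Spelling out the geometric-series step, under the convention $P(X_1=k)=(1/2)^{k+1}$ for $k=0,1,2,\dots$, which is what the sum above uses:
$$
E[e^{sX_1}]=\sum_{k=0}^{\infty}\left(\tfrac12\right)^{k+1}e^{sk}
=\tfrac12\sum_{k=0}^{\infty}\left(\tfrac{e^{s}}{2}\right)^{k}
=\frac{1/2}{1-e^{s}/2}
=\frac{1}{2-e^{s}},
$$
which converges exactly when $e^{s}/2<1$, i.e. $s<\ln 2$.)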



For any $0<s<\ln 2$,
$$
P(X_1+\dots+X_t\ge at)=P\!\left(e^{s(X_1+\dots+X_t)}\ge e^{sat}\right)\le\frac{E[e^{s(X_1+\dots+X_t)}]}{e^{sat}}= e^{-sat}(2-e^s)^{-t}.
$$

Now choose $s$ so that $2-e^s=e^{-1/2}$, namely $s=\ln(2-e^{-1/2})\approx 0.33$. Then since $sa \ge 0.33\cdot 3\ge 3/4$, we have $\frac23 sa\ge \frac12$, so
$$
P(X_1+\dots+X_t\ge at)\le e^{-sat}\cdot e^{t/2}=e^{-\left(sa-\frac12\right)t}\le e^{-\left(sa-\frac23 sa\right)t}=2^{-(\log_2 e)\,sat/3}.
$$
Hence the claimed bound holds with $c=(\log_2 e)\,s/3\approx 0.159$; in particular, $c=0.15$ works.
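
A quick numerical check of the constants in this argument (just arithmetic, not part of the proof):

    import math

    # s is chosen so that 2 - e^s = e^{-1/2}
    s = math.log(2 - math.exp(-0.5))
    print(s)                      # about 0.3318

    # for a > 3 we get s*a > 3*s > 3/4, hence (2/3)*s*a >= 1/2
    print(3 * s)                  # about 0.9955

    # the resulting constant in the bound 2^{-c*a*t}
    c = math.log2(math.e) * s / 3
    print(c)                      # about 0.1596, so e.g. c = 0.15 works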






answered 2 days ago by Mike Earnest, edited 16 hours ago