$X_1,\dots,X_n$ i.i.d. discrete random variables: local limit theorem



























Let $X_1,\dots,X_n$ be i.i.d. discrete random variables taking values in $\mathbb{Z}$, with non-trivial finite support. Let $S_n = X_1+\dots+X_n$.



Prove the existence of $0 < C_1 < C_2 < \infty$ such that for all $n \geq 1$:



$$ C_1/\sqrt{n} \leq \sup_{k \in \mathbb{Z}} \mathbb{P}(S_n = k) \leq C_2/\sqrt{n} $$
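A quick numerical sanity check of the claimed $1/\sqrt{n}$ rate, using a hypothetical example distribution (values $-1,0,1$ with probabilities $0.2, 0.5, 0.3$ — any non-trivial finite integer support would do) and computing the exact PMF of $S_n$ by repeated convolution:

```python
import numpy as np

# Hypothetical example distribution: X takes values -1, 0, 1 with
# probabilities 0.2, 0.5, 0.3 (any non-trivial finite support works).
pmf = np.array([0.2, 0.5, 0.3])

def max_point_mass(pmf, n):
    """Exact PMF of S_n via repeated convolution; returns sup_k P(S_n = k)."""
    out = pmf.copy()
    for _ in range(n - 1):
        out = np.convolve(out, pmf)
    return out.max()

# sqrt(n) * sup_k P(S_n = k) should stabilize near a positive constant,
# consistent with C_1/sqrt(n) <= sup_k P(S_n = k) <= C_2/sqrt(n).
for n in (10, 100, 1000):
    print(n, np.sqrt(n) * max_point_mass(pmf, n))
```

The printed products $\sqrt{n}\cdot\sup_k \mathbb{P}(S_n=k)$ level off near a constant, as the claimed two-sided bound predicts.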




I know that this can be proved using the central limit theorem, but we haven't seen that theorem in class, so this exercise should be solvable without the CLT.



So far, here are my thoughts:



The $\sqrt{n}$ comes from the fact that the standard deviation of $S_n$ is $\theta\sqrt{n}$.
Hence one possible strategy is to study the random variable
$\frac{S_n/n - \mu}{\theta/\sqrt{n}}$. Yet from there I didn't manage to find an upper bound on the probability.



For example, using Markov's inequality I get that:



$$\mathbb{P}\left( \left| \frac{S_n}{n} - \mu \right| \geq \theta/\sqrt{n} \right) \leq \frac{1}{n}$$



The problem is that this doesn't help, since I can't say anything when $\left| S_n/n - \mu \right| \leq \theta/\sqrt{n}$.
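For the lower bound, here is one way I could imagine combining Chebyshev with a pigeonhole count (only a sketch, assuming finite variance $\theta^2$ and mean $\mu$). Chebyshev applied to $S_n$ gives

$$\mathbb{P}\left(|S_n - n\mu| \leq \sqrt{2}\,\theta\sqrt{n}\right) \geq 1 - \frac{n\theta^2}{2\theta^2 n} = \frac{1}{2},$$

and since $S_n$ is integer valued, this interval contains at most $2\sqrt{2}\,\theta\sqrt{n} + 1$ integers, so some $k$ must satisfy

$$\mathbb{P}(S_n = k) \geq \frac{1/2}{2\sqrt{2}\,\theta\sqrt{n} + 1} \geq \frac{C_1}{\sqrt{n}}.$$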



Thank you!



























This question has an open bounty worth +100 reputation from auhasard, ending in 4 days: this question has not received enough attention.
















  • You can apply your Markov/Chebyshev inequality if you know the random variables have a finite variance (else you might need some type of truncation argument). The inequality you give is useful since there is an order-$\sqrt{n}$ region about the mean in which $S_n$ lies with probability at least $1/2$. And of course $S_n$ must always be an integer.
    – Michael
    Jan 3 at 6:02










  • @Michael Yes, you are right about the variance, so it's even harder to prove in the general case (when we don't necessarily have a variance). Yet I still don't see how to prove it when we do have a variance.
    – Jebfiffkkfnfolzbd
    Jan 3 at 17:01










  • You can just try to get one of the bounds at first (either the lower or upper). For example, what happens when most of the probability is concentrated on only a few of the options? Say, if probability $p$ is spread over $k$ options, there must be some option with at least...
    – Michael
    Jan 3 at 18:06












  • I don't think this is true without extra conditions such as finite variance. For intuition with continuous variables, if $\{X_i\}$ are iid Cauchy random variables with PDF $f(x)$, then the average $S_n/n$ is Cauchy with the same PDF. So \begin{align}P[S_n \in [a,a+1]] &= P[S_n/n \in [a/n, a/n+1/n]]\\ &= \int_{a/n}^{a/n+1/n} f(x)\,dx \\ &\leq \frac{f_{max}}{n} = O(1/n)\end{align} Likely a "discrete Cauchy-type" random variable would have similar $O(1/n)$ behavior.
    – Michael
    2 days ago








  • @Jebfiffkkfnfolzbd : $X(\Omega)$ is the set of all real numbers that $X$ can take, and it is always a subset of $\mathbb{R}$ (for any random variable $X:\Omega \rightarrow \mathbb{R}$). By "finite support" I think you mean that $X$ can only take values in some finite interval $[a,b]$. This means that it indeed has a finite mean and variance. Since we also know that $X$ is integer valued, it means that $X$ can take only a finite number of (integer) values.
    – Michael
    yesterday


















probability probability-theory probability-limit-theorems






asked Jan 2 at 14:45 by Jebfiffkkfnfolzbd, edited 2 days ago





