Probability distribution for an infinite number of fair coin flips?














Given an infinite number of unbiased coin flips, what is the probability that some portion $x\in(0,1)$ of all coin flips will be heads?





Edit: Alright, I've added what I've done so far. Please keep in mind that I did not know what I was doing when I started this problem, and I still don't. If I'm off-base here, feel free to say so. And please don't just downvote and tell me that I'm wrong. This is not Reddit. If you're going to respond to the question, please do so with the intention of either requesting clarification or providing useful information.



What I've done so far



The probability that some number $k$ of $n$ total coin tosses will land on heads is given by the binomial distribution:
$$p_{n,k}=\frac{1}{2^n}\frac{n!}{k!(n-k)!}\qquad 0\leq k\leq n$$
This distribution can be modified so that the domain lies in the interval $[0,1]$, with $k$ indicating the portion of coins that land on heads:
$$p_{n,k}=\frac{1}{2^n}\frac{n!}{(nk)!(n-nk)!}\qquad 0\leq k\leq 1$$
For $k=\frac{a}{n}$ with integer $0\leq a\leq n$, $p_{n,k}$ is the probability that $k\cdot n$ coins are heads. Note that the sum of all $p_{n,k}$ is always $1$.
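As a sanity check, those binomial weights can be computed directly; a minimal sketch in Python (the helper name `p` is mine):

```python
from math import comb

def p(n, k):
    """Probability of exactly k heads in n fair coin tosses."""
    return comb(n, k) / 2**n

# The weights over k = 0..n always sum to 1:
n = 10
total = sum(p(n, k) for k in range(n + 1))
print(total)  # 1.0 (dyadic rationals, so exact in floating point)
```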



Now, for an infinite number of coin tosses, $p$ needs to be continuous on $(0,1)$. The extension of $p$ to the reals which satisfies this requirement is obtained via continuation of the above discrete probability distribution, given by:



$$p_n(x)=\frac{1}{2^n}\frac{n!}{\Gamma(nx+1)\,\Gamma\left(n-nx+1\right)}=\frac{n!}{(-2)^n\pi}\frac{\Gamma(nx-n)}{\Gamma(nx+1)}\sin\left(n\pi x\right)$$



Allowing that $p_n(c)=\lim_{x\to c}p_n(x)$*, $p_n$ is continuous on $(0,1)$ for all $n$. It seems reasonable to assume that the desired probability distribution would be obtained from the limit $n\to\infty$ of $p_n(x)$. However, this may not be the case, not least because the limit seems impossible to evaluate algebraically. Furthermore, the limit seems to converge to $0$ for all $x$.** [plot of $p_n(x)$ for increasing $n$ omitted]
Since there is no way to evaluate the limit algebraically, I decided to use a very contrived, non-rigorous, non-standard approach:
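That apparent pointwise decay to $0$ can at least be probed numerically. A minimal sketch using Python's `math.gamma` (the helper name `p_n` is mine), checking that the continuation agrees with the binomial weights on lattice points and shrinks at $x=0.5$ as $n$ grows:

```python
from math import comb, factorial, gamma

def p_n(n, x):
    """Gamma-function continuation of the scaled binomial weights."""
    return factorial(n) / (2**n * gamma(n*x + 1) * gamma(n - n*x + 1))

# Agrees with the binomial pmf at lattice points x = k/n:
assert abs(p_n(10, 0.3) - comb(10, 3) / 2**10) < 1e-12

# The value at x = 0.5 shrinks as n grows, consistent with a
# pointwise limit of 0:
print([p_n(n, 0.5) for n in (10, 40, 120)])
```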



Assume that each infinitesimal is the limit of a unique sequence $a_n$, $n\to\infty$. Assume that the sum of an infinite number of infinitesimals is always a finite number. The limit $n\to\infty$ of the sum $\sum_{x\in(0,1)}p_n(x)$ converges if $p_n(x)$ is infinitesimal for all $x\in(0,1)$, and is infinite otherwise.



That the sum does not converge for finite, nonzero [non-infinitesimal] $p_n$ can be inferred from the non-uniform convergence of the limit (as seen in the plot), which suggests that (1) the graph of $p_\infty$ is symmetric about the line $x=0.5$ and (2) the maximum value of $p_\infty$ is located at $x=0.5$. If $p_\infty(0.5)$ is finite, then for any sufficiently small real number $a$, $p_\infty(0.5\pm a)\approx p_\infty(0.5)$. We may approximate the sum $\sum_{x\in(0.5-a,0.5+a)}p_\infty(x)$ as $\infty\cdot p_\infty(0.5)$, which is, of course, infinite if $p_\infty(0.5)$ is non-infinitesimal. This is why the limit of $p_n(x)$ as $n\to\infty$ must be infinitesimal for all $x$, especially given that the sum $\sum_{x\in(0,1)}p_n(x)$ does seem to converge to $1$ as $n\to\infty$.



In any case, as the probability of an event within an interval $(a,b)$ is given by the definite integral over $(a,b)$ for a continuous probability distribution, the fact that $p_n(x)$ is infinitesimal means that the integral $\int_0^1 p_\infty(x)\,dx$ is zero. This is a problem, because the probability that between $0\%$ and $100\%$ of an infinite number of coin flips comes up heads is obviously $1$, not $0$.



So now I have the requirement:



$$\int_0^1 p(x)\,dx=1$$



Along with $\max\{p(x)\}=p(0.5)$, $p(0)=0$, and $p(1)=0$, which is more or less 'common sense'. Additionally, the continuous probability distribution should be proportional to the original limit of the binomial distribution. The final distribution would be given, I think, by a smooth piecewise function:



$$p(x)=\begin{cases}\text{PDF} & x\in(0,1)\\ 0 & x\notin(0,1)\end{cases}\qquad\bigg\vert\qquad\int_0^1 p(x)\,dx=1$$



*This is necessary because there are several points where $p_n$ is undefined.



**This is almost irrelevant, because the highest $n$ for which $p$ can even be computed (using any software available to me) is 134.
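The ceiling around $n=134$ is presumably a floating-point overflow in the intermediate factorials/gamma values of whatever software was used; working in log space with `math.lgamma` sidesteps it and lets $n$ grow arbitrarily. A sketch (the helper name `p_n_log` is mine):

```python
from math import exp, lgamma, log, pi, sqrt

def p_n_log(n, x):
    """Evaluate p_n(x) via log-gamma to avoid overflow at large n."""
    return exp(lgamma(n + 1) - n*log(2)
               - lgamma(n*x + 1) - lgamma(n - n*x + 1))

# Values at x = 0.5 keep shrinking well past n = 134, tracking
# Stirling's estimate sqrt(2/(pi*n)):
for n in (200, 1000, 10000):
    print(n, p_n_log(n, 0.5), sqrt(2 / (pi * n)))
```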










  • Let $x_n=1$ if coin $n$ is heads and $x_n=0$ otherwise. I expect the probability that $\lim\frac1N\sum_{n=1}^Nx_n=\frac12$ to be $1$. – SmileyCraft, Jan 20 at 1:42










  • Could you explain how you got this? I tried something similar already by modifying the limit of a binomial distribution, but everything ended up being 0. – R. Burton, Jan 20 at 17:44










  • You can probably show for every $r<\frac12$ that $\lim\frac1N\sum_{n=1}^Nx_n<r$ has probability $0$. Then use symmetry and some measure argument to conclude the aforementioned result. – SmileyCraft, Jan 20 at 18:43










  • When I first attempted this using the modified binomial distribution, I actually found that every event has probability 0; they just converge to zero at different rates. This is a natural consequence of starting from a discrete probability, since the sum of discrete probabilities must equal 1. If each event is one of an infinite number, then it can only be infinitesimally likely, otherwise their sum would exceed 1 by an infinite amount. – R. Burton, Jan 20 at 18:59
















probability probability-distributions

asked Jan 20 at 1:36, edited Jan 21 at 2:58 – R. Burton
1 Answer

Set $X_n = 1$ if the $n^{\text{th}}$ toss of a fair coin is a head, and $X_n = 0$ if it's a tail. The strong law of large numbers then tells us that the sequence $\frac{1}{N}\sum_{n=1}^N X_n$ converges to $\frac{1}{2}$ with probability $1$ as $N\rightarrow\infty$, just as SmileyCraft surmised.
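A quick simulation illustrates this (a sketch; the seed, sample sizes, and helper name are arbitrary choices of mine):

```python
import random

def head_frequency(N, seed=1):
    """Fraction of heads in N simulated fair coin flips."""
    rng = random.Random(seed)
    return sum(rng.getrandbits(1) for _ in range(N)) / N

# The frequency settles near 1/2 as N grows (SLLN):
for N in (100, 10_000, 1_000_000):
    print(N, head_frequency(N))
```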



For large $N$, the central limit theorem tells us that the distribution of $\frac{2\sum_{n=1}^N X_n - N}{\sqrt{N}}$ is close to the standard normal. This means that the probability of $\frac{\sum_{n=1}^N X_n}{N}$ lying within any given interval $\left(a, b\right)\subset\left[0,1\right]$ is well approximated by $\frac{1}{\sqrt{2\pi}}\int_{\left(2a-1\right)\sqrt{N}}^{\left(2b-1\right)\sqrt{N}}e^{-\frac{x^2}{2}}\,dx$, as long as $a$ and $b$ are sufficiently distant from the endpoints of the interval $\left[0,1\right]$. If $\frac{1}{2}\notin\left[a,b\right]$, then this integral converges to $0$ as $N\rightarrow\infty$; if $\frac{1}{2}\in\left(a,b\right)$, it converges to $1$; while if either $a=\frac{1}{2}$ or $b=\frac{1}{2}$, it converges to $\frac{1}{2}$.
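Under that normal approximation, the interval probability reduces to a difference of normal CDF values, computable with the error function; a sketch (the helper names `phi` and `prob_fraction_in` are mine):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def prob_fraction_in(N, a, b):
    """CLT approximation to P(a < (#heads)/N < b) over N fair flips."""
    return phi((2*b - 1) * sqrt(N)) - phi((2*a - 1) * sqrt(N))

# Mass piles up at 1/2: intervals containing 1/2 approach
# probability 1, intervals excluding it approach 0.
print(prob_fraction_in(10_000, 0.49, 0.51))
print(prob_fraction_in(10_000, 0.60, 0.70))
```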



Thus, as $N\rightarrow\infty$, the weight of the distribution of $\frac{\sum_{n=1}^N X_n}{N}$ becomes more and more concentrated around the value $\frac{1}{2}$, and vanishes everywhere else.



Reply to query in comments:



The limiting cumulative distribution function, $F_\infty$, is a Heaviside step function, translated a distance $\frac{1}{2}$ to the right:
\begin{eqnarray}
F_\infty\left(x\right) &=& H\left(x-\frac{1}{2}\right)\\
&=& \left\{\begin{matrix} 0 & \mbox{ for } x < \frac{1}{2} \\
1 & \mbox{ for } x\ge\frac{1}{2}.
\end{matrix}\right.
\end{eqnarray}

A density function does not exist for this distribution, at least not unless you're willing to allow a generalised function to do service in that role. That is, there is no ordinary integrable function $f$ such that $F_\infty\left(x\right) = \int_{-\infty}^x f\left(t\right)\,dt$. If you are willing to make use of generalised functions, however, then the Dirac delta function, $\delta$, translated a distance $\frac{1}{2}$ to the right, can be thought of as a "density function" for $F_\infty$.







  • Would it be possible to create an explicit formula for a PDF $p_n:[0,1]\to[0,1]$, for which $\delta(x-0.5)=\lim_{n\to\infty}p_n(x)$? – R. Burton, Jan 21 at 15:35











1 Answer
1






active

oldest

votes








1 Answer
1






active

oldest

votes









active

oldest

votes






active

oldest

votes









1












$begingroup$

Set $ X_n = 1 $ if the $ n^mbox{th} $ toss of a fair coin is a head, and $ X_n = 0 $ if it's a tail. The strong law of large numbers then tells us that the sequence $ frac{1}{N}sum_{n=1}^N X_n $ converges to
$ frac{1}{2} $ with probability $ 1 $ as $ Nrightarrowinfty $, just as SmileyCraft surmised.



Set $X_n = 1$ if the $n^{\text{th}}$ toss of a fair coin is a head, and $X_n = 0$ if it's a tail. The strong law of large numbers then tells us that the sequence $\frac{1}{N}\sum_{n=1}^N X_n$ converges to $\frac{1}{2}$ with probability $1$ as $N \to \infty$, just as SmileyCraft surmised.



For large $N$, the central limit theorem tells us that the distribution of $\frac{2\sum_{n=1}^N X_n - N}{\sqrt{N}}$ is close to the standard normal. This means that the probability of $\frac{\sum_{n=1}^N X_n}{N}$ lying within any given interval $(a, b) \subset [0, 1]$ is well approximated by $\frac{1}{\sqrt{2\pi}} \int_{(2a-1)\sqrt{N}}^{(2b-1)\sqrt{N}} e^{-\frac{x^2}{2}}\, dx$, as long as $a$ and $b$ are sufficiently distant from the end points of the interval $[0, 1]$. If $\frac{1}{2} \notin [a, b]$, then this integral converges to $0$ as $N \to \infty$; if $\frac{1}{2} \in (a, b)$, it converges to $1$; while if either $a = \frac{1}{2}$ or $b = \frac{1}{2}$, it converges to $\frac{1}{2}$.
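This approximation is easy to check numerically against the exact binomial probability, using only the standard library. A minimal sketch (the function names here are illustrative, not from any particular library):

```python
from math import comb, erf, sqrt

def exact_prob(N, a, b):
    """Exact P(a < S_N/N < b) for S_N ~ Binomial(N, 1/2)."""
    return sum(comb(N, k) for k in range(N + 1) if a < k / N < b) / 2**N

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(x / sqrt(2)))

def normal_approx(N, a, b):
    """CLT approximation: Phi((2b-1)*sqrt(N)) - Phi((2a-1)*sqrt(N))."""
    return phi((2 * b - 1) * sqrt(N)) - phi((2 * a - 1) * sqrt(N))

N = 1000
# The two values should agree closely for an interval containing 1/2:
print(exact_prob(N, 0.45, 0.55))
print(normal_approx(N, 0.45, 0.55))
# For huge N the mass inside any interval around 1/2 approaches 1:
print(normal_approx(10**6, 0.45, 0.55))
```

For an interval that excludes $\frac{1}{2}$, such as $(0, 0.45)$, the same approximation decays to $0$ as $N$ grows.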



Thus, as $N \to \infty$, the weight of the distribution of $\frac{\sum_{n=1}^N X_n}{N}$ becomes more and more concentrated around the value $\frac{1}{2}$, and vanishes everywhere else.
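A quick simulation makes this concentration visible; a minimal sketch with Python's `random` module (seeded for reproducibility):

```python
import random

random.seed(0)

def head_proportion(N):
    """Toss a fair coin N times and return the proportion of heads."""
    heads = sum(random.randint(0, 1) for _ in range(N))
    return heads / N

# The running proportion drifts toward 1/2 as N grows:
for N in (100, 10_000, 1_000_000):
    print(N, head_proportion(N))
```

The fluctuations around $\frac{1}{2}$ shrink like $\frac{1}{2\sqrt{N}}$, in line with the central limit theorem.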



Reply to query in comments:



The limiting cumulative distribution function, $F_\infty$, is a Heaviside step function, translated a distance $\frac{1}{2}$ to the right:
\begin{eqnarray} F_\infty\left(x\right) &=& H\left(x-\frac{1}{2}\right)\\
&=& \left\{\begin{matrix} 0 & \mbox{ for } x < \frac{1}{2} \\
1 & \mbox{ for } x \ge \frac{1}{2}.
\end{matrix}\right.
\end{eqnarray}

A density function does not exist for this distribution, at least not unless you're willing to allow a generalised function to do service in that role. That is, there is no ordinary integrable function $f$ such that $F_\infty\left(x\right) = \int_{-\infty}^x f\left(t\right)\, dt$. If you are willing to make use of generalised functions, however, then the Dirac delta function, $\delta$, translated a distance $\frac{1}{2}$ to the right, can be thought of as a "density function" for $F_\infty$.
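One way to make the delta-function picture concrete, sketched here from the CLT approximation above rather than taken from the answer itself: the sample mean is approximately $\mathcal{N}\left(\frac{1}{2}, \frac{1}{4N}\right)$, so the ordinary densities
$$p_N(x) = \sqrt{\frac{2N}{\pi}}\, e^{-2N\left(x-\frac{1}{2}\right)^2}$$
satisfy $\int_{-\infty}^{\infty} p_N(x)\, dx = 1$ for every $N$, while $p_N(x) \to 0$ for every fixed $x \ne \frac{1}{2}$. In the sense of distributions, $p_N \to \delta\left(x - \frac{1}{2}\right)$: for any bounded continuous test function $\varphi$, $\int p_N(x)\,\varphi(x)\, dx \to \varphi\left(\frac{1}{2}\right)$.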






  • Comment (R. Burton, Jan 21 at 15:35): Would it be possible to create an explicit formula for a PDF $p_n:[0,1]\to[0,1]$, for which $\delta(x-0.5)=\lim_{n\to\infty}p_n(x)$?














answered Jan 21 at 13:38, edited Jan 22 at 0:32
– lonza leggiera











