Gambling game: What is the probability of eventually going broke












11














You find 2 dollars in your pocket and decide to go gambling. Fortunately, the game you're playing has very favourable odds: each time you play, you gain 1 dollar with probability 3/4 and lose 1 dollar with probability 1/4.



Suppose you continue playing so long as you have money in your pocket. If you lose your first two bets, you go broke and go home after only two rounds; but if you win forever, you'll play forever.



What's the probability you'll eventually go broke?



I think I am overthinking this:



To go broke, you have to end on two losses, each of which has probability $p(L) = 1/4$.



Before those two losses there must have been an equal number of losses and wins, so that the balance is back to 2 dollars just before the final two losses.



$P(\text{Broke})=\left( \dfrac {1}{4}\right) ^{2}\sum ^{\infty }_{i=0}\left( \dfrac {1}{4}\right) ^{i}\left( \dfrac {3}{4}\right) ^{i}N_{i}$



But then each summand needs to be multiplied by $N_i$, the number of ways to arrange an equal number of wins and losses without going broke beforehand. So for $i = 1$, LW and WL are allowed; for $i = 2$, WWLL, WLLW, WLWL, LWWL and LWLW are allowed (but not LLWW, which already goes broke after the first two losses).



So my final calculation is $P(\text{Broke}) = \left( \dfrac {1}{4}\right) ^{2}\sum ^{\infty }_{i=0}\left( \dfrac {1}{4}\right) ^{i}\left( \dfrac {3}{4}\right) ^{i}\left( \dbinom{2i}{i}-\sum ^{i-1}_{k=1}\dbinom{2k}{n-1}\right) $



But this doesn't simplify to anything nice and seems rather complicated.
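For what it's worth, here is a brute-force enumeration (my own check, not part of the question; the helper name `count_valid` is made up) of the number of length-$2i$ win/loss prefixes that return the balance to 2 dollars without ever hitting 0:

```python
from itertools import product

def count_valid(i, start=2):
    """Count W/L sequences with i wins and i losses that return the
    balance to `start` without it ever hitting 0 along the way."""
    count = 0
    for seq in product("WL", repeat=2 * i):
        if seq.count("W") != i:
            continue
        money, ok = start, True
        for step in seq:
            money += 1 if step == "W" else -1
            if money == 0:  # broke before the prefix ends
                ok = False
                break
        count += ok
    return count

print([count_valid(i) for i in range(5)])  # [1, 2, 5, 14, 42], the Catalan numbers
```

So the $N_i$ in the sum are Catalan numbers, which is consistent with this being a ballot-style path-counting problem.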






















  • It seems actually similar to math.stackexchange.com/questions/153123/… (even if it goes against my other comment... seems I was wrong [again!] in probability problems) – Xenos, Jun 12 '18 at 8:45


















probability algebra-precalculus






asked Jun 11 '18 at 14:00









Tinatim

358110













4 Answers


















7












Let $p(n)$ be the probability of going broke starting with $n$ dollars. On the one hand, we have
$$
p(n)=p(1)^n\tag1
$$
because moving from $\$n$ to $\$0$ is the same as moving from $\$k$ to $\$(k-1)$ a total of $n$ times, for $k=n,n-1,n-2,\dots,1$. On the other hand, we must have
$$
p(1)=\frac34\,p(2)+\frac14\tag2
$$
Substituting $(1)$ into $(2)$, you get $p(1)=\frac34\,p(1)^{2}+\frac14$, the solutions to which are $p(1)=1/3$ and $p(1)=1$. If we had $p(1)=1$, then we would have $p(n)=1$ for all $n$, so that you certainly go broke no matter how much money you start with, which is clearly incorrect.$^*$ Therefore,
$$
\bbox[3pt,border: 1.5pt black solid]{p(n)=(1/3)^n.}
$$
$^*$A rigorous proof of this certainly exists, perhaps by applying Stirling's approximation to an exact combinatorial expression for $p(n)$, or with some central limit theorem argument.

Edit: For a more convincing argument, suppose that the process does not stop when you hit zero. If $p(1)=1$, you would certainly hit $\$0$, and from there certainly hit $\$(-1)$, and from there $\$(-2)$, and so on. This means you would drift arbitrarily far down. This contradicts the law of large numbers, which says $S_n/n$ converges to $1/2$ almost surely as $n\to\infty$, where $S_n$ is your winnings after $n$ plays.
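Not part of the answer, but a quick Monte Carlo sanity check of $p(2)=(1/3)^2=1/9$. The function name and the truncation cutoff are my own choices; runs that survive `max_steps` rounds are treated as never going broke, which introduces only a negligible bias here because long-surviving walks drift upward:

```python
import random

def broke_prob(start=2, p_win=0.75, trials=10_000, max_steps=400):
    """Estimate the probability of ever hitting $0, truncating long
    (almost certainly never-broke) runs at max_steps rounds."""
    broke = 0
    for _ in range(trials):
        money = start
        for _ in range(max_steps):
            money += 1 if random.random() < p_win else -1
            if money == 0:
                broke += 1
                break
    return broke / trials

print(broke_prob())  # close to 1/9 ≈ 0.111
```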






  • Saying we play forever and considering the odds, there will be a consecutive sequence of N losing bets for any N, including N = current amount of dollars in my pocket (which is finite)? So I'll always end up broke and P(broke) = 1? Note that the game's odds would then change nothing (except if you win every time)... – Xenos, Jun 11 '18 at 17:00

  • The claim that if P(A) converges to one, and P(B|A) converges to one, then P(B) converges to one, is not trivial, and I'm not even sure it's true. – Acccumulation, Jun 11 '18 at 17:50

  • @Xenos This is not correct. An infinite sequence of independent events with the same probability will eventually occur. However, the event "lose N times in a row," where N is your current wealth, has a changing probability. What happens is that as time goes on, your wealth N generally grows, so P(N losses) goes to zero fast enough that the infinite product of $(1-P(N\text{ losses}))$ is nonzero. – Mike Earnest, Jun 11 '18 at 18:04

  • I don't think I understand your argument in the edit. $p(1)=1$ does not mean "you always lose money", it means "you eventually reach zero". – Raeven0, Jun 11 '18 at 18:09

  • @Raeven0 You can think of $p(1)$ as the probability of (eventually) moving from $\$1$ to $\$0$. This is the same as the probability of moving from $\$k$ to $\$(k-1)$, for any $k$, by the symmetry of this process. I know that $p(1)$ was defined as the probability of reaching zero (starting from 1), but a consequence of this definition is that it is also the probability of losing a dollar starting from any amount. – Mike Earnest, Jun 11 '18 at 18:14



















3












As another possible method, consider a slightly modified game that ends when your money reaches $0$ or $N$ dollars (we can take the limit later). Let $P_i$ be the probability that you win starting with $i$ dollars. Suppose the probability of winning a single game is $p$ and $q = 1 - p$ is the probability of losing a single game. We can observe the relation
$$ P_i = p P_{i+1} + q P_{i-1}. \tag{1} \label{1}$$
This is a linear difference equation which can easily be solved by trying $P_i = z^i$ and finding the roots. The result is
$$P_i = \begin{cases} A + B \left(\frac{q}{p} \right)^i & \text{if } p \neq q, \\ A + B i & \text{if } p = q = 1/2. \end{cases}$$
Then noting the boundary conditions $P_0 = 0$ and $P_N = 1$, we obtain
$$P_i = \begin{cases} \dfrac{1-\left(\frac{q}{p} \right)^i}{1- \left(\frac{q}{p} \right)^N} & \text{if } p \neq q, \\ \dfrac{i}{N} & \text{if } p = q = 1/2. \end{cases}$$
Taking the limit as $N \to \infty$ yields the desired result.
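As a numerical aside (my addition, not part of the answer; `win_prob` is a made-up helper): with $p=3/4$ we have $q/p=1/3$, and the ruin probability from $\$2$ in the $(0,N)$ game, $1-P_2$, visibly approaches $(1/3)^2 = 1/9$ as $N$ grows:

```python
from fractions import Fraction

def win_prob(i, N, p):
    """Closed-form P_i for the (0, N) game when p != q."""
    q = 1 - p
    r = q / p
    return (1 - r**i) / (1 - r**N)

p = Fraction(3, 4)
for N in [5, 10, 40]:
    # 1 - P_2 is the probability of ruin starting from $2
    print(N, float(1 - win_prob(2, N, p)))
# the printed values approach 1/9 ≈ 0.1111 as N grows
```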






  • I agree that it's plausible the $N \to \infty$ limit yields the desired result, but I want to emphasize that it has not yet been argued that the probability of winning the OP's game should be related to the limiting probability of winning the $(0,N)$ game. – Hurkyl, Jun 11 '18 at 23:35

  • OK, this is neat, and the first answer I have looked at. I see that you have reduced the problem to gambler's ruin (en.wikipedia.org/wiki/Gambler%27s_ruin). I have no experience solving linear difference equations; is the standard method what is suggested here (cl.cam.ac.uk/teaching/2003/Probability/prob07.pdf): 1) try $P_{i}=Az^{i}$, 2) take the $i = 2$ case, 3) solve, make a linear combination as a guess, 4) substitute the guess into the original to check it is right? – Tinatim, Jun 12 '18 at 8:22

  • @Hurkyl, Is it not the same problem exactly, with N tending to infinity? – Tinatim, Jun 12 '18 at 8:24

  • @Tinatim: You'll have to tell me what it means to play this game "with N tending to infinity", and why the probability of winning that game is the limit of the probability of winning the $(0,N)$ games. If you meant that the original problem is the limit as N tends to infinity... you'll have to explain what it means to take the limit of a sequence of games and why the probability of winning should be a continuous function of the game. – Hurkyl, Jun 12 '18 at 8:48

  • @Tinatim: ... for example, one way to make sense of a limiting idea here is that every event leading to winning the game in the OP has the property that, for every $N$, it starts with a sequence of events that describes a winning $(0,N)$-game. Why is the probability of winning the OP game the limit of the $(0,N)$-games? A limit-like argument can probably be made in this case, but the situation is not so clear that it goes without saying. I emphasize this precisely because people gloss over this sort of issue without even recognizing it can be a problem. – Hurkyl, Jun 12 '18 at 8:53





















0












Unfortunately I don't have a very intuitive understanding of the problem, but I thought about it before, and this is what I came up with:

Let $S_n$ be your money at step $n$ and $S_0=x$ the money you start with. Let $y$ be the (for now) finite amount of money the bank starts with.
Let $p$ be the probability you win $\$1$ and $(1-p)$ the probability that you lose $\$1$.

Notice that $\left(\frac{1-p}{p}\right)^{S_n}$ is a martingale, which means
$$\mathbb{E}\left[\left(\frac{1-p}{p}\right)^{S_n} \,\middle|\, \left(\frac{1-p}{p}\right)^{S_k} \right] = \left(\frac{1-p}{p}\right)^{S_k}$$
for $k\leq n$.

Let $\tau = \inf\{k: S_n \text{ is constant for all } n\geq k \}$, that is, let $\tau$ be the time the game ends (you won all the money from your opponent or you lost all your money).

By the optional stopping theorem,

$$\mathbb{P}\left(S_\tau = 0 \right) \left(\frac{1-p}{p}\right)^{0} + \left(1- \mathbb{P}\left(S_\tau = 0 \right) \right)\left(\frac{1-p}{p}\right)^{x+y} = \mathbb{E}\left[\left(\frac{1-p}{p}\right)^{S_\tau}\right] = \mathbb{E}\left[\left(\frac{1-p}{p}\right)^{S_0}\right] = \left(\frac{1-p}{p}\right)^{x},$$
so
$$\mathbb{P}\left(S_\tau = 0 \right) = \frac{\left(\frac{1-p}{p}\right)^{x} - \left(\frac{1-p}{p}\right)^{x+y}}{1 - \left(\frac{1-p}{p}\right)^{x+y}}.$$

For $(1-p) < p$ we see, by letting $y\rightarrow \infty$, that for a bank with unlimited resources we get
$$\mathbb{P}\left(S_\tau = 0 \right) = \left(\frac{1-p}{p}\right)^{x}.$$
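A one-step check (my addition, not part of the answer) that $\left(\frac{1-p}{p}\right)^{S_n}$ really is a martingale for this walk: conditioning on $S_n = s$, the expected value after one more bet is unchanged, exactly, in rational arithmetic:

```python
from fractions import Fraction

p = Fraction(3, 4)   # probability of winning a single bet
r = (1 - p) / p      # r = 1/3 for this game

# E[r**S_{n+1} | S_n = s] = p*r**(s+1) + (1-p)*r**(s-1), which should equal r**s
for s in range(1, 6):
    assert p * r**(s + 1) + (1 - p) * r**(s - 1) == r**s
print("martingale property verified for s = 1..5")
```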






  • I agree that it's plausible the $y \to \infty$ limit yields the desired result, but I want to emphasize that it has not yet been argued that the probability of winning the OP's game should be related to the limiting probability of winning the bounded version of the game. – Hurkyl, Jun 11 '18 at 23:40

  • @Hurkyl That is true and I'm not sure how to do that. – cdwe, Jun 11 '18 at 23:44



















0












Let $p(n)$ be the probability of ever reaching $n-1$ coins (dollars) starting with $n$ coins. We have

$$p(n) = 1/4 + 3/4 \cdot p(n+1) \cdot p(n),$$

since we can either reach $n-1$ coins directly by losing, or win one coin, then return to $n$ coins with probability $p(n+1)$, and then reach $n-1$ coins with probability $p(n)$.

But we can see that $p(n+1) = p(n)$, because both are the probability of eventually losing one coin; it doesn't matter whether we have $n$ or $n+1$ coins to begin with. Substituting, we have

$$p(n) = 1/4 + 3/4 \cdot p(n)^2.$$

Solving this quadratic equation gives $p(n) = 1/3$ or $p(n) = 1$. We can ignore the case $p(n) = 1$, since the probability of going broke is not $1$.

Finally, the probability of going broke is

$$p(2) \cdot p(1) = (1/3)^2 = 1/9.$$
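A small numerical aside (mine, not the answer's): iterating the self-consistency equation $p \mapsto 1/4 + 3/4\,p^2$ from $p=0$ converges to $1/3$, the stable fixed point; the other root, $1$, is an unstable fixed point, which is another way to see which solution of the quadratic is the relevant one:

```python
# Iterate p <- 1/4 + 3/4 * p**2 starting from p = 0.
p = 0.0
for _ in range(200):
    p = 0.25 + 0.75 * p * p  # one step of the self-consistency equation
print(p)  # ≈ 1/3, the stable root of the quadratic
```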







    edited Jun 11 '18 at 16:46

























    answered Jun 11 '18 at 15:54









Mike Earnest

    22.4k12051




    22.4k12051








    • 1




      $begingroup$
      Saying we play forever and considering the odds, there will be a consecutive sequence of N loosing bets for any N, including N = current amount of dollars in my pocket (which is finite)? So I'll always end up broken and P(broke) = 1? Note that games odds would then change nothing (except if you win every time)...
      $endgroup$
      – Xenos
      Jun 11 '18 at 17:00












    • $begingroup$
      The claim that if P(A) converges to one, and P(B|A) converges to one, then P(B) converges to one, is not trivial, and I'm not even sure it's true.
      $endgroup$
      – Acccumulation
      Jun 11 '18 at 17:50






    • 2




      $begingroup$
      @Xenos This is not correct. An infinite sequence of independent events with the same probability will eventually occur. However, the event "lose N times in a row," where N is your current wealth, has a changing probability. What happens is that as time goes on, your wealth N generally grows, so P(N losses) goes to zero fast enough that the infinite product of $(1-P(Ntext{ losses}))$ is nonzero.
      $endgroup$
      – Mike Earnest
      Jun 11 '18 at 18:04










    • $begingroup$
      I don't think I understand your argument in the edit. $p(1)=1$ does not mean "you always lose money", it means "you eventually reach zero".
      $endgroup$
      – Raeven0
      Jun 11 '18 at 18:09






    • 1




      $begingroup$
      @Raeven0 You can think of $p(1)$ as the probability of (eventually) moving from $1 to $0. This the same as the probability of moving from $k to $(k-1), for any k, by the symmetry of this process. I know that $p(1)$ was defined as the probability of reaching zero (starting from 1), but a consequence of this definition is that it is also the probability of losing a dollar starting from any amount.
      $endgroup$
      – Mike Earnest
      Jun 11 '18 at 18:14














    • 1




      $begingroup$
      Saying we play forever and considering the odds, there will be a consecutive sequence of N loosing bets for any N, including N = current amount of dollars in my pocket (which is finite)? So I'll always end up broken and P(broke) = 1? Note that games odds would then change nothing (except if you win every time)...
      $endgroup$
      – Xenos
      Jun 11 '18 at 17:00












    3












    $begingroup$

    As another possible method, consider a slightly modified game that ends when your money reaches $0$ or $N$ dollars (we can take the limit later). Let $P_i$ be the probability that you win starting with $i$ dollars. Suppose the probability of winning a single round is $p$ and $q = 1-p$ is the probability of losing a single round. We can observe the relation
    $$ P_i = p P_{i+1} + q P_{i-1}. \tag{1} \label{1}$$
    This is a linear difference equation, which can be solved by seeking solutions of the form $P_i = z^i$ and finding the roots of the characteristic equation. The general solution is
    $$P_i = \begin{cases} A + B \left(\frac{q}{p}\right)^i & \text{if } p \neq q, \\ A + B i & \text{if } p = q = 1/2. \end{cases}$$
    Then, applying the boundary conditions $P_0 = 0$ and $P_N = 1$, we obtain
    $$P_i = \begin{cases} \dfrac{1-\left(\frac{q}{p}\right)^i}{1-\left(\frac{q}{p}\right)^N} & \text{if } p \neq q, \\ \dfrac{i}{N} & \text{if } p = q = 1/2. \end{cases}$$
    Taking the limit as $N \to \infty$ yields the desired result.

    $endgroup$
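    A quick numerical check of this answer (a sketch I've added, not part of the original; the helper names `ruin_probability` and `simulate_ruin` are mine): the closed-form ruin probability $1 - P_i$ for the $(0,N)$ game should be close to $(q/p)^i = (1/3)^2 = 1/9$ for the OP's $i=2$, $p=3/4$, and a Monte Carlo run of the capped game should agree.

    ```python
    import random

    def ruin_probability(i, p, N):
        """Closed-form P(reach 0 before N) starting with i dollars, win prob p per round."""
        q = 1 - p
        if abs(p - q) < 1e-12:        # symmetric case: P(win) = i/N
            return 1 - i / N
        r = q / p
        return 1 - (1 - r**i) / (1 - r**N)

    def simulate_ruin(i, p, cap, trials, seed=0):
        """Monte Carlo estimate of the same probability for the capped (0, cap) game."""
        rng = random.Random(seed)
        broke = 0
        for _ in range(trials):
            money = i
            while 0 < money < cap:
                money += 1 if rng.random() < p else -1
            broke += (money == 0)
        return broke / trials

    exact = ruin_probability(2, 0.75, 10_000)                  # essentially the N -> infinity limit
    approx = simulate_ruin(2, 0.75, cap=200, trials=20_000)
    print(exact, approx)                                       # both close to 1/9
    ```

    The cap in the simulation stands in for the finite bank; with $p=3/4$ the term $(q/p)^N$ is already negligible for modest $N$, which is why even `cap=200` reproduces the limit well.
    
    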









    • 2




      $begingroup$
      I agree that it's plausible the $N \to \infty$ limit yields the desired result, but I want to emphasize that it has not yet been argued that the probability of winning the OP's game should be related to the limiting probability of winning the $(0,N)$ game.
      $endgroup$
      – Hurkyl
      Jun 11 '18 at 23:35












    • $begingroup$
      OK, this is neat, and the first answer I have looked at. I see that you have reduced the problem to gambler's ruin (en.wikipedia.org/wiki/Gambler%27s_ruin). I have no experience solving linear difference equations; is the standard method what is suggested here (cl.cam.ac.uk/teaching/2003/Probability/prob07.pdf): 1) try $P_{i}=Az^{i}$, 2) take the i = 2 case, 3) solve, make a linear combination as a guess, 4) substitute the guess into the original to check it is right?
      $endgroup$
      – Tinatim
      Jun 12 '18 at 8:22












    • $begingroup$
      @Hurkyl, Is it not the same problem exactly, with N tending to infinity?
      $endgroup$
      – Tinatim
      Jun 12 '18 at 8:24










    • $begingroup$
      @Tinatim: You'll have to tell me what it means to play this game "with N tending to infinity", and why the probability of winning that game is the limit of the probability of winning the $(0,N)$ games. If you meant that the original problem is the limit as N tends to infinity... you'll have to explain what it means to take the limit of a sequence of games and why the probability of winning should be a continuous function of the game.
      $endgroup$
      – Hurkyl
      Jun 12 '18 at 8:48








    • 1




      $begingroup$
      @Tinatim: ... for example, one way to make sense of a limiting idea here is that the event of winning the game in the OP has the property that, for every $N$, it starts with a sequence of events that describes a winning $(0,N)$-game. Why is the probability of winning the OP game the limit of the $(0,N)$-games? A limit-like argument can probably be made in this case, but the situation is not so clear that it goes without saying. I emphasize this precisely because people gloss over this sort of issue without even recognizing it can be a problem.
      $endgroup$
      – Hurkyl
      Jun 12 '18 at 8:53


















    answered Jun 11 '18 at 15:17









    Gregory


    0












    $begingroup$

    Unfortunately I don't have a very intuitive understanding of the problem, but I thought about it before, and this is what I came up with:

    Let $S_n$ be your money at step $n$ and $S_0=x$ the money you start with. Let $y$ be the (for the time being) finite amount of money the bank starts with.
    Let $p$ be the probability that you win $\$1$ and $(1-p)$ the probability that you lose $\$1$.

    Notice that $\left(\frac{1-p}{p}\right)^{S_n}$ is a martingale, which means
    $$\mathbb{E}\left[\left(\frac{1-p}{p}\right)^{S_n} \,\middle|\, \left(\frac{1-p}{p}\right)^{S_k}\right] = \left(\frac{1-p}{p}\right)^{S_k}$$
    for $k\leq n$.

    Let $\tau = \inf\{k: S_n \text{ is constant for all } n\geq k\}$, that is, let $\tau$ be the time the game ends (you won all the money from your opponent, or you lost all your money).

    By the optional stopping theorem,

    $$\mathbb{P}\left(S_\tau = 0\right)\left(\frac{1-p}{p}\right)^{0} + \left(1-\mathbb{P}\left(S_\tau = 0\right)\right)\left(\frac{1-p}{p}\right)^{x+y} = \mathbb{E}\left[\left(\frac{1-p}{p}\right)^{S_\tau}\right] = \mathbb{E}\left[\left(\frac{1-p}{p}\right)^{S_0}\right] = \left(\frac{1-p}{p}\right)^{x},$$
    so
    $$\mathbb{P}\left(S_\tau = 0\right) = \frac{\left(\frac{1-p}{p}\right)^{x} - \left(\frac{1-p}{p}\right)^{x+y}}{1 - \left(\frac{1-p}{p}\right)^{x+y}}.$$

    For $(1-p) < p$ we see, by letting $y\rightarrow\infty$, that against a bank with unlimited resources,
    $$\mathbb{P}\left(S_\tau = 0\right) = \left(\frac{1-p}{p}\right)^{x}.$$

    $endgroup$
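    A small numerical sanity check of this answer (my addition, a sketch only; the helper name `ruin` is mine): with $r=(1-p)/p$, one step of the game preserves the expected value of $r^{S_n}$, and the finite-bank formula tends to $r^x$ as the bank's capital $y$ grows.

    ```python
    p = 0.75
    r = (1 - p) / p          # r = 1/3 for the OP's game

    # One-step martingale property: E[r^{S_{n+1}} | S_n = s] = p*r^(s+1) + (1-p)*r^(s-1) = r^s.
    for s in range(1, 10):
        one_step = p * r ** (s + 1) + (1 - p) * r ** (s - 1)
        assert abs(one_step - r ** s) < 1e-12

    # Finite-bank ruin probability for x = 2, and its behavior as y grows.
    def ruin(x, y):
        return (r ** x - r ** (x + y)) / (1 - r ** (x + y))

    print([ruin(2, y) for y in (1, 5, 50)])   # increases toward 1/9
    ```
    
    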









    • 1




      $begingroup$
      I agree that it's plausible the $y \to \infty$ limit yields the desired result, but I want to emphasize that it has not yet been argued that the probability of winning the OP's game should be related to the limiting probability of winning the bounded version of the game.
      $endgroup$
      – Hurkyl
      Jun 11 '18 at 23:40










    • $begingroup$
      @Hurkyl That is true, and I'm not sure how to do that.
      $endgroup$
      – cdwe
      Jun 11 '18 at 23:44
















    answered Jun 11 '18 at 14:51









    cdwe








    0












    $begingroup$

    Let $p(n)$ be the probability of ever reaching $n-1$ coins (dollars) starting with $n$ coins. We have that

    $$p(n) = 1/4 + 3/4 \cdot p(n+1) \cdot p(n)$$

    since we can either reach $n-1$ coins directly by losing, or win one coin, then return to $n$ coins with probability $p(n+1)$, and then reach $n-1$ coins with probability $p(n)$.

    But $p(n+1) = p(n)$: it is the same probability of eventually losing one net coin, regardless of whether we start with $n$ or $n+1$ coins. Substituting, we have

    $$p(n) = 1/4 + 3/4 \cdot p(n)^2$$

    Solving this quadratic equation gives $p(n) = 1/3$ or $p(n) = 1$. We can discard $p(n) = 1$, since with these favourable odds the probability of eventually losing a coin is not 1.

    Finally, the probability of going broke starting from 2 dollars is

    $$p(2) \cdot p(1) = (1/3)^2 = 1/9$$
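    A quick numerical check of the quadratic's relevant root (my sketch, not part of the answer): iterating the map $p \mapsto 1/4 + 3/4\,p^2$ from $p = 0$ converges to $1/3$, which is the attracting fixed point (the derivative of the map at $1/3$ is $1/2 < 1$, while at $1$ it is $3/2 > 1$):

```python
def fixed_point(p0=0.0, iterations=100):
    """Iterate p -> 1/4 + 3/4 * p^2; converges to the attracting root 1/3."""
    p = p0
    for _ in range(iterations):
        p = 0.25 + 0.75 * p * p
    return p

p1 = fixed_point()
print(p1, p1 ** 2)  # ≈ 1/3 and 1/9
```

    The iteration mirrors the probabilistic recursion: each pass conditions on one more round of play, so its limit is the smallest solution of the fixed-point equation, exactly the root the answer keeps.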






    answered Jan 16 at 5:21 – Ana Echavarria














