- 21 Nov '06 19:26
I was just thinking about gambling and calculating the risk of ruin. For simplicity's sake, let's consider a game with only two outcomes (win or lose), a probability of winning Pw = 1 - Pl, and a given number of betting units B to gamble with. What is the probability that you will lose all your betting units at some point in the game?

(I know approximate formulas exist for this, but can anyone come up with an analytical solution? It's more complicated than you'd think...)

- 21 Nov '06 23:18

*Originally posted by PBE6***I was just thinking about gambling and calculating the risk of ruin. For simplicity's sake, let's consider a game with only two outcomes (win or lose), a probability of winning Pw = 1 - Pl, and a given number of betting units B to gamble with. What is the probability that you will lose all your betting units at some point in the game?**

**(I know approximate f ...[text shortened]... can anyone come up with an analytical solution? It's more complicated than you'd think...)**

If the game goes on infinitely long, isn't the answer 1? (unless P_w = 1). It's basically a random walk, which will reach zero at some point.

If it's not infinitely long, what's the stopping criterion?

If that's not it, I think you need to be a bit more specific with the problem.

- 21 Nov '06 23:27

*Originally posted by PBE6***I was just thinking about gambling and calculating the risk of ruin. For simplicity's sake, let's consider a game with only two outcomes (win or lose), a probability of winning Pw = 1 - Pl, and a given number of betting units B to gamble with. What is the probability that you will lose all your betting units at some point in the game?**

**(I know approximate f ...[text shortened]... can anyone come up with an analytical solution? It's more complicated than you'd think...)**

How much can one bet at a time?

- 22 Nov '06 14:53

*Originally posted by mtthw***If the game goes on infinitely long, isn't the answer 1? (unless P_w = 1). It's basically a random walk, which will reach zero at some point.**

**If it's not infinitely long, what's the stopping criterion?**

**If that's not it, I think you need to be a bit more specific with the problem.**

Yes, you're right. I should have said "what's the risk of ruin after playing 'n' hands?"

- 22 Nov '06 16:06 / 2 edits

*Originally posted by mtthw***If the game goes on infinitely long, isn't the answer 1? (unless P_w = 1). It's basically a random walk, which will reach zero at some point.**

**If it's not infinitely long, what's the stopping criterion?**

**If that's not it, I think you need to be a bit more specific with the problem.**

This is simply not true. It only holds if your expected value is negative, i.e. if Pw is smaller than Pl.

- 22 Nov '06 16:25 / 1 edit

*Originally posted by Darrie***This is simply not true. It only holds if your expected value is negative, i.e. if Pw is smaller than Pl.**

I withdraw the previous statement here. While doing the calculation I realised you're quite right (though the answer is 1 if Pw = Pl = 1/2 as well). Sorry!

- 22 Nov '06 16:27

*Originally posted by mtthw***Afraid not. A one-dimensional random walk (which this is, until you hit zero) given infinite time will hit all points an infinite number of times "almost surely" (i.e. with probability 1). As long as P_w is less than one you're stuffed in the long run. Infinity is a long time!**

I would argue that for every bet, not only does it create an infinite number of lines that end in bankruptcy, it also creates an infinite number of lines that continue forever.

That's why I reason it to be 50% over infinite hands.

- 22 Nov '06 16:42 / 1 edit

*Originally posted by uzless***I would argue that for every bet, not only does it create an infinite number of lines that end in bankruptcy, it also creates an infinite number of lines that continue forever.**

**That's why I reason it to be 50% over infinite hands.**

You've got to be careful with infinities.

Look at it this way. Let's say B = 1, and P_w = 0.1. The risk of ruin is *at least* 0.9, as there's a 0.9 chance of losing everything on the first turn. So it can't be 50%. It either has to be 1, or a function of B and P_w.

- 22 Nov '06 17:19
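mtthw's B = 1, P_w = 0.1 example is easy to check numerically. Here's a quick Monte Carlo sketch (the function name and parameters are mine, not from the thread): it simulates many sessions of one-unit bets and counts how often the bankroll hits zero within a step cap.

```python
import random

def ruin_probability(B, p_win, trials=100_000, max_steps=10_000, seed=1):
    """Estimate the risk of ruin by simulating many betting sessions.

    Each session starts with B units and bets 1 unit per hand; it ends
    in ruin if the bankroll hits 0, or is cut off after max_steps hands.
    """
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        bankroll = B
        for _ in range(max_steps):
            bankroll += 1 if rng.random() < p_win else -1
            if bankroll == 0:
                ruined += 1
                break
    return ruined / trials

# With B = 1 and Pw = 0.1, the very first hand already ruins you 90% of
# the time, so the estimate must come out well above 0.5.
print(ruin_probability(1, 0.1))
```

The step cap means this estimates ruin within max_steps hands rather than over an infinite game, but for an unfavourable game the two are effectively the same.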

*Originally posted by mtthw***If the game goes on infinitely long, isn't the answer 1? (unless P_w = 1). It's basically a random walk, which will reach zero at some point.**

**If it's not infinitely long, what's the stopping criterion?**

**If that's not it, I think you need to be a bit more specific with the problem.**

Actually, the answer may not be 1 even for an infinite number of hands. Here's a draft formula I'm working out. The probability of winning "m" bets and losing B+m bets (the last loss being the one that ruins you) is:

P = A(m)*(Pw^m)*(Pl^(B+m))

where A(m) is the number of valid ways to arrange the wins and losses. This is a little bit complicated, because not all arrangements are valid (the bankroll must not hit zero before the final loss), but as an upper estimate for A(m) we can say it's the total number of ways to arrange "m" wins in B+2m hands, which is equal to (B+2m) choose (m). The total risk of ruin is just the sum of these terms from m = 0 to infinity. This sum converges for 0 < Pw < 1.
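For what it's worth, the count A(m) described above has an exact form: by the hitting time theorem for random walks (a ballot-problem style count), the number of sequences with m wins and B+m losses that first hit zero on the last hand is A(m) = B/(B+2m) * C(B+2m, m). A sketch that sums the series numerically (function name mine), working in log space so the huge binomials and tiny powers don't overflow:

```python
from math import exp, lgamma, log

def risk_of_ruin_sum(B, p_win, max_m=5000):
    """Sum P(first hit ruin after exactly m wins and B + m losses) over m.

    A(m) = B/(B+2m) * C(B+2m, m) is the hitting-time count; each term is
    computed via lgamma to avoid overflowing the binomial coefficient.
    """
    p_lose = 1.0 - p_win
    total = 0.0
    for m in range(max_m + 1):
        n = B + 2 * m  # total hands: m wins plus B + m losses
        log_term = (log(B) - log(n)
                    + lgamma(n + 1) - lgamma(m + 1) - lgamma(n - m + 1)
                    + m * log(p_win) + (B + m) * log(p_lose))
        total += exp(log_term)
    return total

print(risk_of_ruin_sum(3, 0.55))   # should approach (Pl/Pw)^B = (0.45/0.55)^3
```

For Pw > 1/2 the truncated sum agrees with the closed form (Pl/Pw)^B given later in the thread, and for Pw <= 1/2 it sums to 1, matching mtthw's random-walk argument.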

*Originally posted by PBE6***Actually, the answer may not be 1 even for an infinite number of hands. Here's a draft formula I'm working out. The probability of winning "m" bets and losing B+m bets is:**

**P = A(m)*(Pw^m)*(Pl^(B+m))**

**where A(m) is the number of valid ways to arrange the wins and losses. This is a little bit complicated, because not all arrangements are valid (you can't ha ...[text shortened]... um of these terms from m = 0 to infinity. This sum converges for 0 < Pw < 1.**

As I belatedly realised, and posted above, you're right.

For the infinite case, you can use a difference equation. If R(B) is the probability of ruin when you start with B units, then conditioning on the outcome of the next bet gives:

R(B) = Pw R(B+1) + Pl R(B-1)

with the boundary condition R(0) = 1. You can find more details at http://www.fooledbyrandomness.com/gamblersruin.pdf (saves me typing out the argument), but for Pw > 1/2 you get:

R(B) = (1/Pw - 1)^B

- 22 Nov '06 17:32 / 1 edit

*Originally posted by mtthw***You've got to be careful with infinities.**

**Look at it this way. Let's say B = 1, and P_w = 0.1. The risk of ruin is *at least* 0.9, as there's a 0.9 chance of losing everything on the first turn. So it can't be 50%. It either has to be 1, or a function of B and P_w.**

You can't lose everything on the first turn if you can only bet one at a time... unless the question assumes you start with only one piece to bet with.

- 22 Nov '06 17:36

*Originally posted by uzless***You can't lose everything on the first turn if you can only bet one at a time... unless the question assumes you start with only one piece to bet with.**

As I said, let B = 1. I took one specific case and showed the answer couldn't be 50%. Therefore it can't be 50% in all cases.

- 22 Nov '06 17:40 / 1 edit