I was just thinking about gambling and calculating the risk of ruin. For simplicity's sake, let's consider a game with only two outcomes (win or lose), a probability of winning Pw = 1 - Pl, and a given number of betting units B to gamble with. What is the probability that you will lose all your betting units at some point in the game?
(I know approximate formulas exist for this, but can anyone come up with an analytical solution? It's more complicated than you'd think...)
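One quick way to get a feel for the finite-game version is simulation. A minimal Monte Carlo sketch, assuming unit bets (the function name and parameters are mine, not anything standard):

```python
import random

def ruin_probability(pw, bankroll, hands, trials=20_000, seed=1):
    """Estimate the chance of going broke at some point within
    `hands` unit bets, starting from `bankroll` units."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        b = bankroll
        for _ in range(hands):
            b += 1 if rng.random() < pw else -1
            if b == 0:  # busted somewhere along the way
                ruined += 1
                break
    return ruined / trials

print(ruin_probability(pw=0.5, bankroll=5, hands=200))
```

Not an analytical answer, of course, but handy for checking any formula against.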
Originally posted by PBE6
If the game goes on infinitely long, isn't the answer 1? (unless P_w = 1). It's basically a random walk, which will reach zero at some point.
If it's not infinitely long, what's the stopping criterion?
If that's not it, I think you need to be a bit more specific with the problem.
Originally posted by PBE6
How much can one bet at a time?
Originally posted by mtthw
Yes, you're right. I should have said "what's the risk of ruin after playing 'n' hands?"
Originally posted by mtthw
This is simply not true. It's only true if your expected value is negative, i.e. if Pw is smaller than Pl.
Originally posted by Darrie
I withdraw the previous statement here. While doing the calculation I realised you're quite right (though the answer is 1 if Pw = Pl = 1/2 as well). Sorry!
Originally posted by mtthw
Afraid not. A one-dimensional random walk (which this is, until you hit zero) given infinite time will hit all points an infinite number of times "almost surely" (i.e. with probability 1). As long as P_w is less than one you're stuffed in the long run. Infinity is a long time!

I would argue that for every bet, not only does it create an infinite number of lines that end in bankruptcy, it also creates an infinite number of lines that continue forever. That's why I reason it to be 50% over infinite hands.
Originally posted by uzless
You've got to be careful with infinities 🙂
Look at it this way. Let's say B = 1, and P_w = 0.1. The risk of ruin is at least 0.9, as there's a 0.9 chance of losing everything on the first turn. So it can't be 50%. It either has to be 1, or a function of B and P_w.
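To put a number on that first-turn case (B = 1, P_w = 0.1), a two-line simulation sketch:

```python
import random

rng = random.Random(0)
trials = 200_000
# With B = 1 and P_w = 0.1, you go broke immediately whenever the first bet loses.
busts = sum(rng.random() >= 0.1 for _ in range(trials))
print(busts / trials)  # close to 0.9, so the risk of ruin is at least 0.9
```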
Originally posted by mtthw
Actually, the answer may not be 1 even for an infinite number of hands. Here's a draft formula I'm working out. The probability of winning "m" bets and losing B+m bets (a net loss of B) is:
P = A(m)*(Pw^m)*(Pl^(B+m))
where A(m) is the number of valid ways to arrange the wins and losses. This is a little bit complicated, because not all arrangements are valid (the bankroll has to stay positive until the very last bet, so for instance you can't have B or more losses in a row before getting a few wins), but as an upper estimate for A(m) we can say it's the total number of ways to arrange "m" wins among all B+2m bets, which is equal to (B+2m) choose (m). The total risk of ruin is just the sum of these terms from m=0 to m=b as b --> infinity. This sum converges for 0 < Pw < 1.
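For what it's worth, the exact A(m) is a classic first-passage count: by the ballot problem / cycle lemma it works out to (B/(B+2m))·C(B+2m, m), since a valid sequence has B+2m bets in total (m wins and B+m losses) and must first hit zero on the last one. A sketch that sums the series numerically with this exact count (function name is mine; unit bets assumed; logs used to avoid overflow in the binomial):

```python
from math import exp, lgamma, log

def ruin_series(pw, B, terms=2000):
    """Partial sum of P(ruin) = sum_m A(m) * pw^m * pl^(B+m),
    with the exact A(m) = B/(B+2m) * C(B+2m, m)."""
    pl = 1.0 - pw
    total = 0.0
    for m in range(terms):
        n = B + 2 * m  # total bets in a sequence with m wins
        log_a = log(B) - log(n) + lgamma(n + 1) - lgamma(m + 1) - lgamma(B + m + 1)
        total += exp(log_a + m * log(pw) + (B + m) * log(pl))
    return total

# For Pw > 1/2 this agrees with the known gambler's-ruin closed
# form (Pl/Pw)^B; for Pw < 1/2 it sums to 1, as expected.
print(ruin_series(0.6, 3))  # ~ (0.4/0.6)**3 ≈ 0.296
```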
Originally posted by PBE6
As I belatedly realised, and posted above, you're right.
For the infinite case, you can use a difference equation. If R(B) is the probability of ruin if you start at B, then:
R(B) = Pw R(B+1) + Pl R(B - 1)
You can find more details at: http://www.fooledbyrandomness.com/gamblersruin.pdf (saves me typing out the argument), but for Pw > 1/2 you get:
R(B) = (1/Pw - 1)^B
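The closed form is easy to sanity-check against the difference equation directly (Pw = 0.55 is an arbitrary choice):

```python
# Verify that R(B) = (1/Pw - 1)^B solves R(B) = Pw*R(B+1) + Pl*R(B-1),
# with the boundary condition R(0) = 1.
pw = 0.55
pl = 1 - pw

def R(b):
    return (1 / pw - 1) ** b

assert R(0) == 1.0
for b in range(1, 10):
    assert abs(R(b) - (pw * R(b + 1) + pl * R(b - 1))) < 1e-12
print("recurrence holds; R(1) =", R(1))
```

(Substituting R(B) = r^B into the recurrence gives Pw·r² − r + Pl = 0, whose roots are r = 1 and r = Pl/Pw; the latter is the decaying solution used here.)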
Originally posted by mtthw
You can't lose everything on the first turn if you can only bet one at a time... unless the question assumes you start with only one piece to bet with.
Originally posted by uzless
As I said, let B = 1. I took one specific case and showed the answer couldn't be 50%. Therefore it can't be 50% in all cases.