- 14 Mar '11 00:37
Has this one been done?

You own a casino. A bright young employee suggests a new game. Players will pay a fee to enter. Then they engage in a series of turns that are, really, the same as 50/50 coin tosses, and they know this. If their first tail occurs on turn 1, they get $1. If their first tail is on turn 2, they get $2. If their first tail is on turn 3, they get $4. If it's on turn 4, they get $8. The potential winnings at each turn are double those of the turn preceding it. They can flip until they flip a tail, or they can declare a tail at any turn they have successfully reached and take the winnings for that turn. After they flip or declare a tail, the game is over.

You know what your target profit margin is, but to establish a profitable fee for this game, you need to know the break-even point: the point at which the income from the fee covers the payouts to winners over time. How much would you need to charge as an entry fee to break even in this game, over time? Is there more information you need before you can decide? If so, what is it? Are there any constraints you need to place on the game before you can set a fee? If so, what are the constraints?

- 15 Mar '11 08:59
Assumption: the payout is determined by the first instance of a tail being thrown, at which point the gambler can no longer flip the coin.

The expected payout {E(payout)} for any number of flips is $0.50 as Thomaster pointed out, for instance:

~ the probability of flipping a tail on the first attempt is 50% and the payout is $1 giving E(payout) of $0.50 (50% * $1)

~ the probability of flipping a tail on the second attempt is 25% and the payout is $2 giving E(payout) of $0.50 (25% * $2)

~ the probability of flipping a tail on the third attempt is 12.5% and the payout is $4 giving E(payout) of $0.50 (12.5% * $4)

~ and so on.

This can be expressed as a formula:

E(payout) for flip n = 0.5^n * $2^(n-1), which for any value of n gives an expected payout of $0.50.
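The constant per-flip term is easy to check numerically (a sketch; `term` is just my name for the nth summand):

```python
# E(payout) contribution of flip n:
# P(first tail on flip n) * payout for flip n = 0.5^n * 2^(n-1)
def term(n: int) -> float:
    return 0.5 ** n * 2 ** (n - 1)

print([term(n) for n in range(1, 6)])  # [0.5, 0.5, 0.5, 0.5, 0.5]
```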

HOWEVER, the expected payout for any one gambler is the sum of E(payout) for the values of n from 1 to z, where z is any positive integer. We need to sum the E(payout) values given the gambler has a 100% certainty of being paid out. The sum of probabilities (the 0.5^n part, expressed as sum(0.5^n) for n = 1 to z) approaches 1 but never quite reaches it for any finite z. But the casino must cover 100% of probable outcomes (to be sure of its potential commitments and profits), so we require a z value of infinity.

This results in a total expected payout (covering all possible outcomes) of $0.50 summed infinitely many times, i.e. infinite, which is nonsensical / contradictory. So the problem cannot be solved from a purely mathematical perspective.

But for practical purposes, if you choose a z value less than infinity, there is a 0.5^(z+1) probability the gambler will throw his first tail on the (z+1)th attempt, with an astronomical payout. So the casino will eventually lose, or the entry price is such that it is not economic for the gambler.

For instance, if you select a z value of 20 (to cover the first 20 flips), you have covered 99.999905% of all possible outcomes. At this comfort level, the entry price would need to be $10 for the casino to break even, and you can then add your profit margin to that. However, the casino has not covered the possibility of the first tail arriving on the 21st attempt, which would mean a payout of over $1m, so the casino will eventually lose (and the employee will lose his job!).

So for practical purposes you would enforce a limit of z throws, and the break-even entry price would be z * $0.50. So for a z value of 20, the break-even price is $10. Assuming the casino wants to make a margin of $1 per game (or 9.1% on the entry price), the entry price would be $11, with a chance of winning approx $0.5m.
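The capped game can be sketched in a few lines (a sketch assuming a fair coin and the payout schedule from the opening post; `break_even_fee` and `coverage` are names I've made up):

```python
# Capped St. Petersburg game: play stops after z flips.
def break_even_fee(z: int) -> float:
    # Sum of the per-flip expected payouts: z terms of $0.50 each.
    return sum(0.5 ** n * 2 ** (n - 1) for n in range(1, z + 1))

def coverage(z: int) -> float:
    # Fraction of all outcomes resolved within the first z flips.
    return 1 - 0.5 ** z

print(break_even_fee(20))  # 10.0
print(coverage(20))        # 0.9999990463... i.e. ~99.999905%
```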

So in answer to the question, pick a z value, pick a margin and good luck!

Andrew

P.S. Why the puzzles?

- 15 Mar '11 09:13 / 1 edit
Mathematical proof for z = 15, gamblers = 1m, entry price = $8 (margin of 50c).

Entry Fees collected = $8m.

Based on probabilities:

500k people collect $1 = $500k paid out

+ 250k people collect $2 = $500k paid out

and so on up to

31 people collect $16k = $508k paid out.

Sum of payouts (rounding winners to 0 dp) = $7.5m.

Margin = $0.5m, which equates to a margin of $0.50 * 1m gamblers. There is a little bit of rounding, but if I keep the figures in $ millions at 1 dp then it is OK.
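Andrew's worked example can be checked directly (a sketch; like the post, it rounds the expected number of winners at each flip count to whole people):

```python
# Check the z = 15 worked example: 1,000,000 gamblers at an $8 entry fee.
GAMBLERS = 1_000_000
FEE = 8
Z = 15

fees = GAMBLERS * FEE  # $8,000,000 collected
# Expected winners at flip n (rounded to whole people) times the payout 2^(n-1).
payouts = sum(round(GAMBLERS * 0.5 ** n) * 2 ** (n - 1) for n in range(1, Z + 1))

print(f"fees ${fees:,}, payouts ${payouts:,}, margin ${fees - payouts:,}")
# Payouts come to roughly $7.5m, leaving a margin of about $0.5m.
```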

Andrew

- 15 Mar '11 14:18 / 1 edit

*Originally posted by JS357*
**Maybe I didn't explain it clearly. Nobody leaves the game with less than a dollar, so a $0.50 fee loses at least $0.50 per play.**

I knew that. Having slept 18 hours, I hope I will do better now:

1/2 × 1 + 1/4 × 2 + 1/8 × 4 + ...

= 0.5 + 0.5 + 0.5 + ...

= inf.

The casino can either choose not to play this game, or set a fee of $inf+1.

- 15 Mar '11 15:45

*Originally posted by Thomaster*
**I knew that. Having slept 18 hours, I hope I will do better now: 1/2 × 1 + 1/4 × 2 + 1/8 × 4 + ... = 0.5 + 0.5 + 0.5 + ... = inf. The casino can either choose not to play this game, or set a fee of $inf+1.**

This is discussed at http://plato.stanford.edu/entries/paradox-stpetersburg/

It started with Bernoulli and led to the concept of decreasing marginal utility in a risk situation, because based simply on expected dollar return, the rational player would enter the game for any finite fee, but how many of us would risk a million dollars to play it?

- 15 Mar '11 22:28 / 1 edit
I think Andrew's suggestion of a maximum number of flips (and therefore a maximum possible payout) could make this game make sense.
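Bernoulli's resolution mentioned above can be sketched numerically: measured in log utility rather than dollars, the infinite sum converges, and the certainty equivalent comes out to just $2 (a sketch; natural-log utility and the function name are my assumptions):

```python
import math

# Expected log-utility of the uncapped game:
# sum over n of P(first tail on flip n) * ln(payout for flip n)
#   = sum over n of 0.5^n * (n - 1) * ln(2)
def expected_log_utility(terms: int = 200) -> float:
    return sum(0.5 ** n * (n - 1) * math.log(2) for n in range(1, terms + 1))

eu = expected_log_utility()
print(eu, math.exp(eu))  # eu converges to ln 2, so the certainty equivalent is $2
```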

I think the interesting problem here lies in making the game as interesting as possible for the consumer (a perceived low fee with perceived high payout possibilities) while keeping it profitable for the casino.

I probably wouldn't pay $2 for the chance to win $8, but I might pay $6 for the chance to win $2k, or $8 for the chance to win $33.

However, as your entry fee goes up, the geometric mean of your payout goes down, and a lower proportion of the consumer population wins.

- 16 Mar '11 09:26 / 1 edit
@JS: I'm not sure anyone would risk $1m to play an infinite game given there is a 75% chance of walking away with $2 or less. I think you will find a huge proportion { e.g. >99.9% } of the population would be sufficiently risk averse despite the theoretical possibility of winning infinite dollars.
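The 75% figure is a one-liner to verify (a quick sketch):

```python
# P(payout <= $2) = P(first tail on flip 1) + P(first tail on flip 2)
p = 0.5 + 0.25
print(p)  # 0.75
```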

Interesting (but long) article - well-intentioned, but I think it misses the point when it refers to diminishing marginal returns - I don't agree with that at all! That will kick in only once the gambler can no longer distinguish the value/worth of two differing prizes. I am referring to values like $140 billion versus $240 billion - you could never spend the difference of $100 billion in a lifetime, so the marginal benefit of the greater prize is zero. But I believe it is wrong to apply the argument to values less than, say, $1 billion.

@forkedknight: I agree - you need to strike a price where the upper limit is sufficiently tempting such that you don't mind risking a few bucks.