- 29 Mar '07 02:32 / 3 edits
Two envelopes, connected by a string, are in a box. One envelope contains $1 and the other one $10. Now a coin is tossed. If the result is "heads", another pair of envelopes is put into the box, this time with $10 and $100; otherwise the experiment is stopped.

If the experiment was not stopped, again a coin is tossed. If the result is "head" another pair of envelopes is put into the box, this time with $100 and $1000; otherwise the experiment is stopped.

If the experiment was not stopped, again a coin is tossed. If the result is "head" another pair of envelopes is put into the box, this time with $1000 and $10000; otherwise the experiment is stopped.

And so on, until we get a "tail".

You don't know how many envelopes there are in the box. You take a connected pair of envelopes without looking into the box, and open one envelope. Inside you see x dollars. Now you are given the choice to take the x dollars, or take the sum in the other envelope. What do you do?
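To make the setup concrete, the box-filling process can be sketched in a few lines of Python (my own illustration; `fill_box` is not part of the puzzle):

```python
import random

# Build the box as described: start with ($1, $10); while a fair coin
# keeps landing heads, add the next pair ($10, $100), ($100, $1000), ...
def fill_box(rng):
    box = [(1, 10)]
    while rng.random() < 0.5:   # "heads": add another pair
        low = box[-1][1]
        box.append((low, low * 10))
    return box

print(fill_box(random.Random(42)))  # e.g. [(1, 10)] or [(1, 10), (10, 100)], ...
```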

If your answer is "always swap envelopes", then why not take the other envelope in the first place?

- 29 Mar '07 03:23

*Originally posted by David113*
**Two envelopes, connected by a string, are in a box. One envelope contains $1 and the other one $10.**

x=1, 10, 100 => swap

x=1000 or above => stay

The odds of any envelope having 10000 are 1:16, and the payoff is only 10:1.

- 29 Mar '07 11:07

*Originally posted by luskin*
**If the amount you find is more than $1, then by swapping there is always a chance you'll end up with less. Since you don't know how many envelopes there are, you can't be sure there is any chance of doing better. So I'd say you only swap if you find $1.**

I was too hasty with that answer. I'll have another go.

Assume the amount you find in the envelope is $100. There is a 50% chance there will be an amount larger than $100 somewhere in the box, and if so then there's a 50% chance that you have $1000 in the other envelope. So the probability of $1000 is 0.25 and probability of $10 is 0.75.

The expectation if you switch is therefore..

(0.5*0.5*1000)+(0.75*10)=$257.50

In general, if you switch: expectation = 2.575*x
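luskin's arithmetic can be double-checked in a line or two of Python (my own check, taking the 0.25/0.75 split above as given):

```python
# Seeing $100: other envelope holds $1000 with probability 0.25, $10 with 0.75.
x = 100
e_switch = 0.25 * (10 * x) + 0.75 * (x / 10)
print(e_switch)  # 257.5
assert abs(e_switch - 2.575 * x) < 1e-9
```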

Always switch.

- 29 Mar '07 15:11

*Originally posted by luskin*
**There is a 50% chance there will be an amount larger than $100 somewhere in the box...**

How did you come up with that?

There's a 50% chance there will be 2 envelopes in the box ($1 and $10).

There's a 25% chance there will be 4 envelopes in the box (1 and 10, 10 and 100).

There's a 12.5% chance there will be 6 envelopes in the box (1 and 10, 10 and 100, 100 and 1000).

etc.

- 29 Mar '07 15:42 / 1 edit

*Originally posted by David113*
**Two envelopes, connected by a string, are in a box. One envelope contains $1 and the other one $10.**

Here's my go:

If you create a probability tree for the envelope pairs, you can see that the probability of the maximum cash value in the box being 10^n is 0.5^n (i.e. P(1/10) = 0.5^1, P(10/100) = 0.5^2, etc.). Now when you choose an envelope, you discover that the value inside is 10^n. The probability that you are dealing with the 10^(n-1)/10^n envelope is simply 0.5^n, and the probability that you are dealing with the 10^n/10^(n+1) envelope is 0.5^(n+1). Since all other options are excluded once you pick a given envelope, the normalized probabilities are:

P(10^(n-1)/10^n envelope) = 0.5^n / (0.5^n + 0.5^(n+1))

P(10^n/10^(n+1) envelope) = 0.5^(n+1) / (0.5^n + 0.5^(n+1))

Say you switch; to determine the expected value of switching we multiply the outcomes by their respective probabilities and add them up as follows:

E(switch to 10^(n+1)) = 10^(n+1) * 0.5^(n+1) / (0.5^n + 0.5^(n+1))

E(switch to 10^(n-1)) = 10^(n-1) * 0.5^n / (0.5^n + 0.5^(n+1))

Adding these two, we find that the expected value of switching is:

E(switch) = (0.5*10^(n+1) + 10^(n-1)) * 0.5^n / (0.5^n + 0.5^(n+1))

After crunching a few numbers in Excel, it turns out that E(switch) given an initial value of 10^n is equal to 34*10^(n-1), i.e. 3.4*10^n. Now, the expected value of staying is simply 10^n. Comparing the two, we find that:

n=1

E(switch) = 34 > E(stay) = 10

n=2

E(switch) = 340, E(stay) = 100

and, more generally:

E(switch) = 34*10^(n-1) > E(stay) = 10^n

Therefore, it is always advantageous to switch.
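PBE6's closed form is easy to verify numerically (a quick sketch of my own; this only checks the algebra under his single-pair-at-a-time reading, which he later corrects):

```python
# Check: E(switch | see 10^n) = (0.5*10^(n+1) + 10^(n-1)) * 0.5^n / (0.5^n + 0.5^(n+1))
# should equal 34 * 10^(n-1), i.e. 3.4 * 10^n, for every n.
def e_switch(n):
    p_low = 0.5 ** n          # weight of holding the 10^(n-1)/10^n pair
    p_high = 0.5 ** (n + 1)   # weight of holding the 10^n/10^(n+1) pair
    total = p_low + p_high
    return (10 ** (n + 1) * p_high + 10 ** (n - 1) * p_low) / total

for n in range(1, 8):
    assert abs(e_switch(n) - 3.4 * 10 ** n) < 1e-6 * 10 ** n
print(e_switch(1), e_switch(2))  # 34.0 340.0
```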

I'm not quite sure how to answer the second part: if it's always advantageous to switch envelopes then why not choose the other envelope in the first place? My first instinct is to say that if you continually second-guess your choice and therefore never make one, the expected value of your action is simply 0. Therefore it is always better to choose an envelope and then switch than to choose neither. I'm not sure how satisfying this answer is...but at least it's more satisfying than the question.

Another suggestion I've heard about this kind of problem is that people do not evaluate their desire for increasing sums of money using simple subtraction or division - a log scale is much more representative. For example, the desire for $10,000 would be closer to log(10,000) = 4, compared with the desire for $100,000, which would be log(100,000) = 5. To be sure, the larger sum is always desired more, but if we assume a human threshold where the perceived risks and rewards balance out in the chooser's head (say, if the ratio of the logs falls below 1.10:1, a difference of less than 10%), then we can decide whether or not the chooser would want to switch given a certain sum. In this problem, if the chooser found $1,000,000, the ratio of log E(switch) = log($3,400,000) = 6.531 to log E(stay) = log($1,000,000) = 6 is 6.531/6 = 108.9%, so the chooser would stay with $1,000,000.

- 29 Mar '07 17:19 / 2 edits
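The log-scale threshold rule described above can be sketched in a few lines of Python (my own illustration; the 1.10 cutoff and the 3.4 multiplier are the post's assumptions, and the function name is mine):

```python
import math

# Hypothetical decision rule: switch only if the ratio of log-utilities
# log10(E(switch)) / log10(E(stay)) meets a 1.10 threshold.
# E(switch) = 3.4 * x comes from the earlier single-pair analysis.
def should_switch(x, threshold=1.10, multiplier=3.4):
    return math.log10(multiplier * x) / math.log10(x) >= threshold

print(should_switch(100))        # ratio 2.531/2 ~ 1.27 -> True
print(should_switch(1_000_000))  # ratio 6.531/6 ~ 1.09 -> False
```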

*Originally posted by PBE6*
**Here's my go:**

Oops!! Misread the question...I thought only one pair of envelopes was in the box at once. I'll try again with the correct scenario.

EDIT: I think it's the same situation. Once you pick the envelope, you know you're dealing with either the smaller pair or the larger pair, and the calculations can be carried out as above.

- 29 Mar '07 18:34

*Originally posted by PBE6*
**Oops!! Misread the question...I thought only one pair of envelopes was in the box at once.**

Oops, wrong again. The probability that you are dealing with the 10^n/10^(n+1) envelope has changed because you have to take into account all the other envelopes in the box. Now we have:

P(picked 10^n/10^(n+1) envelope pair) = P(10^n/10^(n+1)) + P(10^(n+1)/10^(n+2)) + ...

= 0.5^(n+1) + 0.5^(n+2) + ...

= 0.5^(n+1)/(1-0.5)

= 0.5^n
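The geometric sum can be spot-checked numerically (a trivial sketch, not from the thread):

```python
# Numerically confirm 0.5^(n+1) + 0.5^(n+2) + ... = 0.5^(n+1)/(1-0.5) = 0.5^n
def tail_sum(n, terms=60):
    # 60 terms is plenty: the truncated remainder is 0.5^(n+60), far below
    # double-precision noise.
    return sum(0.5 ** k for k in range(n + 1, n + 1 + terms))

for n in range(1, 6):
    assert abs(tail_sum(n) - 0.5 ** n) < 1e-12
```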

The new normalized probabilities are now:

P(switch to 10^(n+1)) = 0.5^n / (0.5^n + 0.5^n) = 0.5

P(switch to 10^(n-1)) = 0.5^n / (0.5^n + 0.5^n) = 0.5

And the expected values change as well:

E(switch to 10^(n+1)) = 10^(n+1) * 0.5

E(switch to 10^(n-1)) = 10^(n-1) * 0.5

E(switch) = 10^(n+1) * 0.5 + 10^(n-1) * 0.5

= 0.5* 10^(n-1) * (100+1)

= 50.5 * 10^(n-1)

= 5.05 * 10^n

E(stay) = 10^n

So it's still advantageous to switch every time, as above.

- 29 Mar '07 18:46

*Originally posted by PBE6*
**Oops, wrong again. The probability that you are dealing with the 10^n/10^(n+1) envelope has changed because you have to take into account all the other envelopes in the box.**

I'm curious how confident you are in your math.

Pretend you were actually playing this game, and pulled out a pair of envelopes, one of which has $1,000,000,000 (10^9).

What are the chances of the other envelope having $10,000,000,000?

Would you bet $900,000,000 for a chance to win $9,000,000,000?

- 29 Mar '07 19:46

*Originally posted by richjohnson*
**I'm curious how confident you are in your math.**

Aha! Goofed again. However, the answer that I will give to your question will remain the same.

First off, I just drew myself a very enlightening picture, as follows:

- draw a square, this represents all the possibilities in the scenario

- draw a vertical line down the middle...the region on the left represents the probability that only the 1/10 envelope is present, the region on the right represents everything else

- draw a horizontal line through the middle of the region on the right...the region on top represents the probability that only the 1/10 and 10/100 envelopes are present, the region on the bottom represents everything else

- continue by dividing the region that represents "everything else" in half, adding one set of envelopes to one side, and calling the other side "everything else", ad infinitum (it creates a spiral pattern)

If you draw this diagram, you can see that if you have selected the sum 10^n you are either in the region where this sum is the maximum, or you are in the region where there is a greater sum present (the rest of the smaller rectangles). These two regions are the same size. Now, if you are in the region where 10^n is the maximum, there is only one choice so the entire 50% probability is assigned to switching to a lower sum. If you are in the other region, there is an equal chance that you have chosen either pair containing 10^n. Therefore, 25% is assigned to each possibility. Overall, the probability that you will switch to a lower value is 50% + 25% = 75%, and the probability that you will switch to a higher value is 25%. This is the same answer I get churning out the sum of the infinite series 0.5^(n+1) + 0.5^(n+2) + ...

The expected values then become:

E(switch) = 0.25 * 10^(n+1) + 0.75 * 10^(n-1)

= 2.5 * 10^n + 0.075 * 10^n

= 2.575 * 10^n

E(stay) = 10^n

So again, the best strategy is to switch. Now, as I mentioned in my first post, my desire for more money is more likely to follow a log function than a linear one, i.e. the relative utility of the money (my criterion of choice) will be the log of the value. If the relative increase in the utility of the money is above a certain threshold, I will switch. If it is not, I will not switch. Of course there is always a chance to get a much larger sum, but the amount of money being risked also increases with increasing "n". This system allows us to answer the question "when is enough enough?".
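Since several corrections have gone back and forth, here's a quick Monte Carlo sanity check of the 25%/75% split and the 2.575*x expectation (my own sketch; the names are mine, and it assumes the pair you grab is uniform over the pairs actually in the box):

```python
import random

# Simulate the full game. Pair k holds (10^k, 10^(k+1)); pair 0 is always
# present, and each further pair is added while a fair coin lands heads.
def play(rng):
    n_pairs = 1
    while rng.random() < 0.5:
        n_pairs += 1
    k = rng.randrange(n_pairs)           # grab one pair uniformly at random
    low, high = 10 ** k, 10 ** (k + 1)
    # Open one envelope of the chosen pair at random: (opened, other)
    return (low, high) if rng.random() < 0.5 else (high, low)

rng = random.Random(1)
others = [other for opened, other in
          (play(rng) for _ in range(1_000_000)) if opened == 100]
p_higher = sum(o == 1000 for o in others) / len(others)
e_switch = sum(others) / len(others)
print(f"P(other=$1000 | saw $100) ~ {p_higher:.3f}")
print(f"E(switch | saw $100) ~ {e_switch:.1f}")
```

On my reading the estimate lands near 0.26 rather than exactly 0.25, because drawing uniformly among however many pairs are present weights the conditioning by 1/N, which the equal-area argument glosses over. Either way, switching on $100 is worth roughly $260-270 on average, comfortably above $100, so the conclusion survives.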

(If you reject this strategy, then I would always switch. Even if I lost the gamble, $100,000,000 would come in quite handy to me.)

- 29 Mar '07 19:58

*Originally posted by richjohnson*
**I'm curious how confident you are in your math.**

Who in their right mind would risk 1 billion dollars for any more money?

It would be virtually impossible to spend 1 billion dollars on non-business-related expenses, so the choice is obvious.

Take the billion and retire.

- 29 Mar '07 20:07

*Originally posted by uzless*
**Who in their right mind would risk 1 billion dollars for any more money?**

If there's $1B in the envelope you open, then you're guaranteed a minimum of $222,222,221 if you switch. That's not half bad either, but why not risk it and see if you can get some serious bucks?

- 29 Mar '07 21:27 / 1 edit

*Originally posted by PBE6*
**So again, the best strategy is to switch.**

So, going back to the original question:

"If your answer is "always swap envelopes", then why not take the other envelope in the first place?"

If 2 of us are playing this game, and you get the contents of one envelope and I get the contents of the other envelope, do we both want to switch?

Can it always be better for both of us to switch?

Is it only better to switch if we get to see the value before making the choice?

- 30 Mar '07 14:22

*Originally posted by leisurelysloth*
**If there's $1B in the envelope you open, then you're guaranteed a minimum of $222,222,221 if you switch.**

You'd make good fodder for Deal or No Deal.

- 30 Mar '07 15:58 / 1 edit

*Originally posted by aging blitzer*
**So, going back to the original question: "If your answer is "always swap envelopes", then why not take the other envelope in the first place?"**

I think the answer is "yes, both players will still want to switch." As I recall, probabilities only have to add up for a single observer with one set of information; they do not sum between two observers with different sets of information about the same situation. So it's still perfectly reasonable for each individual to want to switch. What will happen in real life is that the net gain for one player will be a net loss for the other, and vice versa, and by symmetry it will balance out in the long run. But this is now a zero-sum game between two players, significantly different from the question as originally posed.

EDIT: Forgot to respond to the other questions...

In my original answer, I said that if you second-guess yourself every time you make a decision and keep oscillating back and forth, you'll never actually make a decision. The expected value of this is simply 0, so it's better to pick an envelope and then switch than to do nothing at all.

As for the second part of the question, if we never see the value of the money inside the envelope then we really haven't made a selection yet. No information flows from simply selecting an envelope; we can only make a proper decision once we open it.