- 19 Mar '09 11:18
Ok, this is a sort of old paradox and I came up with a possible solution. Since this solution is at odds with the general take on this problem, it's probably wrong and you can tell me where I'm wrong.

You are shown two envelopes (A and B) and you're told that one of them contains double the amount of the other.

You open an envelope (wlog we can label it A) and find the amount x. You're then given the choice to switch or not. What would you do? If your answer is "I should always switch", then why isn't it indifferent to just take B directly without opening it?

- 19 Mar '09 11:40 / 5 edits
Mmm... I require that the values are multiples of cents and the value of x expressed in cents is even (or you'd know the other envelope doesn't have half). Instead of cents we could use a grid of 0.00000000000000000000000000001 USD (or as small as you'd like); that doesn't change the results.

My answer just doesn't work if the possibilities within any given sub-interval are infinite.

Edit - It's not looking good so far, is it?

Edit - Obviously, in this version, if it's not even when expressed in the lowest possible unit, then you should always switch because you know the other envelope cannot have half. Still, it's not obvious why the answer is "indifference" when the value is even.

Edit - Mmm... But then there is information in the value being even. Which leads to preferring to "keep it" if even... - 19 Mar '09 13:06 / 1 edit

what is the question exactly? our goals in this problem require subtle distinction: probabilistically, provided all possible choices are "even," we haven't gained any new information that would change the probability of (wlog) A being more valuable than B. so if the goal is to "find the biggest envelope" then you might as well flip a coin, and switch or not switch. i don't think it matters, since we are not getting enough information to "remove a possible case" (unless i'm missing something important), as is so important in bayes theorem problems.*Originally posted by Palynka***Ok, this is a sort of old paradox and I came up with a possible solution. Since this solution is at odds with the general take on this problem, it's probably wrong and you can tell me where I'm wrong. ...[text shortened]... then why isn't it indifferent to just take B directly without opening it?**

HOWEVER, if our goal is to make our expected return in the game (i.e. amount of money we expect to win) as large as possible, a little investigation is in order. if the envelope you chose has value x, then the other envelope is either 2x, or x/2, with a 50% chance of each possibility. but then your expected value for switching envelopes is (2x + x/2)/2 = (5/4)x. but then your expected outcome is better than the known dollar amount from the envelope you originally picked. so you should always switch.
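The expected-value calculation above can be written out in a few lines of Python; this is only a sketch of the naive reasoning, with the disputed 50/50 assumption taken at face value and x = 100 chosen as an arbitrary example amount.

```python
# Naive expected value of switching: assume the other envelope holds
# 2x or x/2 with equal probability (the assumption under dispute).
x = 100.0
ev_switch = 0.5 * (2 * x) + 0.5 * (x / 2)
print(ev_switch)        # 125.0, i.e. (5/4)x
print(ev_switch > x)    # True, which seems to say "always switch"
```

Since (5/4)x exceeds x for any positive x, this calculation recommends switching no matter what amount is seen, which is exactly what makes it suspect.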

but this doesn't mean the other envelope is more likely to have more money in it than the envelope you currently know! it really has more to do with the fact that doubling your money by switching outweighs the negative consequence of cutting your money in half, provided the % chance of those possibilities are equal. in other words, doubling x has a larger net gain than the net loss of cutting x in half. - 19 Mar '09 17:44 / 1 edit

Ah, but the key is that you cannot have a uniform distribution over an infinite set. So what are you calculating your expected value based on?*Originally posted by Aetherael***HOWEVER, if our goal is to make our expected return in the game (i.e. amount of money we expect to win) as large as possible, a little investigation is in order. ...[text shortened]... in other words, doubling x has a larger net gain than the net loss of cutting x in half.**

You need to define a prior, update it with the information that the first envelope gives you and then calculate the posterior. - 20 Mar '09 00:12 / 1 edit

i still contend that you must re-examine what your goal is in this "game."*Originally posted by Palynka***Ah, but the key is that you cannot have a uniform distribution over an infinite set. So what are you calculating your expected value based on?**

You need to define a prior, update it with the information that the first envelope gives you and then calculate the posterior.

if you are trying to figure out which envelope has the most money, then i agree after a little thought that indeed (since there is a smallest element in the set) bayes will apply and the question of whether or not you are holding the largest envelope is an interesting one to tackle. however, i don't have the time or inclination to do so at the moment but i would love to see your ideas.

on the other hand, if you are trying to maximize your outcome over the long run of playing this game say, 1000 times (which is usually what questions like this are designed to examine)... then switching leads to the highest expected value. you're right that it won't be EXACTLY (5/4)x, but the infinitude of possible values the envelopes might have suggests that the effect of an even dollar amount in the envelope would cause only a negligible change in the probabilities of the second envelope being "higher" or "lower." this negligible change certainly wouldn't counteract the (5/4)x calculation, in my estimation.

in fact, if the posterior probability shift WERE to counteract the winnings potential of the doubled envelope, the probabilities would have to shift to a point where it was twice as likely that the envelope you originally picked is the larger envelope. this scenario would give an expected value of x if you switch envelopes, and therefore the choice would be arbitrary. any smaller shift in probability would favor a "switch," and any bigger shift in probability (making it very likely that the envelope you chose was the larger) would favor a "don't switch" choice. i haven't done the calculations you proposed in your last post, but i find it hard to believe that the change in probability thanks to bayes theorem would be that substantial, given the infinite set of possibilities. - 20 Mar '09 01:01 / 1 edit

I'll contend that the strategies are indifferent, and try to show why, but I'll wait to see if someone takes a crack at it.*Originally posted by Aetherael***i still contend that you must re-examine what your goal is in this "game."**

if you are trying to figure out which envelope has the most money, then i agree after a little thought that indeed (since there is a smallest element in the set) bayes will apply and the question of whether or not you are holding the largest envelope is an interesting one to tackle. ...[text shortened]... given the infinite set of possibilities

If you're curious, try writing down a MATLAB program where first A and B quantities are set and then you play a strategy of "always open and switch". Then compare it with a strategy of "choose randomly and stick with it". After thinking about it in this way, does the strategy of switching give you a higher expected value? - 20 Mar '09 08:39
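For anyone who wants to try the experiment Palynka describes without MATLAB, here is an equivalent sketch in Python. The amounts, trial count, and random seed are arbitrary choices made for illustration.

```python
import random

def play(n_trials, switch, x=100):
    """Two fixed amounts (x and 2x) are placed in envelopes at random;
    we open one and either switch or stick. Returns the average amount
    won per trial."""
    total = 0
    for _ in range(n_trials):
        envelopes = [x, 2 * x]
        random.shuffle(envelopes)   # randomly assign amounts to A and B
        picked, other = envelopes
        total += other if switch else picked
    return total / n_trials

random.seed(42)
n = 100_000
avg_switch = play(n, switch=True)
avg_stick = play(n, switch=False)
print(avg_switch, avg_stick)   # both near 150, i.e. 1.5x either way
```

Both strategies converge on the same average of 1.5x over many trials, which is the sense in which the two strategies come out indifferent once the amounts are fixed in advance.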

i understand your point that the final outcome SEEMS to be irrelevant in terms of the choice you make. since the dollar amounts are predestined and you're making a "random" choice either way, why wouldn't it be the same total outcome regardless of what you do?*Originally posted by Palynka***I'll contend that the strategies are indifferent, and try to show why, but I'll wait to see if someone takes a crack at it.**

If you're curious, try writing down a MATLAB program where first A and B quantities are set and then you play a strategy of "always open and switch". Then compare it with a strategy of "choose randomly and stick with it". After thin ...[text shortened]... ng about it in this way, does the strategy of switching give you a higher expected value?

but consider for a moment the fact that given any finite dollar amount that you see, when you are confronted with the choice between possibly doubling it, or possibly cutting it in half, in terms of dollar amounts you have more to gain than to lose. say you open the envelope and there is $1024 in it. now your choice is a 50/50 chance of either winning $1024 more, OR losing $512. and you are only allowed to make the decision once - no "double or nothing" choices afterwards, etc. i haven't done the MATLAB program as you suggested, but i believe that it will follow my predicted results, PROVIDED you keep track of two very important sums along the way: the running total sum of the envelopes you see, and the running total sum of the envelopes you don't see. that is somewhat logically different than the program you suggested, but i think gets at what i'm trying to describe as the reason why switching is advantageous.

for instance, during the simulation, imagine $16 is chosen 100 times. half those times there will be $32 in the other envelope, half the time there will be $8 in the other envelope. so we see a total of $1600, but if we add up all the money in the envelopes we don't see, it's $32*50 + $8*50 = $1600 + $400 = $2000 in the unseen envelopes. so why didn't we switch?
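The tally in the paragraph above can be written out explicitly. Note that the sketch builds in the 50/50 assumption for the unseen envelope, which is the very step the thread goes on to dispute; the figures ($16 seen, 100 occasions) are the example values from the post.

```python
# Suppose the opened envelope shows $16 on 100 occasions, and assume the
# unseen envelope holds $32 half the time and $8 the other half.
trials = 100
seen_total = 16 * trials                                # $1600 seen
unseen_total = 32 * (trials // 2) + 8 * (trials // 2)   # $1600 + $400
print(seen_total, unseen_total)   # 1600 2000
```

Under that assumption the unseen envelopes really do total more, which is why the conditional argument feels compelling; the open question is whether the 50/50 split survives a proper Bayesian update.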

it seems that when we pin down a finite dollar amount on one of the envelopes, the choice favors switching... and i think the difference in our reasoning may have something to do with the fact that in your setup of the problem the set of possible dollar values is only **semi**-infinite? we are dealing with a set that has a smallest value, but then are disallowed from having the smallest value be the envelope we choose. or perhaps the infinitude of the set (lack of a highest value) is causing a logical error on MY part. i'm not certain, and want to assure you i am not a person who is unwilling to listen or change - in fact i will be happy to be proven wrong!

i'm really just musing here and will continue to dwell on this question, and perhaps will find the time to run a simulation for future truth-seeking. - 20 Mar '09 10:24 / 1 edit

No problem, Aetherael. Like I said, the problem has been around for some time now.*Originally posted by Aetherael***and i think the difference in our reasoning may have something to do with the fact that in your setup of the problem the set of possible dollar values is only semi-infinite? ...[text shortened]... and perhaps will find the time to run a simulation for future truth-seeking.**

The gist of it is that a truly frequentist view of the problem says that such a problem is improperly stated (i.e. there is no uniform distribution over an unbounded set) and there is no "paradox" because there can be no such bet.

A (I'll argue naive) Bayesian interpretation says that it's a paradox in the sense that if you take any envelope, you know that you would switch no matter what the amount in the envelope, so you should be indifferent to switching with or without opening the envelope. But if you don't open the envelope and switch, a similar reasoning for the other envelope and so on will mean that you'll switch indefinitely.

I'll try to address what information the amount can give us and defend that the update on the prior will be such that it would make the decision maker indifferent between the choice of envelopes even after opening the first.

The mistake in the (naive) Bayesian approach would then be that the calculations use the prior to calculate expected gains, when they should be using the posterior.

- 20 Mar '09 23:18
The primary reason why this problem seems to have a counter-intuitive answer is that we have infinite possibilities.

**GOAL**- Maximize amount of monetary gain.

**PRE-PICK ANALYSIS**- There are infinite possibilities, given the vague nature of the test. All we know is we have a 50% chance of picking the richer envelope.

**POST-PICK ANALYSIS**- We are now left with 2 possibilities. Either the second envelope contains double or half the money. Since we have a 50% chance of either, we go to the expected return of a switch versus a keep.

If we are right to switch, we gain 100%.

If we are wrong to switch, we lose 50%.

The average return is a 25% profit.

**WHAT IF...**- Supposing though we had picked the other envelope to begin with? We would be in the same place, only with half or double the money. However, the situations are not the same, they are merely indistinguishable at the time we decide to switch (or not).

**EXCEPTION**- Suppose we are informed that the amount is a whole number of dollars (or cents even). If we pick an envelope with an odd number of dollars, we would therefore know that it is the low envelope and switch.

**PARADOX IF EVEN..**- If it is even, we cannot be sure. We can work with the assumption that the person running this would make both amounts an even number of dollars to prevent this possibility, but then we are able to chain the reasoning up, which itself leads to a new paradox not dissimilar to the "Unexpected Hanging" paradox, whereby a man is told that he will be hanged next week on a day he does not expect it.

The man eliminates Friday as that day, and proceeds to logically conclude via chain logic that he won't be hanged at all. Thus it comes as a surprise when the executioner comes for him as promised.

Same principle, only with powers of 2... However, it is different in that we don't know what the reasoning is of the person behind the envelopes, nor how deep they reason it.

Well enough rambling for now. Very interesting paradox indeed though.

(The new information gleaned is a specific monetary amount, btw. So it is not impossible that revealing this changes the picture. It's the how which isn't exactly clear...)

- 21 Mar '09 13:35 / 2 edits
As above, it comes down to the idea that if you win, you win 100% more, but if you lose, you only lose 50%?

So if your envelope contained $100 and you switched, you are risking $50 of it for a shot at an additional $100....

I see the paradox of picking the other envelope first though.... - 21 Mar '09 16:31 / 1 edit

What probability distribution are you using to calculate the expected value?*Originally posted by deriver69***I have a huge difficulty in seeing this as a paradox.**

If the envelopes contain x and 2x, then...

if we are right to switch we gain x, if we are wrong to switch we lose x.

Overall we would expect to end up with 1.5x

Edit - The paradox is in the fact that doing such a calculation means that you'd switch in every possible case. You should then be indifferent between switching before or after opening the envelope. The problem is that if you switch without opening then again a similar reasoning implies that switching is optimal in every possible case and the same indifference should arise, leading to swapping indefinitely. - 21 Mar '09 22:38 / 1 edit

I can't see the paradox still, and have had this mental block before. The problem is in using the percentages or proportions; it reminds me a little bit of the error in Simpson's paradox. Using amounts, the gain or loss is the same (e.g. £50, £100: you will either gain or lose £50).*Originally posted by Palynka***What probability distribution are you using to calculate the expected value?**

Edit - The paradox is in the fact that doing such a calculation means that you'd switch in every possible case. You should then be indifferent between switching before or after opening the envelope. The problem is that if you switch without opening then again a similar reasoning ...[text shortened]... n every possible case and the same indifference should arise, leading to swapping indefinitely.

Switching every time is clearly no different from sticking with the same one. You are in fact choosing one envelope by not choosing it first.

Forget probability distributions. Imagine doing it 100 times and you end up with the larger amount 50 times and the smaller amount 50 times.

(50 * 2x + 50 * x)/100 = 150x/100 = 1.5x
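The 100-play average above can be checked in a couple of lines; x = 1 is an arbitrary unit.

```python
x = 1.0
# 100 plays: 50 times we end with the larger amount (2x), 50 with the smaller (x).
average = (50 * 2 * x + 50 * x) / 100
print(average)   # 1.5, i.e. 1.5x whichever strategy we follow
```

Since the same average falls out whether we always switch or always stick, the percentage-based (5/4)x argument cannot be telling the whole story.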