1. Standard member PBE6
    Bananarama
    False berry
    Joined
    14 Feb '04
    Moves
    28719
    30 Mar '07 16:32
    Food for thought:

    http://consc.net/papers/envelope.html

    This paper states that problems like this with an infinite expected value, even with a proper distribution function to draw values from, lead to an apparent paradox that is merely counter-intuitive due to the "fuzzy arithmetic of infinity". In this respect it is similar to the St. Petersburg paradox.

    http://en.wikipedia.org/wiki/St._Petersburg_paradox
  2. Joined
    28 Nov '05
    Moves
    24334
    30 Mar '07 18:48
    Originally posted by PBE6
    As for the second part of the question, if we never see the value of the money inside the envelope then we really haven't made a selection yet. No information flows from simply selecting an envelope, we can only make a proper decision once we open it.
    But what are you (in your strategy) doing with the information?
    Nothing.
    You always switch.
    Why do you need to see the contents?


    Having a strategy implies you get more than one go.
    If you only ever get 1 shot, then in real life it does become more about utility and personal circumstances. And this is where the tension is in game shows like Millionaire and Deal or No Deal.
  3. Joined
    28 Nov '05
    Moves
    24334
    30 Mar '07 18:54
    Comparing this puzzle with the St. Petersburg paradox,

    if we (notionally) mark the envelopes with A for the lower value and B for the higher value

    Is the expected return from always choosing A infinite?
    Is the expected return from always choosing B infinite?

    if both answers are yes, then it makes no difference whether you switch or not....does it?
  4. Standard member PBE6
    Bananarama
    False berry
    Joined
    14 Feb '04
    Moves
    28719
    30 Mar '07 19:23
    Originally posted by aging blitzer
    But what are you (in your strategy) doing with the information?
    Nothing.
    You always switch.
    Why do you need to see the contents?


    Having a strategy implies you get more than one go.
    If you only ever get 1 shot, then in real life it does become more about utility and personal circumstances. And this is where the tension is in game shows like Millionaire and Deal or No Deal.
    That is exactly what I would do with the information - use it to determine the utility of the money to me given my circumstances, as stated previously.

    The game shows you mentioned do not operate in the same way as this problem. Here the possible sum can be infinite; on both game shows the maximum values of the prizes are known (among other details). If that is the case, there is no longer a perceived paradox.
  5. Joined
    28 Nov '05
    Moves
    24334
    30 Mar '07 20:38
    Originally posted by PBE6
    That is exactly what I would do with the information - use it to determine the utility of the money to me given my circumstances, as stated previously.
    Yes, in real life, fine.
    But back in the problem where the expectation is infinite,

    is the expected return from always choosing A infinite?
    is the expected return from always choosing B infinite?
    does switching make any difference...in the long run
  6. Standard member DeepThought
    Losing the Thread
    Quarantined World
    Joined
    27 Oct '04
    Moves
    87415
    31 Mar '07 02:59, 1 edit
    Originally posted by aging blitzer
    Yes, in real life, fine.
    But back in the problem where the expectation is infinite,

    is the expected return from always choosing A infinite?
    is the expected return from always choosing B infinite?
    does switching make any difference...in the long run
    The problem arises because the sum for the expected amount of money is infinite. This means that working out expected returns is going to give nonsense. Let's just find the expected number of envelope pairs instead. The probability that there is exactly one pair is 1/2, that there are exactly 2 is 1/4, and so on, so the expected number of pairs is given by:

    pairs = 1/2 + 2/4 + 3/8 + ... + n/2^n + ... = S(1/2)

    where S(x) = sum {n=0 ... infinity} nx^n

    Using the identity x*d/dx x^n = n*x^n we can work out S(x):

    S(x) = x d/dx sum {n=0 ... infinity} x^n

    The sum was worked out centuries ago and comes to 1/(1-x); doing the algebra gives:

    S(x) = x d/dx 1/(1 - x) = x / (1-x)^2

    And we can put in x = 1/2 to find that the expected number of envelope pairs is 2. Let's just assume that that is the actual number of pairs in the box and use the following strategy:

    If we see $1 we swap as the other envelope is known to have $10 in.

    If we see $10 then, as we expect 2 pairs of envelopes, there is a 50% chance the other envelope holds $100, so we may as well go for it; losing $9 isn't a disaster.

    If we see $100 or more then we expect that that is the largest amount of money present and keep it.
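    DeepThought's expected-pairs calculation can be checked numerically. This is a sketch added for illustration (the function names are made up, not from the thread); it compares a partial sum of n/2^n against the closed form S(x) = x/(1-x)^2 derived above.

    ```python
    # P(exactly n pairs) = 1/2^n, so the expected number of pairs is
    # S(1/2), where S(x) = sum_{n>=1} n*x^n = x / (1 - x)^2.
    def expected_pairs(terms=60):
        # Partial sum of n / 2^n; the tail beyond 60 terms is negligible.
        return sum(n / 2**n for n in range(1, terms + 1))

    def S(x):
        # Closed form obtained via x * d/dx [1/(1-x)].
        return x / (1 - x) ** 2

    print(expected_pairs())  # approaches 2
    print(S(0.5))            # 2.0
    ```

    Both routes agree that the expected number of envelope pairs is 2, the figure the strategy above is built on.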
  7. Joined
    15 Feb '07
    Moves
    667
    02 Apr '07 08:56
    The problem posted does have a finite answer, despite the tie-in with the St. Petersburg paradox.

    I play this game and pull out a pair, and open an envelope. Depending on the amount, I may have eliminated some possibilities for the number of heads (or I may not).

    Anyway, the expected payoffs (compared to the original amount):

    If I stay, then my payoff is 1.
    If I swap the envelope with $1, then my payoff is 10, as 1 is the low amount. (100% chance of improving)

    If I swap the envelope with $10, then what are my odds of improving?

    Well, I have a 50% chance that that was the only pair, in which case the other envelope holds the low amount.
    The other 50% of the time, I have equal odds of high or low, which means I have a 25% chance overall of improving my sum.

    That makes the expected payoff 10*25% + 0.1*75% or 2.575.

    Oddly enough, this same payoff holds for any greater value, with the key difference being that I know more and more heads were tossed.

    The determining factor in whether to swap or not seems to be the multiplier, which breaks even at 3.

    If it's less than 3, then you only swap with the low amount.
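    geepamoogle's arithmetic can be reproduced with a short script (a sketch added for illustration; `swap_payoff` is an invented name, not something from the thread). Swapping a non-minimum envelope gives a 25% chance of multiplying the amount seen by m and a 75% chance of dividing it by m:

    ```python
    # Expected payoff of swapping, relative to the amount seen, when the
    # pair multiplier is m: 25% chance the mate holds m times as much,
    # 75% chance it holds 1/m as much.
    def swap_payoff(m):
        return 0.25 * m + 0.75 / m

    print(swap_payoff(10))  # 2.575 -> beats the stay payoff of 1
    print(swap_payoff(3))   # 1.0   -> the break-even multiplier
    ```

    Setting 0.25*m + 0.75/m = 1 gives m^2 - 4m + 3 = 0, i.e. m = 3, confirming the break-even multiplier claimed above.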
  8. Joined
    12 Mar '03
    Moves
    44411
    02 Apr '07 09:32
    Originally posted by geepamoogle
    The problem posted does have a finite answer, despite the tie-in with the St Petersburg principle.

    I play this game and pull out a pair, and open an envelop. Depending on the amount, I may have eliminated some possibilities of the number of heads (or I may not..)

    Anyways, expected payoffs, (compared to the original amount).

    If I stay, then my pa ...[text shortened]... plier, which breaks even at 3.

    If it's less than 3, then you only swap with the low amount.
    "If it's less than 3, then you only swap with the low amount."

    Not only did I not understand the rest of your post, but surely this conclusion makes no sense. Which is 'the low amount'?
  9. Joined
    15 Feb '07
    Moves
    667
    03 Apr '07 05:54, 1 edit
    Originally posted by Mephisto2
    "If it's less than 3, then you only swap with the low amount."

    Not only did I not understand the rest of your post, but surely this conclusion makes no sense. Which is 'the low amount'?
    The problem as given works in this general way. For each pair of envelopes there is a multiplier at work. In this case, it's 10.

    So once you are aware of the amount in one envelope, you know either it's 10 times the amount in its mate, or else its mate is 10 times as much.

    If I find that the amount is the low amount ($1 in this case), then I know the other has $10.

    If the amount I find is anything greater, then it ends up I only have a 1-in-4 chance that the other envelope is greater, but the payoff is 10 times higher, which makes swapping the rational choice, especially in a repeated game. This is also true in the event you picked the other one in the first place; it is counterintuitive that this rule maximizes the average payoff, but the logic is sound nonetheless.

    But let's assume for the moment that the amount only triples each time, so that the first pair of envelopes has $1 and $3 respectively, the second has $3 and $9, the third pair $9 and $27, etc.

    And suppose you find $9 in the envelope. You still have 1-in-4 odds of improving your lot by picking the other, and will win $27 in this case. The other 3-in-4 times you'll wind up with $3 instead.

    So your expected winnings for swapping are ($27 + 3*$3)/4, or $9. So either choice is equally acceptable rationally. Of course, if you get the $1, then you'll always swap for $3.

    Now what if it only doubles each time ($1/$2, $2/$4, $4/$8, etc.)?

    And suppose you get a $4 envelope. By swapping you can expect to get $8 a quarter of the time, and $2 the rest of the time, for an expected payout of ($8 + 3*$2)/4 = $3.50, which means you'll get less on average than what you have now. Of course, if you get the $1 envelope here, swapping will always net you $2, because $1 is always the 'low amount'.
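    The dollar figures in the two examples above can be reproduced directly (an illustrative sketch; `expected_if_swap` is an invented name, not from the thread):

    ```python
    # Expected dollars after swapping an envelope showing `seen`, with a
    # 1-in-4 chance its mate holds m times more and a 3-in-4 chance it
    # holds m times less (the odds used in the posts above).
    def expected_if_swap(seen, m):
        return (seen * m + 3 * seen / m) / 4

    print(expected_if_swap(9, 3))  # 9.0 -> a wash when the multiplier is 3
    print(expected_if_swap(4, 2))  # 3.5 -> worse than keeping the $4
    ```

    At multiplier 3 the swap exactly breaks even, and below 3 it loses on average, matching the conclusion that you only always swap with the low amount.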