Could someone please explain to me the concept behind the following problem:

You're on a gameshow and you have to choose between three doors A, B, and C, one of which has a car behind it. You choose door C, but instead of showing you whether you were correct, the gameshow host opens door A, shows you that it has nothing behind it, and lets you choose again between B and C. Should you reconsider and choose door B, or should you stay with door C?

I know that statistically you should switch and choose door B, but I don't understand why. Could someone explain this concept? I don't mind if you explain it mathematically, or logically, or using a combination of the two, but I simply don't get it.

Thanks in advance for your time.

There are other threads on this subject, but it basically boils down to the fact that the "host" always opens a door with nothing in it.

If you expand it out to 100 doors, where you choose one and the host opens 98 doors with nothing behind them, it becomes more apparent: if you stick with your door, you keep your original 1/n probability, but if you switch, the odds swing to (n - 1):1 in your favour, i.e. a probability of (n - 1)/n of winning (99/100 in the 100-door case).
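The 100-door picture is easy to check with a quick simulation. A minimal sketch (the `play` helper and trial count are my own illustration, not from any poster):

```python
import random

def play(n_doors, switch, trials=100_000):
    """Estimate the win rate when the host opens every losing
    door except one and the player then sticks or switches."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(n_doors)
        pick = random.randrange(n_doors)
        # Only one other door survives the host's reveals, and it
        # hides the car exactly when the first pick was wrong.
        wins += (pick != car) if switch else (pick == car)
    return wins / trials

print(play(100, switch=False))  # ≈ 0.01
print(play(100, switch=True))   # ≈ 0.99
```

Sticking hovers around 1/100 and switching around 99/100, matching the odds argument above.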

Originally posted by forkedknight

That concept actually makes some sense, about expanding your initial game field to 100 doors, but where did you get that formula for the probability after switching? Taking the 100 doors scenario, wouldn't it be .01 probability (staying with the door you chose originally) and .5 probability (switching)?

edit- Never mind; you were calculating odds, not probabilities. Thanks.

You pick a door (one of three). You will always have a one in three chance of winning.

Your host always takes away one of the losing doors. You don't know which.

It looks like your chances are now one in two (50:50), but they're not. Here's why:

Your host removed a loser, but the odds were two to one that the car was behind one of the other doors, and they still are. You always have a one in three chance of having guessed right and a two in three chance of having guessed wrong.

You switch and join the two-out-of-three group. The removal of one loser didn't change those odds.

If you want help believing it, if not understanding it, think of this more extreme scenario:
A person has you pick one card out of a deck, telling you you're trying to find the ace of spades. You pick one, but instead of turning it over, he reveals 50 of the 51 cards left in the deck, none of them the ace of spades, leaving one card face down in the deck plus the one you chose to start with. It's fairly clear that the card still in the deck is more likely to be the ace of spades.
But anyway, the correct answer to the gameshow problem is not a flat "you should switch"; it depends on some assumptions that need addressing:
1) The gameshow host knows which door the car is behind. If he doesn't, and merely happened to open an empty door, then the probability IS 50/50.
2) The gameshow host always opens a door. If this is not true, then one must also consider:
3) whether the gameshow host wants you to win. If he only opens a door when your first pick is wrong, there's a 100% chance of winning by switching; if he only does it when your pick is right, there's a 100% chance of winning by sticking.
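Assumption 1, the ignorant host, can be checked numerically. A sketch, assuming games where the host accidentally reveals the car are thrown out (the `ignorant_host` helper is mine, not from the thread):

```python
import random

def ignorant_host(trials=200_000):
    """Host opens one of the two unchosen doors at random; games
    where he accidentally reveals the car are discarded."""
    kept = switch_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == car:
            continue  # host exposed the car; the game is voided
        kept += 1
        other = next(d for d in range(3) if d not in (pick, opened))
        switch_wins += (other == car)
    return switch_wins / kept

print(ignorant_host())  # ≈ 0.5, not 2/3
```

Conditioning on the host's lucky goat reveal really does drop the switching advantage to 50/50, exactly as point 1 claims.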

I usually use the example forkedknight gave, but another way of thinking about it is to consider the probability of winning by NOT switching in a frequentist way (imagine you repeat the experiment many times).

If you NEVER switch, how many times will you win? Exactly 1/3 of the time.
If you ALWAYS switch, how many times will you win? It must be the remainder, as there are no other options, so it must be 2/3 of the time.

PS: All this again assumes the host always opens a door without a prize behind it, as other comments mention.
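The never-switch vs. always-switch frequencies can be reproduced by simulating the full game, host included (a sketch; the `run` helper and trial count are my own):

```python
import random

def run(switch, trials=100_000):
    """Play the classic 3-door game many times with an explicit
    host who always opens an unchosen losing door."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        # Host opens a door that is neither the pick nor the car.
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(run(switch=False))  # ≈ 1/3
print(run(switch=True))   # ≈ 2/3
```

The two estimates sum to 1, which is the point of the remainder argument: every game lost by sticking is won by switching, and vice versa.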

If there are n doors, then your initial chance of guessing right was 1/n. The host opens all but one of the other doors, all of them losers, and asks you to choose between your initial choice and the door that is left.

In other words, he's now asking you to consider this question: "did you guess RIGHT the first time, or did you guess WRONG the first time?" The probability you guessed right was 1/n, so the probability you guessed wrong was (n-1)/n. Switching doors is akin to saying "my initial choice was wrong."

And in the classic version of this problem, n = 3, so your initial guess has a probability of 1/3 of being right and 2/3 of being wrong... so switching doubles your chances of winning.
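The 1/n vs. (n-1)/n claim can also be verified exactly, with no randomness, by enumerating every equally likely (car position, first pick) pair (the `exact_win` helper is my own illustration):

```python
from fractions import Fraction

def exact_win(n, switch):
    """Exact win probability over all n*n equally likely
    (car position, first pick) pairs; the host then opens
    every losing door except one."""
    wins = total = 0
    for car in range(n):
        for pick in range(n):
            total += 1
            # Switching wins iff the first pick was wrong.
            wins += (pick != car) if switch else (pick == car)
    return Fraction(wins, total)

print(exact_win(3, switch=False))   # 1/3
print(exact_win(3, switch=True))    # 2/3
print(exact_win(100, switch=True))  # 99/100
```

Using exact fractions makes the doubling at n = 3 (1/3 vs. 2/3) visible without any simulation noise.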

A derogatory term for a tabletop RPG that is far too easy and therefore poses no challenge to its players; a 'Monty Haul' game quickly becomes boring once its players become the most powerful things in the game world.
Named for Monty Hall, host of the TV game show "Let's Make a Deal".

Originally posted by Palynka

Of all the explanations I've heard of why you should always switch, this is the most straightforward. Not sure why I've never seen it before now.
