Simple gambling problem

Posers and Puzzles

Originally posted by mtthw
You do realise this is the definition of probability, don't you? This means that the probability is 2/3. This is what the question is asking.
I get that. Really I do. But if you apply it in practice, it fails. 2/3 is saying that 2 out of 3 times I lose, right? Well, what happens on the first and only time? That's still a 50/50 chance.

Are you saying that if I lose the first time, and a silver side shows again, I have a 50/50 chance then?

Originally posted by brobluto
Are you saying that if I lose the first time, and a silver side shows again, I have a 50/50 chance then?
No, you always have a 33% chance to win.

Originally posted by brobluto
I get that. Really I do. But if you apply it in practice, it fails. 2/3 is saying that 2 out of 3 times I lose, right? Well, what happens on the first and only time? That's still a 50/50 chance.

Are you saying that if I lose the first time, and a silver side shows again, I have a 50/50 chance then?
You say it's a one time event, well, imagine standing at the back of a line of people all participating in the card game. Each person has one bash at the game. As the queue grows shorter you start to notice that 2/3 of the people are handing over money while only 1/3 are collecting. Aren't these all one time events? I don't know about you but I would step out of the line at that point.
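
The queue picture is easy to check by simulation. The sketch below rests on an assumption not restated in this excerpt: the game under discussion is the classic three-card setup (one card silver on both sides, one silver/gold, one gold on both sides), a random side of a random card is shown, and a player who sees silver bets that the hidden side is gold.

```python
import random

# Monte Carlo sketch of the queue above. Assumption (from earlier in the
# thread, not restated here): three cards -- silver/silver, silver/gold,
# gold/gold. A random side of a random card is shown; when it is silver,
# the player bets the hidden side is gold.
CARDS = [("silver", "silver"), ("silver", "gold"), ("gold", "gold")]

def play_once(rng):
    """Return True (win) / False (lose), or None if no silver side showed."""
    card = rng.choice(CARDS)
    side = rng.randrange(2)              # which side faces up
    shown, hidden = card[side], card[1 - side]
    if shown != "silver":
        return None                      # the bet is only offered on silver
    return hidden == "gold"              # the player wins on gold

rng = random.Random(0)
results = [r for r in (play_once(rng) for _ in range(100_000)) if r is not None]
loss_rate = results.count(False) / len(results)
print(f"fraction of the queue that loses: {loss_rate:.3f}")  # close to 2/3
```

Run it with any seed and the losing fraction stays near 2/3, matching the queue observation, even though every individual turn is a one-off event.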

Originally posted by brobluto
I get that. Really I do. But if you apply it in practice, it fails. 2/3 is saying that 2 out of 3 times I lose, right? Well, what happens on the first and only time? That's still a 50/50 chance.

Are you saying that if I lose the first time, and a silver side shows again, I have a 50/50 chance then?
No, I'm saying that it's not a 50/50 chance, it's a 2/1 (two-to-one against) chance. Yes, you'll either win or lose. But it's completely misleading to say it's 50/50. Because to me, 50/50 means "one or the other with equal likelihood". Tossing a coin is 50/50. If by 50/50 you just mean "one or the other", then fair enough. Personally I don't think that's a useful concept.

We're not arguing about the calculation any more, though, so I'll leave it at that.

Originally posted by brobluto
I get that. Really I do. But if you apply it in practice, it fails. 2/3 is saying that 2 out of 3 times I lose, right? Well, what happens on the first and only time? That's still a 50/50 chance.

Are you saying that if I lose the first time, and a silver side shows again, I have a 50/50 chance then?
Let's see if we can start from somewhere we agree on something, and work backwards to find the disagreement. You have no problem with the concept of probability applied to a large number of trials, so let's start there.

Say we run an experiment several times that allows only two outcomes, A and B. The probabilities of each outcome occurring are determined from their frequency during the experimental run. We designate these probabilities as P(A) and P(B), with the identity P(A) + P(B) = 1. We also run the experiment several more times, do some statistical analysis, and find that the probabilities are in excellent agreement each time. We are quite certain that in a series of N experiments, the number of times we get A will be approximately N*P(A), and the number of times we get B will be approximately N*P(B).

Now suppose we are interested in how many A's we can get in a row, starting with the first outcome. If we run M experiments, we predict the probability of getting all A's will be P(A)^M. If we run M-1 experiments, we predict the probability of getting all A's will be P(A)^(M-1). We can keep reducing the number of experiments all the way down to 1, in which case we predict the probability of A coming up on the first try is P(A)^1 = P(A).

OK, now we'll see where the disagreement lies.

(a) Do you accept that the probability of getting M A's in a row can be described as P(A)^M?

(b) Do you accept that the probability of getting M-1 A's in a row can be described as P(A)^(M-1)?

(c) Do you accept that the probability of getting 1 A in a row can be described as P(A)^1 = P(A)?
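
As an illustration of (a), (b) and (c), a two-outcome experiment can be simulated with an arbitrary bias; the value P(A) = 2/3 below is a hypothetical choice for the sketch, not a number taken from the problem.

```python
import random

# Check that M A's in a row occur with frequency close to P(A)**M.
# P_A = 2/3 is an illustrative choice, not a value from the problem.
P_A = 2 / 3
TRIALS = 100_000
rng = random.Random(1)

def all_As(m):
    """Run m independent two-outcome trials; True if every outcome is A."""
    return all(rng.random() < P_A for _ in range(m))

for m in (1, 2, 3):
    observed = sum(all_As(m) for _ in range(TRIALS)) / TRIALS
    print(f"M={m}: observed {observed:.3f}, predicted {P_A**m:.3f}")
```

The M=1 row is exactly question (c): the frequency over single trials matches P(A) itself.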

Originally posted by PBE6
New problems! Exquisitely simple, but hopefully illuminating.

1) What is the probability of a standard roulette wheel (36 numbers, plus 0 and 00 for a total of 38 possibilities):

(a) landing on 7 on the first spin?
(b) landing on 7 on the second spin, given that it already landed on 7 on the first spin?

2) What is the probability of:

(a) being dealt ...[text shortened]... single deck, if you are playing alone but the dealer has revealed one of his cards to be an Ace?
One more new problem! This one is called the Mr. Smith 3-Card Monty problem.

I have 3 cards: the Ace of Hearts, the Ace of Clubs, and a Joker. I pick 2 of these cards at random (each 2-card hand is equally likely to be chosen, so the chance of any particular hand is 1/3), and look at them. I then tell you that I hold at least one ace.

(a) What is the chance the other card I hold is also an ace?

(b) I now tell you that the ace I am speaking of is actually the Ace of Hearts. What is the chance the other card I hold is an ace?
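
The answer can be read off by listing the three equally likely hands; nothing below goes beyond the problem statement above.

```python
from itertools import combinations

# Enumerate the three equally likely 2-card hands from the problem above.
cards = ["AH", "AC", "J"]                 # Ace of Hearts, Ace of Clubs, Joker
hands = list(combinations(cards, 2))      # ("AH","AC"), ("AH","J"), ("AC","J")

# (a) condition: at least one ace (true of every hand here)
with_ace = [h for h in hands if "AH" in h or "AC" in h]
both_aces = [h for h in with_ace if "J" not in h]
print(f"(a) {len(both_aces)}/{len(with_ace)}")    # (a) 1/3

# (b) condition: the Ace of Hearts specifically is held
with_ah = [h for h in hands if "AH" in h]
both_aces_ah = [h for h in with_ah if "J" not in h]
print(f"(b) {len(both_aces_ah)}/{len(with_ah)}")  # (b) 1/2
```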

Originally posted by PBE6
Let's see if we can start from somewhere we agree on something, and work backwards to find the disagreement. You have no problem with the concept of probability applied to a large number of trials, so let's start there.

Say we run an experiment several times that allows only two outcomes, A and B. The probabilities of each outcome occurring are determined ...[text shortened]... u accept that the probability of getting 1 A in a row can be described as P(A)^1 = P(A)?
I'm sure that the math is right.

My dilemma is: if I take the bet and bet $3, and the other side is not silver, I will win $3. If the other side is silver, I will lose $3. He's not going to give me $2 if I win, or only take $1 if I lose. The outcome will be silver or gold, 50/50.

Please don't try to apply the lottery or anything else again. I am not saying the outcome is win or lose, the outcome is silver or gold, and that's where the 50/50 comes in (on the number of possibilities it could be), not on the win or lose part.
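
For what it's worth, the dollar amounts don't change the analysis: taking the thread's win probability of 1/3 as given (it is not re-derived here), the even-money $3 bet loses a dollar per play on average.

```python
# Expected value of the even-money $3 bet, using the thread's win
# probability of 1/3 (taken as given, not derived here).
p_win = 1 / 3
stake = 3                                  # dollars: win +3 or lose -3

ev = p_win * stake + (1 - p_win) * (-stake)
print(f"expected value per bet: ${ev:.2f}")   # $-1.00
```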

Originally posted by brobluto
I'm sure that the math is right.

My dilemma is: if I take the bet and bet $3, and the other side is not silver, I will win $3. If the other side is silver, I will lose $3. He's not going to give me $2 if I win, or only take $1 if I lose. The outcome will be silver or gold, 50/50.

Please don't try to apply the lottery or anything else again. I am not ...[text shortened]... he 50/50 comes in (on the number of possibilities it could be), not on the win or lose part.
The fact that there are two outcomes doesn't make them equally probable. Of course there are only two, it's either gold or silver, but silver is more probable. The main task of probability theory is comparing the probabilities of events, isn't it?

Originally posted by brobluto
I'm sure that the math is right.

My dilemma is: if I take the bet and bet $3, and the other side is not silver, I will win $3. If the other side is silver, I will lose $3. He's not going to give me $2 if I win, or only take $1 if I lose. The outcome will be silver or gold, 50/50.

Please don't try to apply the lottery or anything else again. I am not ...[text shortened]... he 50/50 comes in (on the number of possibilities it could be), not on the win or lose part.
Just try answering (a), (b) and (c), will ya?

Originally posted by PBE6
One more new problem! This one is called the Mr. Smith 3-Card Monty problem.

I have 3 cards: the Ace of Hearts, the Ace of Clubs, and a Joker. I pick 2 of these cards at random (each 2-card hand is equally likely to be chosen, so the chance of any particular hand is 1/3), and look at them. I then tell you that I hold at least one ace.

(a) What is the chance the othe ...[text shortened]... m speaking of is actually the Ace of Hearts. What is the chance the other card I hold is an ace?
(a) 1/3

(b) 1/2

I think.

Originally posted by PBE6
One more new problem! This one is called the Mr. Smith 3-Card Monty problem.

I have 3 cards: the Ace of Hearts, the Ace of Clubs, and a Joker. I pick 2 of these cards at random (each 2-card hand is equally likely to be chosen, so the chance of any particular hand is 1/3), and look at them. I then tell you that I hold at least one ace.

(a) What is the chance the othe ...[text shortened]... m speaking of is actually the Ace of Hearts. What is the chance the other card I hold is an ace?
(a) 1/3 [AA AJ AJ]

(b) 1/2 [AA AJ]

If you are telling the truth both times.

Originally posted by brobluto
(a) 1/3 [AA AJ AJ]

(b) 1/2 [AA AJ]

If you are telling the truth both times.
You got this one right. How come you don't get the original problem? It's the same logic, except there you have to look at sides instead of cards: each side is defined regardless of the other side, so choosing a silver side is an event, rather than choosing the SG card. That makes 3 possible events, and 3 is odd, so it can't be 50/50.
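
The side-counting argument can be written out explicitly. Assumption carried over from the thread: the cards are silver/silver (SS), silver/gold (SG) and gold/gold (GG), and a silver side is face up. Each side is equally likely to be the one showing, so the sample space is the set of silver sides, not the set of cards.

```python
# Enumerate the equally likely silver sides (assuming cards SS, SG, GG).
# The SS card contributes two silver sides; the SG card contributes one.
silver_sides = [
    ("SS", "silver"),   # SS card, side 1 up -> other side silver
    ("SS", "silver"),   # SS card, side 2 up -> other side silver
    ("SG", "gold"),     # SG card, silver up -> other side gold
]
wins = sum(1 for _, other in silver_sides if other == "silver")
print(f"P(other side silver) = {wins}/{len(silver_sides)}")  # 2/3, not 50/50
```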

Originally posted by Green Paladin
(a) 1/3

(b) 1/2

I think.
Correct!

Originally posted by brobluto
(a) 1/3 [AA AJ AJ]

(b) 1/2 [AA AJ]

If you are telling the truth both times.
Correct!

Originally posted by PBE6
Correct!
So, would you agree that it would be a fair bet if I said that I hold the Ace of Hearts in my hand and I bet you even money that the other card is also an Ace? (assuming that I don't know what the other card is, of course)
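
A quick check of the proposed wager, taking the 1/2 answer from part (b) above as given: at even money the expected value is zero, so on that answer the bet is indeed fair.

```python
# Fairness check for the even-money bet above: given the Ace of Hearts is
# held, the other card is the Ace of Clubs or the Joker with probability
# 1/2 each (the part (b) answer, taken as given).
p_other_ace = 1 / 2
stake = 1                                  # even money: win +1 or lose -1

ev = p_other_ace * stake + (1 - p_other_ace) * (-stake)
print(f"expected value: {ev:+.2f}")        # +0.00, a fair bet
```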
