Imagine you have 100 boxes. In 1 of the boxes is a red ball; the other 99 have a black ball inside.
You are given 1 of the 100 boxes. Odds on having the red ball are 100/1.
Then you have to open boxes from the other 99.
You open 98, and all have been black.
You now have your box and the one box left from the remaining 99.
What's more likely? That you chose the box with the red ball in it at the start (100/1), or that the box left from the other 99 (1/99) has the red ball in it?
Is it now a 50/50 chance between these 2 boxes as to which one has the red ball??
Do you go with the 100/1 chance or the 1/99 chance?
Is it 50/50?
I have to vote 50/50 at this point. All we have are 2 boxes: one with the black ball and one with the red ball.
It went from 99 to 1 all the way down to 1 to 1. This is the only time in the experiment that the odds were 50/50.
I have been reading a book on a betting system and the author states: "...If the house wins 4 hands in a row, and you know this occurs 6.25% of the time, you also know that to reach that point the house has already won three hands in a row, or 34.375%. You add the additional 6.25% to that amount, making it 40.625%. Then subtract the 40.625% from the 50% chance of the house win occurrences, which gives you a -9.375% chance against another win for the house. Add that amount to the 50% chance that the player will win a hand, and you get a 59.375% opportunity for a player win. It's just that simple!" :o
It's just that simple?
Bayes, RBH, Ego, Mr. Orr, and all other math guys (forgive me if I didn't mention your name and you're a math guy), help me out here. This can't be right, right?
GLC
Not sure I understand, but if you have drawn 98 black and have 2 boxes left, there is a 50% chance of drawing the red on the first of those two.
The chance of drawing black 98 times in a row (never hitting the red) must be smaller than 50%; the chance of hitting the red increases with every draw, as the proportion of red goes up (we do not put the balls back).
We should draw the red before the 98th box in most cases.
It's just that I watched a show (UK) and it said ALWAYS swap your box. I just wondered what you guys thought.
This seems to be some kind of Monty Hall problem?
cheers
hans
@ Twisteruk, I remember reading somewhere on the net about a game show where the odds of winning the prize behind the three doors were greater if you swapped... It was a curious thing, backed by statistical evidence... I will try and find the article and post it. Had me scratching my head for quite a while... I even tried to design a system based on the phenomenon.
Quote from: Twisteruk on Sep 22, 10:29 PM 2012
Imagine you have 100 boxes. In 1 of the boxes is a red ball; the other 99 have a black ball inside.
You are given 1 of the 100 boxes. Odds on having the red ball are 100/1.
Then you have to open boxes from the other 99.
You open 98, and all have been black.
You now have your box and the one box left from the remaining 99.
What's more likely? That you chose the box with the red ball in it at the start (100/1), or that the box left from the other 99 (1/99) has the red ball in it?
Is it now a 50/50 chance between these 2 boxes as to which one has the red ball??
Do you go with the 100/1 chance or the 1/99 chance?
Is it 50/50?
Paul, the problem as stated is ambiguous. The answer hinges on the statement I highlighted - you are given one of the boxes, but does whoever gave you the box know what ball is inside?
In other words, if you're never given the box with the red ball in, then you should always change, but if the box you're given is selected randomly, the chance is 50:50 and there's no point in switching.
Yep, this is a version of the Monty Hall problem.
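If anyone wants to check the "host knows" case for themselves, here's a quick Monte Carlo sketch in Python (the setup and names are just mine for illustration):

import random

def trial_host_knows(n_boxes=100):
    # One round: the red ball is hidden at random, you pick a box, and a
    # host who knows the contents opens 98 of the other boxes, always
    # revealing black. Returns True if switching to the last box wins.
    red = random.randrange(n_boxes)
    yours = random.randrange(n_boxes)
    # The host leaves closed the red box (if you don't hold it),
    # otherwise an arbitrary black box.
    remaining = red if yours != red else (yours + 1) % n_boxes
    return remaining == red

trials = 100_000
wins = sum(trial_host_knows() for _ in range(trials))
print(f"Switching wins {wins / trials:.1%} of the time")  # ~99%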
Quote from: GLC on Sep 23, 01:12 AM 2012
I have been reading a book on a betting system and the author states: "...If the house wins 4 hands in a row, and you know this occurs 6.25% of the time, you also know that to reach that point the house has already won three hands in a row, or 34.375%. You add the additional 6.25% to that amount, making it 40.625%. Then subtract the 40.625% from the 50% chance of the house win occurrences, which gives you a -9.375% chance against another win for the house. Add that amount to the 50% chance that the player will win a hand, and you get a 59.375% opportunity for a player win. It's just that simple!" :o
It's just that simple?
George,
No, it isn't that simple. ::)
In the first place, there's no such thing as a negative probability, so -9.375% is meaningless. Also, it would help if I knew something about the game, but in any case you can't simply add and subtract probabilities in that way - it's the classic gambler's fallacy.
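A two-line check makes the point (assuming independent 50/50 hands, which is roughly the situation the author describes):

# P(4 house wins in a row) = 6.25%, but that's measured from BEFORE the
# streak starts. Once three wins have already happened, the chance of a
# fourth is conditional:
p_4_in_a_row = 0.5 ** 4   # 0.0625
p_3_in_a_row = 0.5 ** 3   # 0.125
print(p_4_in_a_row / p_3_in_a_row)  # 0.5 -- still a coin flip, not 59.375%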
Quote from: Bayes on Sep 23, 08:22 AM 2012
Paul, the problem as stated is ambiguous. The answer hinges on the statement I highlighted - you are given one of the boxes, but does whoever gave you the box know what ball is inside?
In other words, if you're never given the box with the red ball in, then you should always change, but if the box you're given is selected randomly, the chance is 50:50 and there's no point in switching.
Yep, this is a version of the Monty Hall problem.
My poor choice of words, sorry.
No one knows which box contains the red ball.
I would swap... look at it this way.
If you had the choice of taking the 99/100 pile of boxes or the 1/100 box at the start, what would you take? The 99/100 of course. So from my point of view you're still picking from the 99% odds pile, right down to the last box.
MM
Quote from: Bayes on Sep 23, 08:32 AM 2012
George,
No, it isn't that simple. ::)
In the first place, there's no such thing as a negative probability, so -9.375% is meaningless. Also, it would help if I knew something about the game, but in any case you can't simply add and subtract probabilities in that way - it's the classic gambler's fallacy.
The game is blackjack. It's supposed to be a winning system for those who don't count cards. His betting system is based on betting against the dealer winning hands in a row. He says that out of a million hands the dealer wins 1 in a row 12.5% of the time, 2 in a row 12.5% of the time, 3 in a row 9.375% of the time, and 4 in a row 6.25% of the time. His theory is basically to flat bet until the dealer has won 3 bets in a row and then increase your bets, because the odds are supposedly in your favor that he won't win 4 in a row.
I quote from the book: "As you can see, the more straight wins the house gets in a row, the greater the chance the player will win the next hand. Based on this information, you can now construct a base-betting schema that will help you determine what your next bet needs to be based on outcomes." ::)
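A quick simulation shows the flaw. This is just a sketch assuming each hand is an independent 50/50, which ignores the house edge, pushes, and blackjack payouts:

import random

hands = [random.random() < 0.5 for _ in range(1_000_000)]  # True = dealer win

after_3 = dealer_4th = 0
streak = 0
for win in hands:
    if streak >= 3:          # we're playing a hand right after 3+ dealer wins
        after_3 += 1
        dealer_4th += win
    streak = streak + 1 if win else 0

print(f"Dealer wins the next hand after 3 in a row "
      f"{dealer_4th / after_3:.1%} of the time")  # ~50%, not 40.625%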
He sold his book for $12.95. Fortunately, I found it in the used book store and picked it up for a couple bucks which was more than it was worth to start with.
We should put together a book of systems gleaned from this forum and sell it. We could pre-qualify the book by stating it was an attempt to beat the gambler's fallacy and randomness. We should be able to make enough money to keep the forum costs covered for quite a while. We have a ton of systems better than this guy's.
Not me. Maybe someone else. ^-^
The odds of the dealer winning many times in a row can't be very different from an even chance on roulette.
If the dealer can't win 10 or more times in a row, a progression should be safe.
Quote from: mattymattz on Sep 23, 10:32 AM 2012
I would swap... look at it this way.
If you had the choice of taking the 99/100 pile of boxes or the 1/100 box at the start, what would you take? The 99/100 of course. So from my point of view you're still picking from the 99% odds pile, right down to the last box.
MM
Agreed :thumbsup:
Here's a betting system based on this theory.
Flat bet until you have 2 losses in a row, then increase your bet size using a D'Alembert until you win, because after each loss you have a better chance to win. As soon as you win, drop back to the flat bet until you have 2 losses in a row again, etc...
It should work in the long run if we really do have a better chance to win after the dealer has won 2 or more times in a row.
GLC
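For what it's worth, here's a rough simulation of that exact scheme (Python, assuming a fair even-money game; a real house edge only makes it worse):

import random

def run_system(n_hands=1_000_000, base=1):
    # Flat bet until 2 losses in a row, then D'Alembert (add a unit after
    # each further loss) until a win, then drop back to the flat bet.
    bankroll, bet, losses_in_row = 0, base, 0
    for _ in range(n_hands):
        win = random.random() < 0.5
        bankroll += bet if win else -bet
        if win:
            bet, losses_in_row = base, 0
        else:
            losses_in_row += 1
            if losses_in_row >= 2:
                bet += 1
    return bankroll

print(run_system())  # averages out to 0 over repeated runs: no edge appears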
Sounds like the same thing as waiting until you have 6 reds in a row and then betting a 2- or 3-step capped Martingale that the streak will end. I think it's called playing against the wheel, per Brett Morton and others.
Whether it is 50/50 or 99/100 when you swap depends on HOW the boxes are opened.
In the problem you state, "You open 98, all have been black." The implication is that you do not know which box has the red ball and you open them at random. You got lucky and did not choose the box with the red ball; note that this only happens 2% of the time. In this case it is 50/50 which of the 2 remaining boxes has the red ball, and SWITCHING DOES NOT MATTER.
If, on the other hand, someone who knows which box has the red ball chooses the boxes to open, and deliberately opens only boxes with black balls, then you should switch: the remaining box has a 99% chance of holding the red ball, while your originally chosen box has only 1%.
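To put numbers on the random-opening case, here's a sketch (Python) that throws away every round where a red ball gets revealed and checks how often switching wins in the rounds that survive:

import random

def trial_random_opening(n_boxes=100):
    # You open 98 of the other 99 boxes at random. Returns None if the
    # red ball gets revealed (round discarded), else True if switching wins.
    red = random.randrange(n_boxes)
    yours = random.randrange(n_boxes)
    others = [b for b in range(n_boxes) if b != yours]
    random.shuffle(others)
    opened, remaining = others[:98], others[98]
    if red in opened:
        return None
    return remaining == red

results = [trial_random_opening() for _ in range(200_000)]
kept = [r for r in results if r is not None]
print(f"Rounds where all 98 reveals were black: {len(kept) / len(results):.1%}")  # ~2%
print(f"Switching wins in those rounds:         {sum(kept) / len(kept):.1%}")     # ~50%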
===========================================================
Imagine you have 100 boxes. In 1 of the boxes is a red ball; the other 99 have a black ball inside.
You are given 1 of the 100 boxes. Odds on having the red ball are 100/1.
Then you have to open boxes from the other 99.
You open 98, and all have been black.
You now have your box and the one box left from the remaining 99.
What's more likely? That you chose the box with the red ball in it at the start (100/1), or that the box left from the other 99 (1/99) has the red ball in it?
Is it now a 50/50 chance between these 2 boxes as to which one has the red ball??
Do you go with the 100/1 chance or the 1/99 chance?
Is it 50/50?
I just saw this Monty Hall Problem on the net and it intrigued me...so I searched to see if it had been covered here before...
Here's the explanation from Business Insider as to why you should always switch:
http://www.businessinsider.com/the-monty-hall-math-problem-2014-2
Great responses from everybody. This is definitely a Monty Hall problem,
though Twisteruk phrased it wrongly, because someone who knows where the red ball is has to open the other boxes for you.
100 boxes may be too many for people to comprehend, so let's use 3 boxes to understand it better.
2 of the boxes have red balls, 1 has the black ball. You randomly choose one at the start, and someone else (who knows the contents) opens another box and
reveals a red ball. Will switching increase the odds of getting the black ball? The instinctive response is no, as there are only 2
boxes left, so it must be 50/50. However, let's take a look at how the scenarios play out...
Box 1: Black Ball
Box 2: Red Ball
Box 3: Red Ball
1st scenario: You choose box 1; someone else opens either box 2 or 3; switching loses.
2nd scenario: You choose box 2; someone else opens box 3; switching wins.
3rd scenario: You choose box 3; someone else opens box 2; switching wins.
Thus, switching actually increases the odds of getting the black ball from 1/3 to 2/3, as the scenarios above show.
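The same enumeration in code, for anyone who wants to verify it (a small Python sketch; when you hold the black ball, the host's choice between the two red boxes doesn't affect the result):

from fractions import Fraction

boxes = {1: "black", 2: "red", 3: "red"}  # box contents as in the example

switch_wins = 0
for pick in boxes:
    # The host opens a red box that isn't yours; you switch to the one left.
    host_opens = next(b for b in boxes if b != pick and boxes[b] == "red")
    switched_to = ({1, 2, 3} - {pick, host_opens}).pop()
    switch_wins += boxes[switched_to] == "black"

print(Fraction(switch_wins, 3))  # 2/3: switching doubles your chance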