Subject: Mathematical Probability Query From: Fmaj7 Date: 09 Oct 00 - 05:38 PM I've tried looking elsewhere without success, and felt sure that there might be someone here who could help. Hope you don't mind me asking this. I remember hearing / reading that in a group of (I think) 23 people, there is a more than 50% chance that two of the group will share the same birthday. Absurd as it sounds on first glance, the maths make it quite clear. Unfortunately, I've forgotten the maths. If anyone could explain or point me to a site that explains, I'd be most grateful. Thanks Fmaj7 |
Subject: RE: BS: Mathematical Probability Query From: Liz the Squeak Date: 09 Oct 00 - 05:40 PM Well one way to check it is to see the birthday threads here! I share my birthday with another catter; if there are 21 others out there, then the odds are pretty good.... LTS |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Ed Pellow Date: 09 Oct 00 - 05:44 PM Try looking here All is explained. I love elegant maths Ed |
Subject: RE: BS: Mathematical Probability Query From: Mrs.Duck Date: 10 Oct 00 - 04:35 PM I enjoyed that!! |
Subject: RE: BS: Mathematical Probability Query From: Ed Pellow Date: 10 Oct 00 - 04:50 PM I love this problem, and the fact that with 55+ people, it's nigh on certain that at least 2 people will share the same birthday. Does anyone know any other probability problems which initially seem counter intuitive? Ed
|
Subject: RE: BS: Mathematical Probability Query From: Mark Clark Date: 10 Oct 00 - 05:04 PM A racing team is preparing their car for a race. It's a one-mile track and they want to observe its performance in a test. They tell the driver he must drive around the track once and maintain an average speed of 60 MPH. During the first half of the lap the car has some problems and, when he is halfway around the circuit, the driver realizes he's only averaged 30 MPH. What speed must the driver now average for the remainder of the track in order to average 60 MPH for the entire circuit? Have fun, - Mark |
Subject: RE: BS: Mathematical Probability Query From: Bert Date: 10 Oct 00 - 05:13 PM Nice one Mark. I won't spoil it by telling the answer. |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 10 Oct 00 - 06:08 PM By my reckoning, 1/2 a mile at 30 MPH would take him 1 minute but 1 mile at 60 MPH also takes 1 minute so he would have to do the second 1/2 mile in no time at all! Jon |
Subject: RE: BS: Mathematical Probability Query From: Jim the Bart Date: 10 Oct 00 - 06:36 PM Sorry, Jon. He still has a half mile to make up the difference. I think I know how he'd do it, but I'll wait to see if I'm correct. |
Subject: RE: BS: Mathematical Probability Query From: Jeri Date: 10 Oct 00 - 06:43 PM Bartholomew, Jon's disgusting ;-), but he's right. The guy has to drive the mile in one minute to average 60 MPH. His minute's up. |
Subject: RE: BS: Mathematical Probability Query From: Ed Pellow Date: 10 Oct 00 - 06:57 PM OK, The driver needs to cover 1 mile in one minute. Over the first 30 seconds, he's averaged 30 mph (which works out at 0.5 miles per minute). Therefore he's covered 0.25 of a mile. He needs to cover 0.75 miles within the next 30 seconds. So he needs to travel at 1.5 miles per minute, which works out at 90 mph. For some reason (whilst it seems logical) I've got a feeling that my answer is wrong. Ed |
Subject: RE: BS: Mathematical Probability Query From: IvanB Date: 10 Oct 00 - 07:03 PM No, Ed, he's driven halfway around the circuit (1/2 mile) at 30 mph. Therefore he's used up his whole minute, and Jon's answer is correct. |
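Jon's and IvanB's answer is easiest to see by budgeting time rather than speed; here is a minimal check in Python (the variable names are illustrative), using exact fractions so no rounding can hide the result:

```python
from fractions import Fraction

track = Fraction(1)                   # lap length in miles
time_allowed = track / 60             # hours available to average 60 mph: one minute
time_used = Fraction(1, 2) / 30       # hours spent covering the first half mile at 30 mph
time_left = time_allowed - time_used  # zero: the whole minute is already spent
```

Since `time_left` comes out to exactly zero, no finite speed over the remaining half mile can rescue the 60 MPH average.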
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 10 Oct 00 - 07:14 PM OK, here is an old one: If a block of cheese weighs 10 pounds plus 1/2 of its weight, how much does it weigh? Jon |
Subject: RE: BS: Mathematical Probability Query From: Ed Pellow Date: 10 Oct 00 - 07:15 PM Ivan, You are of course right. I hate it when maths is put forward this way, as if to trick people. No wonder so many people don't like it, even though working through the probability of the original question is quite rewarding... Ed |
Subject: RE: BS: Mathematical Probability Query From: Bradypus Date: 10 Oct 00 - 07:23 PM I remember reading an interesting study on the birthday problem. In a football (soccer) match, there are 23 people on the pitch at kick-off (11 on each side, + the referee). Someone surveyed all the Premier League games on a given Saturday (maybe more than just the one division) and found that in almost exactly half of them two of the people on the pitch shared a birthday! Bradypus |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray MacLeod Date: 10 Oct 00 - 07:38 PM I remember programming my old Amstrad computer back in the early eighties to work out the birthday probabilities for all numbers from 2 to 367 (obviously if you have 367 people at least two of them MUST share the same birthday, whereas with 366 it is theoretically possible they could all have different birthdays). What was interesting was that at 80 people the computer returned the probability as certainty. Of course it was a Z-80 processor, maybe my Pentium would go much further, except you can't program these damned things, nowadays, it was much more fun back with the Apple 48k and the Sinclairs ......... Murray |
Subject: RE: BS: Mathematical Probability Query From: IvanB Date: 10 Oct 00 - 07:42 PM Jon, it, of course, weighs 20 pounds. |
Subject: RE: BS: Mathematical Probability Query From: catspaw49 Date: 10 Oct 00 - 07:43 PM Jon.........infinity. Spaw |
Subject: RE: BS: Mathematical Probability Query From: Amos Date: 10 Oct 00 - 08:03 PM x = 10+(1/2 x) .5 x = 10 x = 20 Proof: 20 = 10 + (20/2) A |
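Amos's algebra can be replayed mechanically; a small Python check (exact fractions, so the division is precise):

```python
from fractions import Fraction

# weight = 10 + weight/2  =>  weight/2 = 10  =>  weight = 20
weight = Fraction(10) / (1 - Fraction(1, 2))
# sanity check against the original statement of the riddle
balanced = (weight == 10 + weight / 2)
```

The check confirms that 20 pounds satisfies "10 pounds plus half of its weight".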
Subject: RE: BS: Mathematical Probability Query From: Marion Date: 10 Oct 00 - 09:27 PM Here's one where the obvious answer that everyone gives first is wrong, though not everybody accepts the right answer because it seems so counterintuitive. I once got into a long discussion on this on another board, and eventually won over those who argued against me, but let's see how it goes here. OK, you are shown three cups, one of which has a prize under it. You are asked to choose (and point out) one of the cups, but not lift it yet. After you have made your guess, the house lifts one of the cups that you didn't choose and shows you that it is empty. You are then offered the opportunity to choose again, this time for keeps. Should you stick with your original guess? Should you change to the other cup? Or does it not matter (i.e. your chances of winning are the same whether you change your guess or not)? Enjoy, Marion |
Subject: RE: BS: Mathematical Probability Query From: Amos Date: 10 Oct 00 - 09:49 PM Statistically you should switch. Never could figure why though.
A |
Subject: RE: BS: Mathematical Probability Query From: Troll Date: 10 Oct 00 - 09:52 PM Read this carefully. Which is heavier; a pound of gold or a pound of feathers? troll |
Subject: RE: BS: Mathematical Probability Query From: catspaw49 Date: 10 Oct 00 - 09:53 PM uh Marion? Shell games and Monty have no statistical probability outside of the dealer's choice. Spaw |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 10 Oct 00 - 10:00 PM Troll, that is mean but a pound of feathers weighs more (I think). Marion, I am tempted to say 50/50 chance on the other 2 cups, so sticking with the original is as good as anything, but I have a horrible feeling that there is a catch to this one. Jon |
Subject: RE: BS: Mathematical Probability Query From: Amos Date: 10 Oct 00 - 10:04 PM They're measured on different scales, I believe. But a kilogram of feathers and a kilogram of gold mass the same. And they weigh the same at rest. But they would probably accelerate differently through atmosphere and have different terminal velocities. I think. A |
Subject: RE: BS: Mathematical Probability Query From: Jeri Date: 10 Oct 00 - 10:11 PM Jon's right again - a pound can buy a lot more feathers weight-wise than gold. Marion, I'd say it wouldn't make a difference which of the two cups you chose. You have an equal chance of being correct no matter which cup you pick. You start off with a one-in-three chance, then go to one-in-two. You can switch cups - the probability is still 50%. |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 10 Oct 00 - 10:16 PM Slight deviation but another old classic: 3 men go into a restaurant for a meal. The bill came to £30, so they each paid £10. The manager saw that these people were loyal regulars and asked the waiter to knock £5 off the bill. The waiter was dishonest and decided to give them £1 back each and pocketed the remaining £2 for himself. Now the men each paid £10 and got £1 back, so they each paid £9, and the waiter took £2. But £9 x 3 = £27, and £27 + the £2 the waiter took = £29. What happened to the missing £1? Jon |
Subject: RE: BS: Mathematical Probability Query From: Troll Date: 10 Oct 00 - 10:22 PM Jon and Amos. The pound of feathers is HEAVIER.Gold is weighed on the troy scale which is 12 oz. to the pound.Feathers are weighed on the avoirdupois scale; 16 oz.to the pound. I used to win more beer with that one.***sigh*** troll |
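Troll's bar bet rests on the two pound definitions; putting both in grams (using the standard conversion factors) makes the comparison concrete:

```python
TROY_OUNCE_G = 31.1034768    # grams per troy ounce (exact by definition)
AVDP_POUND_G = 453.59237     # grams per avoirdupois pound (exact by definition)

gold_pound_g = 12 * TROY_OUNCE_G   # precious metals: 12 troy ounces to the pound
feather_pound_g = AVDP_POUND_G     # everyday goods: the 16-ounce avoirdupois pound
# the feathers win by roughly 80 grams
```

The sting in the bet is that the comparison flips for single ounces: a troy ounce (about 31.1 g) is heavier than an avoirdupois ounce (about 28.35 g).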
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 10 Oct 00 - 10:35 PM Murray, I have never been that good; probability, as with much of maths, has never been something that I am good at, but I can relate to what you are saying. I had a Commodore 64 and used to enjoy the occasional challenge. As for the newer computers and programming, I still have an old DOS version of Turbo Pascal which I still like and use occasionally. Number crunching wise, I have a Fortran compiler which I believe is good, but I haven't got round to learning the language and I probably would be incapable of taking advantage of its power as my maths is not good enough. Jon |
Subject: RE: BS: Mathematical Probability Query From: Peter T. Date: 10 Oct 00 - 10:35 PM True story: A student came into one of my classes this morning and said, Professor, the people in my house are complaining about you. And I said, oh why? and he said, you suggested that if we wanted to study exponential growth outside of a petri dish we should leave a piece of pizza lying around outside the fridge, and see what happens. I left it in my room for three days so no one would eat it before it started going bad, and then left it on the kitchen windowsill. I have been doing that for over a week, and when they come in and see it, I say that I am studying exponential growth, and blame you for how disgusting it looks. I confess to having been somewhat thrilled. Simple pleasures. yours, Peter T. |
Subject: RE: BS: Mathematical Probability Query From: Mrrzy Date: 10 Oct 00 - 10:41 PM And about the other one, about the bill, you have to subtract, not add. The bill was 30, less 5, is 25 that the restaurant charged, but it took 27, if you count the unscrupulous waiter's 2. The guests each paid 9, which matches the 27 the restaurant, and its personnel, took. |
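Mrrzy's accounting can be laid out explicitly: the waiter's £2 sits inside the £27 the diners paid, not on top of it, so adding it double-counts. A sketch in Python (names illustrative):

```python
paid = 3 * 9     # what actually left the diners' pockets: 27 pounds
till = 30 - 5    # what the restaurant kept after the discount: 25 pounds
waiter = 2       # what the dishonest waiter pocketed
# the books balance: 25 + 2 = 27; the "missing" pound never existed
```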
Subject: RE: BS: Mathematical Probability Query From: Amos Date: 10 Oct 00 - 10:41 PM Fie on obfuscatory semantics, quotha! Th'art a knave an' thy trickster's tongue is quick with canards and darts of shrill vexation. Get thee to a cobbler, an thee be not a heel, he mought mend thy sole, that thou hold thy tongue.... A |
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 10 Oct 00 - 10:43 PM So Marion, what is the correct answer? I'm inclined to say don't change your guess. Even though your original odds have changed, your remaining choices still have the same odds. This reminds me of the coin toss question. If you're tossing a coin, and you get 17 heads in a row, what are the chances that the next toss will be tails? |
Subject: RE: BS: Mathematical Probability Query From: Marion Date: 10 Oct 00 - 11:13 PM Amos, you're right, and I'll try to explain below. Catspaw, sorry, but I don't understand you. Who or what is Monty? Jon and Jeri, you've given the most intuitive answer - that at the time of your second choice there are only two cups in the running, and you don't know which, so you think it's a 50-50 thing. But this is not so. In fact your best bet is to switch cups. If you switch, you have a 2/3 chance of winning, but if you don't switch, it's only 1/3. I know this is really counterintuitive, so I'll try two different ways of explaining it. One: remember that if you guessed right the first time, switching will definitely make you lose. If you guessed wrong the first time, switching will definitely make you win, since the other wrong cup has been eliminated for you. The probability of guessing right the first time is 1/3, so the probability that switching will make you lose is also 1/3. The probability of guessing wrong the first time is 2/3, so the probability that switching will make you win is also 2/3. Two: think of it this way: when you make your first guess and point out a cup, what you are really doing is dividing the cups into two groups: a small group with one cup, and a big group with two cups. When you are given a second chance to guess, what you are really doing is saying whether you think it's more likely that the prize will be in the big group or the small group. It's more reasonable to bet that the prize will be somewhere in the big group. When the house lifts an empty cup, that just shows that at least one of the big group is empty, but you knew that already, so that's not really relevant to the question of whether the big group or the small group is a better bet (although it is useful information because it tells which member of the big group would have the prize if one of them does). 
There's a 2/3 chance that the prize is in the big group, so your chances are better if you switch over to the big group. I know this sounds terribly convoluted compared to the beautiful simplicity of "two cups, the one you picked and one not touched yet, so 50-50 chances", so if you're still not convinced, try this mental experiment: Imagine you sat down with a ridiculously patient friend to play this game 1000 times. The plan is for your friend to switch every time, then you'll see if the number of times he wins is closer to 500 or 667. Work through in your mind what would happen. He would pick an empty cup the first time approximately 667 times, assuming that his guessing and your prize-hiding were random. So he would win approximately 667 times. |
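Marion's 1000-game thought experiment is easy to run for real; here is a Monte Carlo sketch in Python (the cup numbering and the house's tie-break rule are illustrative choices, not from the thread):

```python
import random

def play(switch, trials=100_000, seed=1):
    """Return the fraction of games won when the player always switches (or always sticks)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)
        pick = rng.randrange(3)
        # the house, knowing where the prize is, lifts an empty cup the player didn't choose
        # (when the player's pick holds the prize, it simply lifts the first such cup;
        # the tie-break doesn't affect the win rates)
        lifted = next(c for c in range(3) if c != pick and c != prize)
        if switch:
            # move to the one cup that is neither the original pick nor the lifted cup
            pick = next(c for c in range(3) if c != pick and c != lifted)
        wins += (pick == prize)
    return wins / trials
```

With 100,000 trials the always-switch win rate lands near 2/3 and the always-stick rate near 1/3, matching Marion's argument.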
Subject: RE: BS: Mathematical Probability Query From: Marion Date: 10 Oct 00 - 11:33 PM Mary, there's always a 50% chance that the next coin toss will be tails. But I don't think that that puzzle has much bearing on my cups game. Each coin toss in the series is an independent event. But in the cups game, your two chances to guess are related to each other; the chance that your second decision will be successful is directly dependent on the the success of your first guess. The cups game could be considered a series of independent random events if you forgot which cup you had chosen first and simply guessed randomly between the two cups that hadn't been lifted by the house. In that case, you would win about 50% of the time; it would be a question of choosing between two cups, not of choosing between sticking with a first guess that was probably wrong or abandoning a first guess that was probably wrong. But in 2/3 of the games you win, your winning cup would happen to be one that you didn't pick on your trial run guess. Here's another little puzzle, for free: Suppose you have a basket that can hold ten apples. You take out three. How many apples do you have? Marion
|
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 10 Oct 00 - 11:39 PM You have three apples, but I'm still thinking about the cups. |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 10 Oct 00 - 11:40 PM 3 Jon |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray MacLeod Date: 10 Oct 00 - 11:49 PM I am sorry Marion, but I cannot accept your reasoning on this, and will have to ask you to refer me to an academic text which explains this and which agrees with your reasoning. I mean you didn't make this problem up yourself, did you? This problem seems to be related to one which was first aired in Scientific American in the 60's, and which stirred a fair amount of debate. There are three cards, one is red on both sides, one is white on both sides, and one is red on one side and white on the other. The dealer places them in a bag, shakes them then slides out one card onto the table so that only one face is visible. The face is red. What is the probability that the other face is red? Murray |
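Murray's card puzzle doesn't get settled in the thread; counting the equally likely faces (a sketch, not from the thread) gives the standard answer of 2/3 rather than the tempting 1/2:

```python
from fractions import Fraction

# each card can land either side up, giving six equally likely (visible, hidden) faces
cards = [("R", "R"), ("W", "W"), ("R", "W")]
faces = [(up, down) for up, down in cards] + [(down, up) for up, down in cards]

red_showing = [f for f in faces if f[0] == "R"]
p_hidden_red = Fraction(sum(f[1] == "R" for f in red_showing), len(red_showing))
# three of the six faces are red; two of those belong to the red/red card, so 2/3
```

The trap is treating the question as "which card is it?" instead of "which face is it?": the red/red card contributes two of the three ways to be looking at a red face.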
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 11 Oct 00 - 12:01 AM Marion, give me some time to think about this one...in the meantime, does anyone remember the monk walking up the mountain puzzle which illustrates the difference between right brain and left brain thinking? I'll see if I can reconstruct that one...later... |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 11 Oct 00 - 12:03 AM |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Jim Dixon Date: 11 Oct 00 - 12:12 AM I agree with Marion's answer. Here's a variant of the puzzle which might make it more clear: Instead of 3 cups, suppose there are 1,000 cups. You choose number 483 at random. Then the house turns over all the cups EXCEPT numbers 483 and 722, showing that all 998 cups are empty. Now, would you keep number 483? Or switch to number 722? |
Subject: RE: BS: Mathematical Probability Query From: Escamillo Date: 11 Oct 00 - 12:14 AM I'm so sorry! Please would you promise that my comments will be taken as another point of view, and that there is no intention to rain on anybody's parade? That none will feel attacked in his/her beliefs and enjoyment?? :)) Let's see the first problem. Dr.Math takes this FIRST PREMISE: "We'll start by figuring out the probability that two people have the same birthday. The first person can have any birthday. That gives him 365 possible birthdays out of 365 days, so the probability of the first person having the "right" birthday is 365/365, or 100%. The chance that the second person has the same birthday is 1/365. To find the probability that both people have this birthday, we have to multiply their separate probabilities. (365/365) * (1/365) = 1/365, or about 0.27%." This premise is absolutely false. He is taking for sure that the first person has a birthday that MATCHES everyone's birthday, so its probability is 1. In fact, the probability of any given couple of persons having the same birthday is 1/365 multiplied by 1/365 and nothing else. Thus, the whole reasoning is false. Problem of the cups: sorry, the probability of winning when there are only two cups left is 1/2, exactly 50%, no matter what the previous results have been. It is the same apparent paradox as the probability of 50% in a coin toss after 6, or after 200, tails in a row: still 50%. A different thing is to calculate the probability of the whole sequence (not one toss): two tails in a row is 1/4 (will probably happen in 1 of every four sequences of two tosses), three in a row is 1/2 * 1/2 * 1/2 = 1/8 = 12.5%... and so on. Interesting! Un abrazo - Andrés (no Math degree..snif)
|
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray MacLeod Date: 11 Oct 00 - 12:25 AM Escamillo, I am sorry but your reasoning here is faulty. The question is, what is the probability that in a group of 23, AT LEAST two share the same birthday. Note that the question is not (or should not be) what is the probability that exactly two people share the same birthday? The only way to calculate this is to calculate the probability that everybody has a different birthday. Then you subtract that probability from 1 and you have the probability that everybody in the group does NOT have a different birthday. That is another way of saying that at least two members of the group share the same birthday. QED Murray |
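Murray's complement method translates directly into a few lines of code; a Python version of the calculation (the thread's own program, further down, is in Turbo Pascal):

```python
def p_shared(n):
    """Chance that at least two of n people share a birthday (365-day year, uniform births)."""
    p_all_distinct = 1.0
    for i in range(n):
        # person i+1 must avoid the i birthdays already taken
        p_all_distinct *= (365 - i) / 365
    return 1.0 - p_all_distinct
```

`p_shared(23)` comes out at about 0.507, crossing the 50% line, and `p_shared(55)` at roughly 0.986. Even `p_shared(80)` is still short of 1 in double precision, so Murray's Amstrad reporting certainty at 80 people was presumably a display-rounding effect.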
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray Macleod Date: 11 Oct 00 - 12:30 AM Ah, yes, Marion, I see it now, Jim's contribution made the penny drop. Good problem! Murray |
Subject: RE: BS: Mathematical Probability Query From: Marion Date: 11 Oct 00 - 01:43 AM Mary: Damn it, I thought I might catch some people with the apples puzzle, because after the cups puzzle people wouldn't expect a simple solution... Murray: I heard the cups problem from my macroeconomics professor. I can't remember what connection he made between it and macroeconomics. The red/white thing does seem related at first blush, but I'd like to think about it some more before I comment. I'm intrigued that Jim's answer was so helpful to you; while it's nice to have someone agree with me, I couldn't understand how his explanation made it clearer. The thing that fascinates me about this problem is that the wrong answer is so simple and obvious whereas the right answer takes a lot of persuasion and there are a number of lengthy, counter-intuitive ways to explain. There's probably some lesson about life here. Andres: suppose you played this game several times, and suppose that you believed it didn't matter whether you kept your first guess or switched... in that case you would be picking randomly between the two cups, and you would win approximately 50% of the games. In this scenario, your second guess really would be an independent event with no connection to your first guess, so your comparison of this to one toss in a sequence of coin tosses is reasonable. However...as I said, by picking randomly between the two, you will win about 50% of the time. But if you look back at the games you win, you will find that in most cases your second guess was different from your first. And if you examine the games you lose, you will find that in most cases your second guess was the same as the first. Therefore, you can improve your chances of winning by changing your guess. If you don't believe me, you can confirm it with the formula for calculating the probability of a complex event that you mentioned in your post. 
If you write down every possible outcome in this game, then for each possible outcome multiply the probabilities of each independent event in the complex event.
For example: P(prize is in cup 1)=1/3 Then, if you add up the probabilities of the outcomes where the player changes guesses and wins the game, the total will be twice the total probability of the outcomes where the player guesses the same cup twice and wins the game. I've done this table and done the math (yes, I really am that obsessed) and I would say that this is irrefutable evidence according to the laws of finite math. Unfortunately I'm not an HTML expert so an attempt to produce a table here would be chaotic I'm sure, but do the table and the math yourself if you want. Thanks for playing, Marion
|
Subject: RE: BS: Mathematical Probability Query From: Escamillo Date: 11 Oct 00 - 04:27 AM I simply quoted Dr.Math's demonstration, which starts with this: "We'll start by figuring out the probability that two people have the same birthday." He states that probability is 1/365 and that is false. Why? Given MY birthday, the probability of a coincidence with yours is indeed 1/365. But given NO date, the probability of us being born on the same date is (1/365) * (1/365). It is the same as two persons taking two marble balls from different bags or the same bag in turns. ONCE I took mine, your probability to take the same is 1/365. But given two persons and no ball in particular, their probability to take the same is (1/365) * (1/365). But his results are not very different from mine, because there are so many factors involved. Let's try a calculation by ourselves. This simple Turbo Pascal program will give interesting results, starting from the probability of a NO MATCH and subtracting from 1:
Program test;
var
  n: integer;
  pNoMatch: real;
begin
  pNoMatch := 1.0;
  for n := 2 to 70 do
  begin
    { person n must avoid the n-1 birthdays already taken }
    pNoMatch := pNoMatch * (366 - n) / 365;
    writeln(n, ' people: ', (1 - pNoMatch):8:6)
  end
end.
These are some curious results: I think that the apparently counterintuitive result (only 22 persons for a 50% probability, 70 persons for nearly a certainty!) appears because it is difficult to visualize the enormous probabilities for many matches. In fact, the probability of just two matches is almost as low as no match at all.
|
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 11 Oct 00 - 07:11 AM I see the cup problem now. The word "history" helped me understand it. I think of it as 3 events. I choose, house chooses, then there is a 3rd event in which I choose again. My first choice (if it's empty) survived 1 in 3 probability, but the other cup (if it's empty) survived 1 in 2 probability, thus a better choice. |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Bob Schwarer Date: 11 Oct 00 - 07:14 AM If a coin toss comes up heads 17 times in a row it is probably a two-headed coin. Bob S. |
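Bob's quip has a serious core: 17 straight heads from a fair coin is a one-in-131,072 event, so suspecting the coin is not unreasonable. In Python:

```python
from fractions import Fraction

# probability of 17 heads in a row from a fair coin
p_seventeen_heads = Fraction(1, 2) ** 17
# 1/131072, a little under one chance in a hundred thousand
```

None of which changes Marion's point: if the coin really is fair, the 18th toss is still 50/50.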
Subject: RE: BS: Mathematical Probability Query From: Escamillo Date: 11 Oct 00 - 07:32 AM Cups problem: I built this table and can't see anything wrong in the 50% chance:
PRIZE IS IN CUP 1
PRIZE IS IN CUP 2
PRIZE IS IN CUP 3
============================================
PRIZE IS IN CUP 2
PRIZE IS IN CUP 3 I guess (only guess) that the fact that an empty cup is removed does not influence the final selection which becomes a two-cup only game. It is the same as if the house said "Ok, we always take out one empty cup, let's play with only two cups, you make your choice, I ask you if you are sure, and you can keep your choice, or change it" The same would be (always guessing) with 1000 cups, I choose one, they take out 998 empty cups and let me choose between the remaining two. In my line of reasoning, the probability remains 50%. In yours, we would have an enormously high probability of a "bad" choice in the first selection, and then would more strongly recommend a change. Intriguing, but I don't see any mistake in the above tables. Un abrazo - Andrés |
Subject: RE: BS: Mathematical Probability Query From: Escamillo Date: 11 Oct 00 - 07:43 AM Cups problem : Mary, you say that the house CHOICES, that is, they take a chance? They don't know where the prize is? Hey! That would be a very different problem! I was assuming that the house KNOWS it, and then will take out always the empty cup. Oh, I will have to stand up and think again, I was already sitting down! Oy
|
Subject: RE: BS: Mathematical Probability Query From: Escamillo Date: 11 Oct 00 - 07:46 AM "the house choices" should be "the house chooses". Sorry for my English. |
Subject: RE: BS: Mathematical Probability Query From: Chris/Darwin Date: 11 Oct 00 - 08:32 AM There are always different ways of looking at these problems. Suppose you don't actually pick a particular cup, but visualise that there is a 1/3 chance of any cup being the one. When the house picks a cup there is a 1/3 chance they will pick the right one. There is therefore a 2/3 chance that the remaining two cups hold the prize - i.e., 1/3 each. Whether you actually pull one out initially or not is irrelevant - after all - all you are doing is putting your hand on it. The house still has exactly the same chance (1/3) no matter what cup they pick. Once the house eliminates a 1/3 possibility by finding an empty cup, there is 2/3 possibility remaining - and this is equally divided between the two cups. The chance of someone else having MY birthday is 1/365. The first person's birthday is always given. This is not the same as having two separate bags of 365 people, each with different birthdays, and separately drawing a person from each bag at random. The probability of both people drawn having a PARTICULAR birthday is 1/365 x 1/365 (in each case there is a 1/365 probability of the person having a particular birthday). However, if you now say that, whatever birthday is drawn from the first bag it is right, then the probability of matching improves to 1/365. This is because the probability of the first person having ANY birth date is 1. Chris |
Subject: RE: BS: Mathematical Probability Query From: AndyG Date: 11 Oct 00 - 08:36 AM Escamillo,
Your table analyses all the possible events and shows a 50/50 chance of success.
However not all these events occur in the problem as stated.
sequence:
Amended Table:
PRIZE IS IN CUP 2
PRIZE IS IN CUP 3
PRIZE IS IN CUP 1
PRIZE IS IN CUP 2
PRIZE IS IN CUP 3
Overall 1/2, but if you always change the probability is 2/3.
* It doesn't matter which cup the house lifts, if you change your choice you lose.
That's my reading of it anyway :)
AndyG |
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 11 Oct 00 - 08:37 AM I think I misspoke on the words I used above when I said "empty." I'm still thinking about this one, but since I'm at work and should not be thinking... |
Subject: RE: BS: Mathematical Probability Query From: Jeri Date: 11 Oct 00 - 09:27 AM Well, I'll admit I'm mathematically challenged. I've no problem with logic, just formulas, so I'm going to be difficult. When you choose the first cup, you have a one in three chance of picking the right cup. When you choose a second time, you have a one in two, or 50% chance of being right. Forget the third cup - you're making a choice between 2 cups, and it makes no difference if you pick the same cup again - it's still a choice. If it is the one with the prize, it will have survived a 1/2 chance because you made a second choice. I don't think adding up the probabilities makes sense. I think that taking one of the cups away just changes the number to pick from. You can't have a 2/3 chance of winning if there are only 2 cups. I wonder if anyone's done experiments... |
Subject: RE: BS: Mathematical Probability Query From: CamiSu Date: 11 Oct 00 - 09:36 AM Marion, if you have a basket with 10 apples in it and you take out 3, you have 10 apples. 7 in the basket and 3 in your hand! It's all in how you state the problem. Someday I must take a statistics course. I've tutored 3 adults through increasingly difficult ones and had so much fun that I want to do it for myself! |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Pete Peterson Date: 11 Oct 00 - 10:04 AM first, thanks to Marion and also to Jim Dixon; between the two of you I finally have an explanation that makes sense to me. It's called the Monty Hall problem (hence Catspaw's reference) because it is identical with the procedure on "Let's Make a Deal" -- M. Hall, host. If you go into Yahoo/Science/Mathematics you can get quite a discussion of this problem, under that name, but none of the explanations are as clear as the two I have seen! thanks. |
Subject: RE: BS: Mathematical Probability Query From: Jeri Date: 11 Oct 00 - 10:24 AM Slowly, the clouds part, and a ray of light penetrates the dank, dark place that is my brain. Thinking about what Jim has said, I see that the probabilities are affected by the "house-knows-something-you-don't" factor. I had a pretty slim chance of being right when I picked at random from 1,000 cups. The house selected the other cup to not turn over, and the house KNOWS which cup hides the prize. Since my chances of being right in the first place were not so good, I switch to the house's pick. I don't know if this works as well with only three cups to start with - my chances of being right initially weren't so bad, but at least I've got the idea now. |
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 11 Oct 00 - 10:50 AM hanks for the links Pete. Here's The Monty Hall Problem. And I liked this link (referred to in the above link) Obstinacy, Comprehension, and the Monty Hall Problem. This could be applied to our thread about the unbelieveable, but I won't go there. |
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 11 Oct 00 - 10:55 AM Always profreed... Thanks for the links Pete. Here's The Monty Hall Problem. And I liked this link (referred to in the above link)---Obstinacy, Comprehension, and the Monty Hall Problem. This could be applied to our thread about the unbelieveable, but I won't go there. |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 11 Oct 00 - 11:18 AM I must admit I am getting more and more confused with the cups one. I thought I could see and had modeled an answer and then tried to go through every possibility I can see occurring. Which led me to this:
I have only considered the stick here but the change would be the opposite. This yields a 6/12 result - what moves have I missed? Jon
|
Subject: RE: BS: Mathematical Probability Query From: Jim the Bart Date: 11 Oct 00 - 11:29 AM Sorry I doubted you, Jon. I yield the floor to the higher minded among us and will be quiet now. |
Subject: RE: BS: Mathematical Probability Query From: AndyG Date: 11 Oct 00 - 11:58 AM Jon,
The sum of the probabilities must be 1.
The probability that your first choice is the winner is 1/3.
AndyG |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 11 Oct 00 - 12:06 PM Andy, that was what I had thought until I tried to model every possibility on a spreadsheet. It seems to me that 1/3 of the moves being considered are invalid as they would involve the house uncovering the winning cup. Bartholomew, I may have got that one right but I'm not higher minded - I am very confused here. Jon |
Subject: RE: BS: Mathematical Probability Query From: Mark Clark Date: 11 Oct 00 - 12:13 PM Boy, I can see one has to hang in here a lot more than I can if one is to keep up. I see my race car puzzle was way too easy for this group. I remember Marilyn Vos Savant discussing the Monty Hall (cups) problem in the Parade Sunday roto section. A search for Marilyn turned up another good explanation, for the terminally curious, of why one should switch cups after one has been turned over.
Marion, I think Spaw was referring to three-card monte, a common street shell game in large urban areas.
- Mark |
Subject: RE: BS: Mathematical Probability Query From: AndyG Date: 11 Oct 00 - 12:23 PM Jon,
As I said earlier in the amended table, you, like Escamillo, are modelling events that don't happen.
PRIZE IS IN CUP 1: you stick with your first pick (cup 1) and WIN (probability 1/3)
PRIZE IS IN CUP 2: the house must open cup 3; you stick with cup 1 and LOSE (1/3)
PRIZE IS IN CUP 3: the house must open cup 2; you stick with cup 1 and LOSE (1/3)
(again only for sticking as in your example)
AndyG |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 11 Oct 00 - 12:26 PM Yep OK, got it now. I had previously modeled the result and was getting roughly 1/3 to 2/3 then tried this other one. The invalid 1/3 work against the house. Jon |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 11 Oct 00 - 12:34 PM Andy, my model only had the house lifting once, and those are events that actually happen. What it failed to take into account was that in several situations the house was forced into picking one of the 2 remaining cups, and although invalid, they still need to be taken into account for the overall probability. Jon
|
Subject: RE: BS: Mathematical Probability Query From: Wolfgang Date: 11 Oct 00 - 12:38 PM First, for those who don't believe (yet), test it yourself. It won't take too many trials to convince you that switching is the correct strategy. Jeri, however, has pointed to the correct detail. It matters whether the one who opens the first cup knows (and cares in opening) whether this cup contains the prize or not. This detail was not completely clear in Marion's first post. If just any cup is opened by chance and either you have lost at once (for this cup contains the prize) or you have the choice between the two remaining cups (in the other two-thirds of the cases), then it does not matter at all whether you change your first choice or not. Like in a variant of that problem, when a father of two comes up to you and says: 'I've two children, one of them is a boy' and you are asked what is the probability that the other child is a boy too. Depending upon conditions I have deliberately not mentioned, the correct response is either 1/2 or 1/3. Yes, there are experiments done on the Monty Hall dilemma. I've done one, for instance. In the classical situation Marion has described, only about 8 % of our subjects switched (which made the experiment cheaper, for there was a real prize under there to take home). If we had 30 cups, however (28 with no prize opened after the first choice), about 2/3 of our subjects switched, which shows that they were somehow sensitive to the odds (in this case about 97 % probability of winning, if you switch). The best recent article on these problems is by R. S. Nickerson in the Psychological Bulletin (circa 1997/98). Sorry, I don't find the exact reference right now. If you've understood the problem so far, you're ready for the Russian Roulette variant (only one miss, all other options are prizes): 3 cups with two prizes and one miss.
After your first choice, another cup that is known to contain a prize is opened, leaving you with two cups, one containing a prize. Should you switch??? Or does it not matter? Of course it matters: you should not switch under these conditions, for you have a 2/3 probability of winning if you stay. It is an open question for science (yes, we do have open questions, lots of them) why in this situation, when switching is bad, many more subjects switch than in the standard version in which switching is good. Well, people do have many difficulties with conditional probabilities (even math profs and PhDs are on record for defending a wrong solution of Monty Hall's dilemma with very strong words) and if you give me a new problem of that kind and ask me for my spontaneous solution (without using paper, pencil and Bayes' theorem) I'd not be surprised to be wrong. Wolfgang |
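[Editor's aside, not part of the thread: Wolfgang's "test it yourself" advice is easy to follow on a computer. Below is a minimal simulation sketch in Python (mine; the cup numbering is arbitrary) comparing the stay and switch strategies when the house knowingly opens an empty, unchosen cup.]

```python
import random

def play(switch, trials=100_000):
    """Simulate the three-cup game where the house always opens an
    empty cup the player did not choose. Returns the observed win rate."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # House opens an empty cup that the player did not choose.
        opened = next(c for c in range(3) if c != pick and c != prize)
        if switch:
            # Switch to the one remaining closed cup.
            pick = next(c for c in range(3) if c != pick and c != opened)
        wins += (pick == prize)
    return wins / trials

random.seed(1)
print(f"stay:   {play(switch=False):.3f}")   # close to 1/3
print(f"switch: {play(switch=True):.3f}")    # close to 2/3
```

(Which empty cup the house opens does not affect either rate: switching wins exactly when the first pick was wrong, whichever empty cup is shown.)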
Subject: RE: BS: Mathematical Probability Query From: GUEST,Marion Date: 11 Oct 00 - 01:59 PM Wow, I'll have to come back here when I have more time and study the new responses to the cups problem, but let me just say this for now: Mary, you're not supposed to think while you're at work? What do you do exactly? :) Andres, you were right in the beginning that the house does know where the prize is, and will always uncover an empty cup. And I've thought about the red/white problem, which I'll copy here since it's way up there: "This problem seems to be related to one which was first aired in Scientific American in the 60's, and which stirred a fair amount of debate. There are three cards, one is red on both sides, one is white on both sides, and one is red on one side and white on the other. The dealer places them in a bag, shakes them, then slides out one card onto the table so that only one face is visible. The face is red. What is the probability that the other face is red?" Here's my answer: the probability is 2/3 that the side you can't see is red. A little counterintuitive, but simpler (I hope!) to explain than the cups problem. I think the key is to look not at the number of cards but at the number of sides. There are three red sides. Of those three, two have red on their opposites, and one has white on its opposite. Are you still around, Murray? Is this what the Scientific American said? Marion |
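[Editor's aside, not part of the thread: Marion's side-counting argument for the three cards is easy to check numerically. A sketch (mine): draw a random card, drop it with a random face up, and look only at the draws that happen to show red.]

```python
import random

def red_card_experiment(trials=100_000):
    """Estimate P(hidden face is red | visible face is red)
    for the three-card problem."""
    cards = [("R", "R"), ("R", "W"), ("W", "W")]
    red_showing = red_hidden = 0
    for _ in range(trials):
        card = random.choice(cards)
        up = random.randrange(2)          # which face lands up
        if card[up] == "R":               # condition on seeing red
            red_showing += 1
            red_hidden += (card[1 - up] == "R")
    return red_hidden / red_showing

random.seed(2)
print(f"{red_card_experiment():.3f}")     # close to 2/3, not 1/2
```

(The conditioning step is the whole point: the R/R card shows red from both orientations, the R/W card from only one, which is exactly Marion's "count the sides, not the cards" argument.)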
Subject: RE: BS: Mathematical Probability Query From: Jim Dixon Date: 11 Oct 00 - 02:02 PM The following is not so much a mathematical problem as a psychological problem, but it has to do with the subjective evaluation of outcomes. PROBLEM 1: You start out with $60 in your wallet. You spend $20 on an advance general-admission ticket to a concert. You arrive at the concert hall, open your wallet, and find that the ticket is missing and presumably lost forever, but you still have $40. What would you do? Would you spend another $20 to buy another ticket? Or would you skip the concert? (There is no right or wrong answer here, but please consider your answer before you read the next paragraph.) PROBLEM 2: You start out with $60 in your wallet. You head for the concert hall, planning to buy a ticket at the door. When you arrive, you open your wallet and find that $20 is missing and presumably lost forever, but you have $40 left. Now what would you do? Someone did an actual survey. I don't remember the numerical results, but a large number of people - let's say half - when problem 1 was posed, said that they would skip the concert. When problem 2 was posed instead, nearly everyone said they would buy a ticket. I will leave it to you to figure out what this says about human nature. |
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 11 Oct 00 - 02:54 PM Marion - I do absolutely as little as possible because I WORK FOR MY HUSBAND! I try to get fired everyday, but so far can only manage it at about one minute before closing time. I help in the vet clinic, and yesterday was such a hard day I need to take a nap now on my afternoon off. |
Subject: RE: BS: Mathematical Probability Query From: Penny S. Date: 11 Oct 00 - 03:50 PM It seems to me that if you have a basket which can hold 10 apples, there is no evidence that you have any apples to take. Penny |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray MacLeod Date: 11 Oct 00 - 06:12 PM Marion, you are of course perfectly correct regarding the red/white problem. Martin Gardner, who writes (wrote?) for "Scientific American", published this problem in the 60's and was immediately heckled by a so-called professional gambler who insisted that the probability of red was evens ("only two possibilities, either red or white"). He even offered to meet Gardner and play for real money. Gardner refused, reasoning that his available capital was so much less than the gambler's capital that he could not afford the possibility of a freak run, which as all statisticians know is certain to occur sooner or later. Going back to the cups problem, my indebtedness to Jim comes by my extrapolating the problem from 1000 cups (as he postulated) all the way down to four. In each case it is obvious that, when the house knows the correct cup, it makes sense to switch. I have to say that I still do not see it intuitively at the three-cup level, but I am familiar with mathematical patterns and I know your solution is correct. I would just like to add that this is one of the most enjoyable threads I have ever encountered on the net.
Thank you FMaj7 for instigating it and thank you Marion for enlivening it. And thank you to all who still don't understand the correct solution for inducing a warm glow of smug self-satisfaction in those who do.
Murray |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 11 Oct 00 - 07:07 PM This is what I get with the cards.

Card | Faces | Red shows | Red up | Red/Red
  1  | W/W   | FALSE     |   0    |   0
  2  | R/W   | TRUE      |   1    |   0
  3  | R/R   | TRUE      |   1    |   1
  4  | W/W   | FALSE     |   0    |   0
  5  | W/R   | FALSE     |   0    |   0
  6  | R/R   | TRUE      |   1    |   1
Totals                   |   3    |   2
%Win | 66.66667
Maybe one day I'll learn the maths...
Jon
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray macleod Date: 11 Oct 00 - 09:13 PM yes Jon, correct answer, faultless reasoning, but on the street corner you gotta think on your feet ..... Murray |
Subject: RE: BS: Mathematical Probability Query From: Jon Freeman Date: 11 Oct 00 - 09:20 PM I'm loving this Murray, even though I get things wrong and confuse myself.... Anyone got any more? Jon |
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 11 Oct 00 - 10:55 PM OK Jon, it's late but I'll try to reconstruct this one. A monk started walking uphill on a winding mountain path at sunrise. It took him three days (with lunch breaks and rest stops) to reach the summit. He then began the journey down the mountain, same path, left at sunrise, no trick that I can think of, and it took him two days to complete the downhill journey. Question: Is there any point on the path that he passes at exactly the same time of day on both journeys? |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray MacLeod Date: 11 Oct 00 - 11:01 PM Mary, it is late at night and you haven't phrased the question too well. Too many Mint Juleps, I'll be bound (isn't that what Kentucky ladies drink?). You have to be specific about the monk's departure time on each occasion. "Sunrise" isn't enough, otherwise some smartass mudcatter is going to pull you to pieces. Murray |
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 11 Oct 00 - 11:47 PM OK, SA, no trick here, let's say he begins each journey at precisely 7 AM. Have I ever posted the recipe for a mint julep? I'll look for it. Those from Kentucky have probably heard it. |
Subject: RE: BS: Mathematical Probability Query From: Marion Date: 12 Oct 00 - 12:31 AM Murray, re: "And thank you to all who still don't understand the correct solution for inducing a warm glow of smug self-satisfaction in those who do." I am much amused, thank you. It seems like a number of converts have come over to the light side here... I'm curious to know if anyone, ever, sees the right answer right away. I certainly didn't. Jon and Andres, thank you for making tables. You both made the same mistake; you just COUNTED the possibilities when you have to WEIGH them. Remember that having a certain number of possibilities doesn't necessarily mean that each of those possibilities is equally probable; in this case, they are not. I will demonstrate this by editing Andres' table so that the probability of each outcome is weighed (Jon, I don't dare mess with your table). You remember, of course, that the probability of a complex outcome is the product of the probabilities of the events that compose it. I will assume that the player is some silly person who believes that it doesn't matter if he switches or not, so the player's second choice is random. And I will code each outcome: A means a switch resulting in a win, B means a switch resulting in a loss, C means a stay resulting in a win, and D means a stay resulting in a loss.
PRIZE IS IN CUP 1 (1/3)
PRIZE IS IN CUP 3 (1/3)
============================================
PRIZE IS IN CUP 1 (1/3)
PRIZE IS IN CUP 2 (1/3)
PRIZE IS IN CUP 3 (1/3) (This is the same in essence as Andy's fixing of the table, but with everything spelled out.) OK, so if you add up the probabilities of all the A scenarios, and all the B scenarios, and so on, you get:
Probability that player will switch and win by doing so: 6/18 or 33%
Probability that player will switch and lose by doing so: 6/36 or 17%
Probability that player will stay and win by doing so: 6/36 or 17%
Probability that player will stay and lose by doing so: 6/18 or 33%
I think I can now say, with deep sincerity: QED. I have proved my case. Marion, secretly hoping there'll be more holdouts so I'll have to learn another way to explain it |
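[Editor's aside, not part of the thread: Marion's weighing can be reproduced exactly with fractions. A sketch (mine; as in her table, the player's first pick is fixed at cup 1, the "silly player" flips a fair coin between staying and switching, and the house flips a fair coin when two empty cups are available).]

```python
from fractions import Fraction

half, third = Fraction(1, 2), Fraction(1, 3)
outcomes = {"switch-win": 0, "switch-lose": 0, "stay-win": 0, "stay-lose": 0}

for prize in (1, 2, 3):                       # each location: probability 1/3
    pick = 1                                  # player's first choice
    empties = [c for c in (1, 2, 3) if c != pick and c != prize]
    for opened in empties:                    # house opens an empty, unchosen cup
        p_house = half if len(empties) == 2 else Fraction(1)
        remaining = next(c for c in (1, 2, 3) if c not in (pick, opened))
        for action in ("stay", "switch"):     # silly player flips a coin: 1/2 each
            final = pick if action == "stay" else remaining
            result = "win" if final == prize else "lose"
            outcomes[f"{action}-{result}"] += third * p_house * half

for k, v in outcomes.items():
    print(k, v)
# switch-win 1/3, switch-lose 1/6, stay-win 1/6, stay-lose 1/3
```

(So conditional on switching the win probability is (1/3)/(1/2) = 2/3, and conditional on staying it is (1/6)/(1/2) = 1/3, which is the whole point.)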
Subject: RE: BS: Mathematical Probability Query From: Marion Date: 12 Oct 00 - 01:00 AM Wolfgang, you've done real life experiments on this? Do you mind if I ask what your profession is? I'm not sure of this statement of yours: "If just any cup is opened by chance either you have lost at once (for this cup contains the prize) or you have the choice between the two remaining cups (in the other two third of the cases) then it does not matter at all whether you change your first choice or not." If the dealer didn't know where the prize was and showed you one cup randomly, then sometimes he would show you the one with the prize so the game would be over for you. But if he shows you an empty cup, I still think it would be better to switch. Why would the dealer's knowledge or ignorance affect the game from your end of things? It's still true that the prize is more likely to be among the two cups that you didn't pick than in your first choice, and the dealer is still (although unintentionally) showing you which of those two has the prize if either of them does. Marion |
Subject: RE: BS: Mathematical Probability Query From: Marion Date: 12 Oct 00 - 01:12 AM Just to throw in a little twist, here's an open question for anybody who agreed with me from the beginning on the cups problem, or was convinced in this thread: Imagine that you and two other people are thrown in jail and told that one of you (and that one has already been decided by the jailer) will be executed and the other two released. You have no basis for guessing which one of you is doomed, so as far as any of you can tell, it's a random decision. The next day, one of the other prisoners is released. Assuming that you only care about yourself, are you relieved, worried, or indifferent about the other prisoner's release?
|
Subject: RE: BS: Mathematical Probability Query From: Crazy Eddie Date: 12 Oct 00 - 04:14 AM Marion, (are you also "Guest Marion"?) (1) Re the Cards, Guest Marion said: << There are three red sides. Of those three, two have red on their opposites, and one has white on its opposite. >> There are TWO cards which can have Red as the top surface. Of these, ONE has white on the bottom, ONE has Red on the bottom. If you can see a red side on top, you can eliminate the W/W card, i.e. you KNOW this is NOT the W/W card. Therefore it must be EITHER the R/R card, or the R/W card. 50/50 chance. If it is the R/R card, then the side you cannot see is RED. If it is the R/W card, then the side you cannot see is WHITE. So again, 50/50
|
Subject: RE: BS: Mathematical Probability Query From: Crazy Eddie Date: 12 Oct 00 - 04:37 AM Marion, I am glad you put in the prisoner thing. I figured there was no advantage to switching, but some of the switch theories were a bit confusing. The prisoners thing makes it clear.
BTW the following is not a joke, or an ethnic slur. Just an easy way for me to keep track.
[1] There are three of us in jail. Englishman, Scotsman, Irishman (me).
No problems with the logic so far.
[7] If I'M better off to switch, then clearly HE'S also better off (because from HIS point of view HE is the one who is switching.)
[8] How can BOTH of us improve our odds? If my odds improve, his must disimprove, since 1 of the 2 of us must hang?
Since [7] & [8] are mutually exclusive, switching cannot improve the odds for either of us. Therefore to switch or not to switch makes no difference! Q. E. D. |
Subject: RE: BS: Mathematical Probability Query From: Wolfgang Date: 12 Oct 00 - 05:01 AM Marion, I don't mind you asking. I'm a professor of psychology and the work I mentioned was a recent diploma thesis by one of my students. She explicitly told the subjects that she knew where the prize was and would use this knowledge to make sure that she only opened cups with no prize under them. If she hadn't told them this before the experiment, the subjects might have been right assuming that they could as well stick to their first choice. Why?
I try to make it understandable without mathematics (but you could calculate it by using the tables of events as has been done above). Often it works if you make the case more salient by using more extreme numbers. Let's take Jim Dixon's example from above: The very same problem arises in your prisoners example (the most often and best-studied problem in conditional probabilities). You have not told us enough to make the response unambiguous. First, the decision which prisoner has to die has to be not just random but random with equal probabilities (what would random with unequal probabilities look like? E.g., you'd have an urn with 100 red beads (you die), 700 black beads (co-prisoner B dies) and 200 green beads (co-prisoner C dies)). The two responses below are only correct under that assumption; otherwise it is much more tricky. But that's only a minor quibble. The major one is this. You (as prisoner) have to know beforehand that the next day definitely not you but another prisoner will be released, to come to the response you (Marion) think of, namely, that you should not be worried. If, however, you know that the next day one of the three prisoners (not necessarily you, but possibly you) will be released and the one released is not you, you should be worried, for now the probability of you dying goes up from 1/3 to 1/2. Wolfgang |
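[Editor's aside, not part of the thread: Wolfgang's distinction between the two release protocols can be simulated. A sketch (mine; the labels are arbitrary: you are prisoner 0, and in both protocols the news you receive is "prisoner 2 goes free").]

```python
import random

def prisoner_experiment(trials=100_000):
    """You are prisoner 0 of three; one prisoner (uniformly chosen) will be
    executed. Estimate P(you die | prisoner 2 is released) under two protocols."""
    # Protocol A: the jailer releases a prisoner he KNOWS is safe and is not you.
    # Protocol B: a prisoner is released uniformly at random from the safe ones,
    #             so the released one could have been you.
    a_cases = a_doomed = b_cases = b_doomed = 0
    for _ in range(trials):
        doomed = random.randrange(3)
        # Protocol A: release a safe prisoner among 1 and 2 (never you).
        released_a = random.choice([p for p in (1, 2) if p != doomed])
        if released_a == 2:
            a_cases += 1
            a_doomed += (doomed == 0)
        # Protocol B: release uniformly among ALL safe prisoners, you included.
        released_b = random.choice([p for p in (0, 1, 2) if p != doomed])
        if released_b == 2:
            b_cases += 1
            b_doomed += (doomed == 0)
    return a_doomed / a_cases, b_doomed / b_cases

random.seed(3)
a, b = prisoner_experiment()
print(f"jailer avoids you:           P(you die) = {a:.3f}")  # stays near 1/3
print(f"release could have been you: P(you die) = {b:.3f}")  # rises to about 1/2
```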
Subject: RE: BS: Mathematical Probability Query From: Wolfgang Date: 12 Oct 00 - 05:12 AM A new problem known in statistics as Simpson's paradox: Imagine you get a prize for drawing a black bead from an urn (without looking) and you always get the choice of two urns to draw from. In the first choice you have urn A with a single black bead (and no other bead) and urn B with 2 black beads and 1 white bead. You'd not hesitate long to see that urn A is the better choice for you. In the second choice you have urn A' with 4 black beads and 6 white beads and urn B' with 1 black and 2 white beads. You take a little longer to find that 4/10 is larger than 1/3 but again you'd opt for urn A'. Now the total contents of the A urns (without missing a bead) are put together in one urn A'' and the total contents of urns B are put together in urn B''. Which urn do you prefer now? Wolfgang |
Subject: RE: BS: Mathematical Probability Query From: Escamillo Date: 12 Oct 00 - 06:18 AM The extrapolation to 1000 cups helped me too. The event is divided in two stages. In the first, you have a probability of 1/1000 to choose the prized cup. The house is OBLIGATED to take 998 cups out and leave to you the alternative: "Did you win in the first stage, with 99.9 % probabilities against you, AND will win AGAIN with 50% against you, in two consecutive choices? Or did you lose first, with 99.9% probability of losing, and will now win with this 50%, in two consecutive choices?" Undoubtedly, and very intuitively, I would change! Moreover, I imagined a situation in which the house is allowed to change places of the prize before you make your second choice. This would become TWO games, one with 33.3% probabilities of winning, and another one with 50%. Since the house cannot change the place of the prize, and always discloses all remaining cups minus one, it is giving you very valuable information that changes your probability to win in the second stage. You can be sure that if I open a casino, I'll hide out the three cups for a moment before you attempt your second choice. Un abrazo - Andrés (and thanks to all for taking the trouble to explain this so well )
|
Subject: RE: BS: Mathematical Probability Query From: Crazy Eddie Date: 12 Oct 00 - 06:52 AM Mary in Kentucky, you said <...>.
Good one. To begin with, assume no rest breaks and constant speed for each journey (simplified version). Draw a line segment on a page; label one end A, the other B [A is the bottom of the hill, B the top]. Above the line, mark the monk's position at the zero point, day one 12 hours, day one 24 hours, day two, etc., using the 24-hour clock. BELOW the line do the same for the return journey. As you can see, the times increase left to right above the line, and right to left below. The times actually cross at two points. With rests, sleep etc., we cannot calculate exactly WHEN he will cross. (Let's take a ridiculously extreme example, where the hill is really short. Uphill he walks five minutes, rests almost 3 days, walks five minutes, end! On the return journey he rests for one day less!) However, if you draw the diagram for the simplified version above, it is clear that he MUST cross. Normally, he must cross TWICE; however, you can bring it down to crossing only once if he stays in one place for a full day. |
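[Editor's aside, not part of the thread: Crazy Eddie's two-walkers diagram is essentially the intermediate value theorem. A sketch (mine, with made-up constant-speed profiles; the exact schedules don't matter for the existence of a crossing).]

```python
def up_position(t):
    """Fraction of the path covered on the ascent (3-day climb, constant
    speed), t in days after the morning start."""
    return min(t / 3.0, 1.0)

def down_position(t):
    """Position on the same path during the descent (2-day walk), measured
    from the bottom, for the same clock time t after a morning start."""
    return max(1.0 - t / 2.0, 0.0)

def crossing_time(f, g, lo=0.0, hi=3.0, steps=600):
    """Scan for a sign change of f - g: an instant at which both journeys
    put the monk at the same spot at the same time of day."""
    prev = f(lo) - g(lo)
    for i in range(1, steps + 1):
        t = lo + (hi - lo) * i / steps
        cur = f(t) - g(t)
        if prev * cur <= 0:        # sign change (or exact hit): a crossing
            return t
        prev = cur
    return None

t = crossing_time(up_position, down_position)
print(f"same place at the same time of day, about {t:.2f} days in")
```

(With these particular profiles the crossing is at t/3 = 1 - t/2, i.e. t = 6/5 days; any other profiles just move the crossing, they cannot remove it.)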
Subject: RE: BS: Mathematical Probability Query From: AndyG Date: 12 Oct 00 - 06:53 AM Marion, If the house move is random then;
1) Player selects a cup. (sticks with choice, 1/3 chance to win)
If case three arises, and the player always switches, his chance to win is 2/3, but only in 2/3 of all the games played. (The house wins 1/3 of the games when it selects at (2).)
I hope that's right ;)
Wolfgang,
I hope that's right too ;)
AndyG |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Muray MacLeod Date: 12 Oct 00 - 07:09 AM OK Wolfgang, after the cups I am prepared to be wrong again, but it looks to me like urn A now has a 5/6 probability of black, and urn B has 3/4 probability of black. So urn A is still the preferred option. So where is the paradox? Sounds more like Homer Simpson's paradox to me. Murray |
Subject: RE: BS: Mathematical Probability Query From: GUEST Date: 12 Oct 00 - 08:00 AM ..and sometimes 2 + 2 = 5, for extremely large values of 2. |
Subject: RE: BS: Mathematical Probability Query From: Wolfgang Date: 12 Oct 00 - 08:34 AM B'', otherwise it would be no paradox. The odds(!) are 5/6 in A'' and 3/3 in B'', the probabilities are 5/11 and 3/6. This paradox has the awkward consequence that you can have four studies with the results pointing unanimously in one direction and when you pool the results to get a clearer picture the combined results point in the opposite direction. Scientists just hate such a situation.
To use a real-life example: When you are a parent of now-adult kids, or just watch the young generation growing, you are not surprised to learn that the average size (height) of male German students has increased since the 1950s. Similarly, the average size of female students in Germany has increased during the same time. However, if you pool the results to get the average size of all students in Germany, both male and female together, you'll find that the average size has decreased. The problem is most times not as easy to spot as in this example. There was even a court case in California in which a woman whose application to a Californian university was rejected sued for violation of equal opportunity (or whatever the term is) when she found out that the university rejected more females than males on a percentage base. She lost, because she fell prey to Simpson's paradox, when the university could show that (though the data pooled for the university showed a higher percentage of males being accepted) for each faculty separately there was even a small bias in favour of women. Jurists must hate that: If they want to do justice to equal opportunity on the university level, they necessarily do injustice to males at the faculty level. If they want to do justice on a faculty level, they must do injustice to women at the whole university level. And all that for purely mathematical reasons that do not bow to justice. Wolfgang |
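[Editor's aside, not part of the thread: Wolfgang's urn numbers check out directly. A quick sketch with exact fractions (bead counts as given in his earlier post).]

```python
from fractions import Fraction

def p_black(black, white):
    """Probability of drawing a black bead from an urn."""
    return Fraction(black, black + white)

A,  B  = (1, 0), (2, 1)             # first choice:  A wins, 1 > 2/3
A2, B2 = (4, 6), (1, 2)             # second choice: A' wins, 4/10 > 1/3
A3 = (A[0] + A2[0], A[1] + A2[1])   # A'' pooled: 5 black, 6 white
B3 = (B[0] + B2[0], B[1] + B2[1])   # B'' pooled: 3 black, 3 white

print(p_black(*A)  > p_black(*B))    # True:  A  beats B
print(p_black(*A2) > p_black(*B2))   # True:  A' beats B'
print(p_black(*A3) > p_black(*B3))   # False: pooled, B'' (1/2) beats A'' (5/11)
```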
Subject: RE: BS: Mathematical Probability Query From: Marion Date: 12 Oct 00 - 10:12 AM Ok, I'll have to think awhile about what it would mean if the house's choice is random. Thanks for the leads. Is anybody going to help me out and argue with Eddie over the red/white cards one? I think I still have to concentrate on my little cups. But about those prisoners... I'll put here my thinking on the problem, and why I think it doesn't contradict my stance on the cups, although at first analysis (that's you, Eddie!) it seems to. You might think that there is a 1/3 probability that you are doomed, and therefore a 2/3 probability that you aren't. Since the doomed person has already been chosen by the jailer, as I specified above, the release of a prisoner shouldn't change your chances, so the other prisoner is stuck with the 2/3 chance of being executed. The problem is, he's thinking the same about you. As Eddie says, there is a contradiction here. However, I would say that the real issue is in Eddie's 5th statement: "We each figure we have a 1/3 chance of being doomed." In fact, if the jailer has already decided who to execute, then each person's probability of being executed is either 1, or 0, not 1/3. Because the prisoners have no way of knowing whether they're a 1 or a 0, they perceive it as randomness, so they think their chances start at 1/3 then go to 1/2. But the execution will not be random, the victim has already been chosen. The probability that they perceive is different from the probability that is actually there. So psychologically, the two prisoners left are likely to be frightened by the third prisoner's release. But logically, if they know that the doomed one has already been chosen, they should consider it indifferent news. Marion
|
Subject: RE: BS: Mathematical Probability Query From: Mary in Kentucky Date: 12 Oct 00 - 10:23 AM Crazy Eddie - good job. This puzzle was designed to illustrate the differences in thinking between right brain and left brain processes. Using right brain thinking, you visualize a man starting at the bottom of the hill and another simultaneously starting at the top, and you quickly see that they must pass. A left brain thinker sometimes gets bogged down in verbal arguments, instantaneous snapshots, rates, etc. He cannot accept the simultaneous picture and has to somehow prove it with a linear projection of the two trips. You got the correct answer with what appears to be a combination of both. I had never thought about "two" instances of being at the same place at the same time (or time of day). I probably was not clear in the question...I really couldn't remember the exact riddle. |
Subject: RE: BS: Mathematical Probability Query From: Wolfgang Date: 12 Oct 00 - 11:16 AM Crazy Eddie, here's what you have overlooked in your solution of the three cards problem: The RR card will always slide out red face up; the RW card will slide out red face up only in 50% of all cases. If you take that into account, you'll come to the 2/3 vs. 1/3 solution as well. Your 50/50 solution would only be correct if the RW card is only ever shown red face up. Well, this hasn't been explicitly excluded in Murray's post, but my reading of this post tells me that it was fairly obvious that each card could land on each side. Marion, glad to help you here against Crazy Eddie, but I won't help you against his arguments in the prisoners problem. The way you have formulated it, Crazy Eddie is right. Wolfgang |
Subject: RE: BS: Mathematical Probability Query From: Bradypus Date: 12 Oct 00 - 05:10 PM The prisoner problem reminds me of one of my favourite stories / paradoxes:
A man is in prison on death row. The judge has told him that he will be executed before the end of the week, but won't tell him which day, only that it will come as a complete surprise to him. The man reasons: it can't be Friday, because if I'm still alive on Thursday night I'll know it must be Friday, so it would be no surprise. But with Friday ruled out, the same argument rules out Thursday, and so on back through the week. He concludes that he cannot be executed at all. Then the executioner comes for him midweek, and it comes as a complete surprise. Bradypus
|
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray MacLeod Date: 12 Oct 00 - 06:00 PM Bradypus, I love that one too; it is a real paradox, and MIND magazine had a lot of high-flying philosophers trying to solve it in the 70's. I think the solution has something to do with the invalidity of the initial premise, but never having completed my logic and metaphysics course, I probably wouldn't understand it anyway. Wolfgang, I am going to recheck the arithmetic on your one. And Marion, the time probably has come for a discussion of what exactly probability means. My take on it is that it is a function of the incompleteness of our information, and not really an attribute of the external world at all. IMHO it is actually incorrect to say "The probability of this coin landing heads is 1/2". The philosophically correct statement is "The probability of my being right if I call heads is 1/2". Murray |
Subject: RE: BS: Mathematical Probability Query From: Peter K (Fionn) Date: 12 Oct 00 - 06:23 PM Anyone who's got this far in the thread would appreciate, or will already have appreciated, the first few pages of Rosencrantz and Guildenstern Are Dead (as well as the rest of the play) by Tom Stoppard. Priceless. Marion, whether the house has prior knowledge about the cups is not a factor, except that if the prize cup is turned over, you lose your chance to reconsider. I know you didn't appreciate Jim's line on this, but it does help at the intuition level, so I'll use that to explain. Suppose the house invites you to pick the Ace of Spades out of a face-down pack of cards. You take a card but don't look at it. The house then turns over all but one of the remaining cards, without turning up the Ace of Spades. The house is not likely to achieve this feat without prior knowledge, but theoretically could. With or without prior knowledge, the house has made a dramatic intervention, which statistically you'd be daft to ignore. When you picked your card, there were fifty-one to one chances that the house still had the Ace of Spades. Nothing's changed, except that those 51 to one chances are now vested in the one remaining face-down card held by the house. The prisoner analogy has no bearing on the three-cup problem as no-one got to have a guess before the house intervened and changed the odds. |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray MacLeod Date: 12 Oct 00 - 06:42 PM Wolfgang, I see that I made not one but two extremely stupid arithmetical errors when attempting to calculate your paradox. My lame excuse is that it was 7.00 am, not my best time. I apologize for the Homer Simpson reference. DOH!! Murray |
Subject: RE: BS: Mathematical Probability Query From: GUEST,Murray MacLeod Date: 12 Oct 00 - 06:49 PM Fionn, my brain is starting to hurt again. My understanding of Marion's problem depends on the fact that the house DOES have prior knowledge, and is never going to turn over the cup containing the prize. Am I missing something? Murray |
Subject: RE: BS: Mathematical Probability Query From: Wolfgang Date: 13 Oct 00 - 04:57 AM Murray, I didn't understand the Homer Simpson reference (lack of cultural background) so there's no need at all for an apology. For everybody else: Murray's dead right on Marion's problem and Fionn is wrong. If the house knows where the prize is and always offers the claimant the chance to switch, then the answer to the question "what is the probability of winning if I switch?" is 2/3. If, however, the house always offers a switch but turns over a cup which (a) was not chosen in the first choice yet (b) may contain the prize (e.g., if the house doesn't know where the prize is, or knows but nevertheless opens the prize cup in 1/3 of all cases), then, given that the opened cup turns out to be empty, the answer to the same question is 1/2.
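(A later reader's note, not part of Wolfgang's post: his two scenarios are easy to check by simulation. The sketch below, in Python with function names of my own invention, plays the three-cup game both ways; the first function should return roughly 2/3, the second roughly 1/2.)

```python
import random

def monty_knows(trials=100_000):
    """House knows where the prize is and always opens an empty,
    unchosen cup. Returns the win rate for always switching."""
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Open a cup that is neither the pick nor the prize.
        opened = next(c for c in range(3) if c != pick and c != prize)
        # Switch to the one remaining cup.
        switched = next(c for c in range(3) if c != pick and c != opened)
        wins += (switched == prize)
    return wins / trials

def monty_ignorant(trials=100_000):
    """House opens a random unchosen cup, which may hold the prize.
    Conditions on the rounds where it happens to be empty."""
    wins = valid = 0
    while valid < trials:
        prize = random.randrange(3)
        pick = random.randrange(3)
        opened = random.choice([c for c in range(3) if c != pick])
        if opened == prize:
            continue  # prize revealed; discard this round
        valid += 1
        switched = next(c for c in range(3) if c != pick and c != opened)
        wins += (switched == prize)
    return wins / valid
```

Running both makes the distinction concrete: the host's knowledge (or lack of it) changes the conditional odds even though the visible situation, two closed cups and one empty one, looks identical.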
This puzzle is technically identical to a number of teasers in conditional probability, including the prisoners' problem. Other problems which are mathematically nearly identical include
the two dice problem: I throw two dice and tell you truthfully (after I had a look without you looking) "at least one of these two dice shows a six". What is the probability that the other also shows a six? Obvious response: 1/6; very counterintuitive (and correct) response: 1/11. All these problems can get much more complicated if you allow for unequal prior probabilities (e.g., one prisoner gets executed with prob 1/2, the next with prob 1/3 and the last with prob 1/6) or if you allow for biases (e.g., the father of a mixed pair of kids usually tells about his son, but sometimes about his daughter).
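(Again a reader's note rather than part of the post: the 1/11 answer falls straight out of a count. Of the 11 equally likely rolls that show at least one six, exactly one is double-six. A short Python simulation, assuming fair dice, confirms it.)

```python
import random

def two_dice(trials=200_000):
    """Among rolls where at least one die shows a six,
    return the fraction where both dice show a six."""
    both = at_least_one = 0
    for _ in range(trials):
        a = random.randrange(1, 7)
        b = random.randrange(1, 7)
        if a == 6 or b == 6:
            at_least_one += 1
            both += (a == 6 and b == 6)
    return both / at_least_one
```

The result hovers around 1/11 (about 0.091), not 1/6, because "at least one six" is information about the pair, not about a particular die.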
If you are really interested I can only advise you to read : |
Subject: RE: BS: Mathematical Probability Query From: Wolfgang Date: 13 Oct 00 - 05:15 AM More than 100 posts!!
|