Well, look at how you had to arrive at this example. There had to be iteration: a traveller was introduced, and the example had to be adjusted round after round so that this traveller is an ideal donor for 10 people, none of whom is a good donor for the remaining 9. We're down to probabilities easily below 10^-10, meaning 'not expected ever to have happened in the history of medicine'. (Whereas the number of worldwide cases where someone was killed for their organs is easily in the tens of thousands.)
The human immune system does not work so conveniently for your argument, so you'll have to drop the transplant example and come up with something else.
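To see why 10^-10 is, if anything, generous, here's a back-of-envelope sketch. The 10% per-patient match chance is a deliberately charitable number I'm making up for illustration; the real chance that a random stranger is an ideal donor for an unrelated patient is far lower:

```python
# Back-of-envelope sketch; p_match is a deliberately generous assumption,
# not a real figure (actual ideal-donor probabilities between unrelated
# strangers are far lower than 10%).
p_match = 0.1        # assumed chance a random stranger is an ideal donor
                     # for one given patient
n_patients = 10

p_all = p_match ** n_patients   # traveller matches all 10 at once
print(f"{p_all:.0e}")           # 1e-10, and this still ignores the extra
                                # condition that none of the 10 patients
                                # can donate to the other 9
```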
This should serve as a quite effective demonstration of how extremely rare such circumstances are. So rare that you cannot reason about them without the aid of another person who strikes down your example repeatedly, forcing you to refine it. At the same time, the cases where something like this is done for personal gain and then rationalized as selfless and altruistic are commonplace.
Giving those exceedingly improbable situations the same level of consideration as the much, much more probable ones is a case of extreme bias.
The issue with rare situations is that the false-positive rate can be dramatically larger than the rate at which the event actually happens, meaning that the majority of detected events are false positives. And when you carelessly increase the number of lives saved in your take-the-traveller-apart example, you are increasing the gain linearly while decreasing the probability of this bizarre histocompatibility coincidence exponentially.
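A quick sketch of both points, with illustrative numbers of my own (the false-positive rate here is just an assumption standing in for misdiagnosis, deception, delusion, and the like):

```python
# Illustrative numbers, not estimates from the argument above.
base_rate = 1e-10   # rate at which the real coincidence occurs
fp_rate   = 1e-4    # assumed rate at which the decision-maker wrongly
                    # believes it has occurred

# P(real | detected), by Bayes' rule, taking P(detect | real) = 1:
p_real = base_rate / (base_rate + fp_rate)
print(f"P(real | detected) = {p_real:.0e}")   # ~1e-6: nearly all
                                              # detections are false

# Gain grows linearly in the number of patients; the coincidence
# probability shrinks exponentially (10% per-patient match, as before):
p_match = 0.1
for n in (5, 10, 20):
    print(f"{n} lives saved, coincidence probability {p_match ** n:.0e}")
```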
If I heard that story from a doctor, I would think: what is the probability of this histocompatibility coincidence? Very, very low; I am guessing below 10^-10 (likely well below). What is the probability that the doctor is beginning to succumb to some delusional mental disorder? Far larger, on the order of 1/1000 to 1/10000. Meaning that when you hear such a story, there is still a very low probability that it is true, and the most likely explanation is that the doctor is simply nuts (and more likely still that he just made the entire thing up for the sake of argument or something). Meaning that it would be more optimal, from a utilitarian standpoint, to do nothing (on the belief that the story was entirely made up) or to call the police (on the belief that he actually killed someone, or is planning to). [Of course I would try to estimate the probabilities as reliably as I could before calling the police, to a far greater degree of confidence than in an argument here.]

edit: a good example, the ambulance story here: http://lesswrong.com/lw/if/your_strength_as_a_rationalist/
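In odds form, using the numbers above (and taking the conservative end of the delusion range):

```python
# Odds comparison between the two explanations, using the numbers above.
p_coincidence = 1e-10   # the story is true as told
p_delusion    = 1e-4    # conservative end of the 1/1000 .. 1/10000 range

odds = p_coincidence / p_delusion
print(f"odds(true : nuts) = {odds:.0e} : 1")   # ~1e-6 : 1, and the 'he
                                               # made it up' hypothesis
                                               # is more probable still
```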
Basically, thought experiments like that are usually just a way of forcing a person to make a mistake when reasoning non-verbally, in the hope that he won't be able to vocalize the mistake (or even realize he made one). In this case the mistake the example tries to trick the reader into is ignoring the error rate of the agent making the decision, in circumstances where that rate is BY FAR (by at least 6 orders of magnitude, I'd say, for the 10 patients) the dominating number in the utility equation.
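Put the agent's error rate back into that equation and the point is hard to miss. The payoff structure here is my own illustrative construction:

```python
# The utility equation with the agent's own fallibility included.
# All figures are illustrative assumptions.
p_real  = 1e-10   # the scenario is exactly as described
p_error = 1e-4    # the agent is deluded, deceived, or mistaken
gain    = 9       # net lives saved if everything is as described
loss    = -1      # an innocent traveller killed for nothing (at best)

eu = p_real * gain + p_error * loss
print(f"expected utility ~ {eu:.0e}")
# The error term (~1e-4) dwarfs the gain term (~1e-9) by five to six
# orders of magnitude; the error rate dominates, as claimed above.
```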
Likewise, the Chinese room tries to trick the reader into making an error of some 14 orders of magnitude. Such mind-bogglingly huge errors slip past reason; we are not accustomed to being this wrong.