I should add that this is true about self-contradictory religions as well. For the probability that I mistakenly interpret the religion to be self-contradictory is greater than the probability that the chocolate cake is out there.
Steve, maybe this was your point anyway, but the incidents you mention indicate that the existence of flying saucer cults is evidence for the existence of aliens (namely, by showing that the cults were based on seeing something in the real world). No doubt they aren’t much evidence, especially given the prior improbability, but they are certainly evidence.
Caledonian, one reason to do that is that everyone has daft beliefs once in a while. It isn’t surprising that you ask the question, however, since you show no respect for those with whom you disagree on Overcoming Bias. Since you disagree with them, you presumably think that their beliefs are false, and consequently (according to your logic) that they themselves are unworthy of respect.
Caledonian, if you add the premise that some people should be respected, it is a logically valid and necessary conclusion (given that all people at some point have daft beliefs) that not all people who hold daft beliefs should be disrespected.
However, that is certainly not the best I can do. I could think of a long list of reasons for respecting such people: much the same reasons why you would do much better to show some respect for the authors and readers of Overcoming Bias. For one thing, you would have a much better chance of persuading them of your position, and if your position is true, then this is a very good effect. For another, everyone (even people holding daft beliefs) also knows some things that other people don’t know, so if your respect is based on knowledge (as you seem to imply) then everyone should be respected. One last obvious point: most people will not respect someone who does not respect them: this is why you are not respected by those in this forum. But perhaps you enjoy this anyway?
Caledonian, can you give an example of someone who has never held a daft belief?
Other than yourself, of course, since such a suggestion would seem to indicate bias. On the other hand, your disrespect towards all others with whom you disagree (which seems to be everyone on some topic or other) seems to suggest that you believe that they all hold daft beliefs.
Caledonian, of course that cannot be demonstrated. But who needs a demonstration? Larry D’anna said, “A googolplex of dusty eyes has the same tiny negative utility as one dusty eye as far as I’m concerned.” If this is the case, does a billion deaths have the same negative utility as one death?
To put it another way, everyone knows that harms are additive.
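To make the disagreement concrete, here is a minimal sketch in Python, with purely arbitrary disutility numbers, contrasting additive aggregation with the saturating rule in the quoted view:

```python
# A minimal sketch with arbitrary, hypothetical disutility values.

def additive_disutility(per_victim: float, victims: int) -> float:
    """Harms add up: total harm scales with the number of victims."""
    return per_victim * victims

def saturating_disutility(per_victim: float, victims: int) -> float:
    """The quoted view: extra victims add nothing to the total."""
    return per_victim if victims > 0 else 0.0

# One death vs. a billion deaths, at an arbitrary disutility of 1.0 per death:
print(additive_disutility(1.0, 1), additive_disutility(1.0, 10**9))
# -> 1.0 1000000000.0
print(saturating_disutility(1.0, 1), saturating_disutility(1.0, 10**9))
# -> 1.0 1.0  (a billion deaths score no worse than one)
```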
Ben, I think you might not have understood what I was saying about the poll. My point was that each individual is simply saying that he does not have a problem with suffering a dust speck to save someone from torture. But the issue isn’t whether one individual should suffer a dust speck to save someone, but whether the whole group should suffer dust specks for this purpose. And it isn’t true that the whole group thinks that the whole group should suffer dust specks for this purpose. If it were, there wouldn’t be any disagreement about this question, since I and others arguing this position would presumably be among the group. In other words, your argument from hypothetical authority fails because human opinions are not in fact that consistent.
Suppose that 1 person per googol (out of the 3^^^3 persons) is threatened with 50 years of torture. Should each member of the group accept a dust speck for each person threatened with torture, thereby burying the whole group in dust?
I agree that, as you have defined them, both positions have problems. But I don’t agree that the problems are equal, for the reason stated earlier. Suppose someone says that the boundary is that 1,526,216,123,000,252 dust specks are exactly equal to 50 years of torture (in fact, the boundary is likely to be some relatively low number like this rather than anything like a googolplex). It is true that proving this would be a problem. But it is no particular problem that 1,526,216,123,000,251 dust specks would be preferable to the torture, while 1,526,216,123,000,253 dust specks would be worse than the torture: the point is that the torture would differ from each of these values by an extremely tiny amount.
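To see why the nearby values pose no particular problem, here is a sketch that takes the hypothetical boundary above at face value and assigns an arbitrary disutility of 1 per speck:

```python
# Hypothetical boundary: T specks are exactly as bad as 50 years of torture.
T = 1_526_216_123_000_252
SPECK = 1.0                # arbitrary disutility of one dust speck
TORTURE = T * SPECK        # by hypothesis, the torture equals T specks

def specks_preferable(n: int) -> bool:
    """True if n dust specks are strictly preferable to the torture."""
    return n * SPECK < TORTURE

print(specks_preferable(T - 1))  # True: just under the boundary
print(specks_preferable(T + 1))  # False: just over the boundary
# Either way, the difference from the torture is one speck's worth of
# disutility, an extremely tiny amount.
```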
But suppose someone defines a qualitative boundary: pain of 1,525,123 degrees (given some sort of measure) has an intrinsically worse quality than pain of 1,525,122 degrees, such that no amount of the latter can ever add up to the former. It seems to me that this creates a problem which doesn’t exist in the other case, namely that a trillion people suffering pain of 1,525,122 degrees for a trillion years is said to be preferable to one person suffering pain of 1,525,123 degrees for one year.
In other words: both positions have difficult-to-find boundaries, but one directly contradicts intuition in a way the other does not.
Ben and Mitchell: the problem is that “meaningless inconvenience” and “agony” do not seem to have a common boundary. But this is only because there could be many transitional stages such as “fairly inconvenient” and “seriously inconvenient,” and so on. But sooner or later, you must come to stages which have a common boundary. Then the problem I mentioned will arise: in order to maintain your position, you will be forced to maintain that pain of a certain degree, suffered by any number of people and for any length of time, is worse than a very slightly greater pain suffered by a single person for a very short time. This may not be logically incoherent but at least it is not very reasonable.
I say “a very slightly greater pain” because it is indeed evident that we experience pain as something like a continuum, where it is always possible for it to slowly increase or decrease. Even though it is possible for it to increase or decrease by a large amount suddenly, there is no necessity for this to happen.
The fact that Eliezer has changed his mind several times on Overcoming Bias is evidence that he expends some resources overcoming bias; if he didn’t, we would expect exactly what you say. It is true that he hasn’t changed his mind often, so this fact (at least by itself) is not evidence that he expends many resources in this way.
Eliezer’s question for Paul is not particularly subtle, so I presume he won’t mind if I give away where it is leading. If Paul says yes, there is some number of dust specks which add up to a toe stubbing, then Eliezer can ask if there is some number of toe stubbings that add up to a nipple piercing. If he says yes to this, he will ultimately have to admit that there is some number of dust specks which add up to 50 years of torture.
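The shape of that argument can be sketched in a few lines of Python; the exchange rates below are invented purely for illustration, but any finite rates will compose the same way:

```python
# Invented exchange rates between adjacent harms; only finiteness matters.
specks_per_toe_stub   = 1_000      # specks that add up to one toe stubbing
stubs_per_piercing    = 1_000      # toe stubbings per nipple piercing
piercings_per_torture = 10**9      # piercings per 50 years of torture

# Finite rates compose, so some finite number of specks equals the torture:
specks_per_torture = (specks_per_toe_stub
                      * stubs_per_piercing
                      * piercings_per_torture)
print(specks_per_torture)  # 10**15 specks, on these made-up numbers
```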
Rather than actually going down this road, however, perhaps those who wish to say that the dust specks are always preferable to the torture should consider the following facts:
1) Some people have a very good imagination. I could personally think of at least 100 gradations between a dust speck and a toe stubbing, 100 more between the toe stubbing and the nipple piercing, and as many as you like between the nipple piercing and the 50 years of torture.
2) Arguing about where to draw the line, the point at which the lesser pain can never add up to the slightly greater pain, would look a lot like creationists arguing about which transitional fossils are merely ape-like humans, and which are merely human-like apes. There is a point in the transitional fossils where the fossil is so intermediate that 50% of the creationists say that it is human, and 50% that it is an ape. Likewise, there will be a point where 50% of the Speckists say that dust specks can add up to this intermediate pain, but the intermediate pain can’t add up to torture, and the other 50% will say that the intermediate pain can add up to torture, but the specks can’t add up to the intermediate pain. Do you really want to go down this path?
3) Is your intuition about the specks being preferable to the torture really stronger than the intuition you violate by positing such an absolute division? Suppose we go down the path mentioned above, and at some point you say that specks can add up to pain X, but not to pain X+.00001 (a representation of the minutely greater pain in the next step, if we choose a fine enough division). Do you really want to say that you prefer that a trillion persons (or a googol, or a googolplex, etc.) suffer pain X than that one person suffer pain X+.00001?
While I was writing this, Paul answered no, the specks never add up to a toe stub. This actually suggests that he rounds the speck down to nothing: you don’t even notice it. Remember, however, that Eliezer originally posited that you feel the irritation for a fraction of a second. So there is some pain there. In any case, Paul’s answer to this question is simply a step down the path laid out above. I would like to see his answer to the rest of it. Remember the (at minimum) 100 gradations between the dust speck and the toe stub.
Caledonian, offering an alternative explanation for the evidence does not imply that it is not evidence that Eliezer expends some resources overcoming bias: it simply shows that the evidence is not conclusive. In fact, evidence usually can be explained in several different ways.
Paul : “Slapping each of 100 people once each is not the same as slapping one person 100 times.”
This is absolutely true. But no one has said that these two things are equal. The point is that it is possible to assign each case a value, and these values are comparable: either you prefer to slap each of 100 people once, or you prefer to slap one person 100 times. And once you begin assigning preferences, in the end you must admit that the dust specks, distributed over multiple people, are preferable to the torture in one individual. Your only alternatives to this will be to contradict your own preferences, or to admit to some absurd preference such as “I would rather torture a million people for 49 years than one person for 50.”
Ben: as I said when I brought up the sand example, Eliezer used dust specks to illustrate the “least bad” bad thing that can happen. If you think that it is not even a bad thing, then of course the point will not apply. In this case you should simply move to the least thing which you consider to be actually bad.
Eisegetes, would you pull the lever if it would stop someone from being tortured for 50 years, but inflict one day of torture on each human being in the world? And if so, how about one year? or 10 years, or 25? In other words, the same problem arises as with the specks. Perhaps you can defend one punch per human being, but there must be some number of human beings for whom one punch each would outweigh torture.
Salutator, I never said utilitarianism is completely true.
Also: I wonder if Robin Hanson’s comment shows concern about the lack of comments on his posts?
Adam, by that argument the torture is worth 0 as well, since after 1,000,000 years, no one will remember the torture or any of its consequences. So you should be entirely indifferent between the two, since each is worth zero.
Ben: suppose the lever has a continuous scale of values between 1 and 3^^^3. When the lever is set to 1, 1 person is being tortured (and the torture will last for 50 years). If you set it to 2, two people will be tortured by an amount less than the first person’s by 1/3^^^3 of the difference between the 50 years and a dust speck. If you set it to 3, three people will be tortured by an amount less than the first person’s by 2/3^^^3 of that difference. Naturally, if you pull the lever all the way to 3^^^3, that number of people will suffer the dust specks.
Will you pull the lever over to 3^^^3? And if so, will you assert that things are getting better during the intermediate stages (for example, when you are torturing a googol persons by an amount less than the first person’s by an entirely insignificant quantity)? And if things first get worse and then get better, where does it change?
Will you try to pull the lever over to 3^^^3 if there’s a significant chance the lever might get stuck somewhere in the middle?
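For concreteness, here is a runnable sketch of the lever, with N standing in for 3^^^3 (which is far too large to compute with) and arbitrary pain values at the two endpoints:

```python
# A scaled-down lever: setting 1 tortures one person at full intensity;
# setting N gives N people a dust speck each. All values are arbitrary.
N = 10**6                  # stand-in for 3^^^3
TORTURE_PAIN = 1.0         # arbitrary intensity of 50 years of torture
SPECK_PAIN = 1e-15         # arbitrary intensity of one dust speck

def lever(k: int) -> tuple[int, float]:
    """At setting k, k people each suffer the returned pain intensity."""
    pain = TORTURE_PAIN - (k - 1) / (N - 1) * (TORTURE_PAIN - SPECK_PAIN)
    return k, pain

print(lever(1))    # (1, 1.0): one person, the full torture
print(lever(2))    # two people, each hurt imperceptibly less
print(lever(N))    # (1000000, ~1e-15): everyone gets a dust speck
```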
To your voting scenario: I vote to torture the terrorist who proposes this choice to everyone. In other words, asking each one personally, “Would you rather be dust specked or have someone randomly tortured?” would be much like a terrorist demanding $1 per person (from the whole world), otherwise he will kill someone. In this case, of course, one would kill the terrorist.
I’m still thinking about the best way to set up the lever to make the point the most obvious.
In general, any claim maintained by even a single human being to be true will be more probable, simply based on the authority of that human being, than some random claim such as the chocolate cake claim, which is not believed by anyone.
There are possibly some exceptions to this (and possibly not), but in general there is no particular reason to include religions as exceptions.