I took the survey.
Unknowns
It actually is not very odd for there to be a difference like this. Given that there are only two sexes, there only needs to be one hormone which is sex determining in that way. Having two in fact could have strange effects of its own.
Time Turners use units of one hour, so at least some kinds of magical items use our units.
Robin Hanson notes that the existence of a stock market can also give rise to an incentive to e.g. bomb a company’s offices, yet such things very rarely actually happen.
Given your description right here (“one of the rapists”, instead of just saying a name was posted), the textual similarities between the webpage and your post about connection theory (such as the numbered lists and the wandering descriptions), and your stated presence in the Bay Area, I am estimating a 95% chance you are the author of the website.
I agree it’s kind of ironic that multi has such an overconfident probability assignment right after criticizing you for being overconfident. I was quite disappointed with his response here.
Argument 2 is correct. When the person replies “Yes” after choosing randomly, you learn not only that he has the ace of spades, but also that on one trial, he selected it after choosing randomly from his aces. This makes the combinations “AS, 2C” and “AS, 2D” more probable than “AS, AH”, since the first two combinations give a 100% chance of a positive response, while the third gives only a 50% chance. So each of the first two combinations is twice as likely as the third, and the probability of the third combination, namely two aces, goes to 1/5.
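The 1/5 figure can be checked by direct enumeration. A minimal sketch in Python, assuming the three AS-containing hands are equally likely a priori (the hand labels are mine):

```python
from fractions import Fraction

# The three equally likely hands containing the ace of spades,
# each mapped to P("Yes") — the chance that an ace chosen at random
# from that hand turns out to be the ace of spades.
hands = {
    ("AS", "2C"): Fraction(1),     # only ace is AS, so "Yes" is certain
    ("AS", "2D"): Fraction(1),
    ("AS", "AH"): Fraction(1, 2),  # AS is picked only half the time
}

prior = Fraction(1, 3)             # equal priors over the three hands

# Bayes' rule: P(hand | "Yes") = prior * P("Yes" | hand) / P("Yes")
total = sum(prior * p_yes for p_yes in hands.values())
posterior_two_aces = prior * hands[("AS", "AH")] / total
print(posterior_two_aces)          # 1/5
```

The exact-fraction arithmetic makes it easy to see that the two-ace hand gets exactly half the weight of each of the other hands, hence 1/5.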
Your prediction is overconfident. Less than 95% of unhappy marriages end in divorce.
“It would be very uncomfortable to have discussions that on the surface appear to be about the nature of reality, but which really are about something else, where the precise value of ‘something else’ is unknown to me.”
Indeed. I agree, although I find it extremely uncomfortable even when the something else is known to me.
For example, once I had a discussion with someone which seemed to be going pretty well, and which fully appeared to be about the nature of reality, and which we were both enjoying. Then at one point in the discussion I said something like “you know, the reason I thought X was true was because of Y”, where X and Y had some reference to another discussion that we had once held and in which we had disagreed.
The person responded, “now you’re ruining everything!!”
Why was I “ruining everything”? The reason is that I misunderstood the point of the discussion. I thought it was about the nature of reality. But, in fact, they simply intended it as a discussion about the relationship between the two of us, and they understood my reference to a discussion in which we disagreed as something harmful to the relationship.
In the end I have come to the very uncomfortable conclusion that at some level, most conversations are like this, and are not about the nature of reality even when they appear to be, and that most people in fact either never or almost never engage in conversations which are actually about the truth of the matter. And the result is that in most conversations I feel like I am speaking with aliens—although the truth may be that the aliens here are the people like us who are actually concerned with reality, and the others are normal human beings.
There is consequently a problem with your four options, although I would say that the third is basically true. It is not that people think that “truth” means something other than “correspondence with reality.” If you ask them what they mean, they will say it means that, and they will disagree with any other definition. But the very discussion about the meaning of truth is not about the nature of reality, while your attempt to resolve the problems by discussing the nature of truth is meant to be about reality. So when you engage in this discussion you will be at cross purposes, and you will not be able to resolve anything. Nor will you be able to show people that they are unable to have a discussion about the nature of reality; they will be equally and similarly unable to accept that very truth, precisely because they are unable to have a discussion like that.
Basically I think Robin Hanson has it right with his definition of human beings as “homo hypocritus.” In theory people claim to accept the correspondence theory of truth, but this is basically hypocrisy, and precisely for that reason, people are unable to have the kind of discussion you want; they will never understand this, nor the reason for it, and you can never explain it to them.
One problem I have with your argument here is that you appear to be saying that if XiXiDu doesn’t agree with you, he must be stupid (the stuff about low g etc.). Do you think Robin Hanson is stupid too, since he wasn’t convinced?
People have already been cured of complete blindness from birth (e.g. http://www.newyorker.com/tech/elements/people-cured-blindness-see ). Not much seems to be revealed by this, especially since in the Mary’s room experiment, she is supposed to know everything there is to know about color in advance, and obviously this is not true in such cases.
Trying to “convince” yourself is doing it wrong. Say it out loud, say it to yourself (and avoid saying the opposite to yourself), and perform the external actions that someone normally does who believes it.
This is how people get “belief in belief” even when it is contrary to their evidence, and it works.
Let’s pick an example. How probable do you think it is that Islam is a true religion? (There are several ways to take care of logical contradictions here, so saying 0% is not an option.)
Suppose there were a machine—for the sake of tradition, we can call it Omega—that prints out a series of zeros and ones according to the following rule. If Islam is true, it prints out a 1 on each round, with 100% probability. If Islam is false, it prints out a 0 or a 1, each with 50% probability.
Let’s run the machine… suppose on the first round, it prints out a 1. Then another. Then another. Then another… and so on… it’s printed out 10 1s now. Of course, this isn’t so improbable. After all, there was a 1/1024 chance of it doing this anyway, even if Islam is false. And presumably we think Islam is more likely than this to be false, so there’s a good chance we’ll see a 0 in the next round or two...
But it prints out another 1. Then another. Then another… and so on… It’s printed out 20 of them. Incredible! But we’re still holding out. After all, million-to-one chances happen every day...
Then it prints out another, and another… it just keeps going… It’s printed out 30 1s now. Of course, it did have a chance of one in a billion of doing this, if Islam were false...
But for me, this is my lower bound. At this point, if not before, I become a Muslim. What about you?
You’ve been rather vague about the probabilities involved, but you speak of “double digit negative exponents” and so on, even saying that this is “conservative,” which implies possibly three-digit exponents. Let’s suppose you think that the probability that Islam is true is 10^-20; this would seem to be very conservative, by your standards. According to this, to get an equivalent chance, the machine would have to print out 66 1s.
If the machine prints out 50 1s, and then someone runs in and smashes it beyond repair before it has a chance to continue, will you walk away, saying, “There is at most a 1 in 60,000 chance that Islam is true”?
If so, are you serious?
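The arithmetic behind these figures can be checked with a short Bayes update. A sketch, taking the illustrative 10^-20 prior above at face value (the function name is mine):

```python
from fractions import Fraction

def posterior_after_ones(prior, n):
    """P(Islam true | the machine printed n 1s in a row).

    Likelihood of the data: 1 if Islam is true (the machine always
    prints 1), 2**-n if false (each round is a fair coin)."""
    like_true = Fraction(1)
    like_false = Fraction(1, 2) ** n
    return (prior * like_true) / (prior * like_true + (1 - prior) * like_false)

prior = Fraction(1, 10**20)
print(float(posterior_after_ones(prior, 50)))  # ≈ 1.1e-05, under 1 in 60,000
print(float(posterior_after_ones(prior, 66)))  # ≈ 0.42, roughly even odds
print(float(posterior_after_ones(prior, 67)))  # ≈ 0.60, now more likely than not
```

With this prior the posterior first crosses 1/2 at 67 consecutive 1s, since 2^66 ≈ 7.4 × 10^19 falls just short of 10^20, which matches the rough figure of 66 cited above.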
You’re even more overconfident than Eliezer. Even he didn’t say that the probability of guilt should be less than 10%.
Also, you ignored the evidence of the scene being rearranged. As far as I can tell, there was substantial evidence of this, and substantial evidence of it being done by someone other than RG. This implies substantial evidence that someone else was involved. Even if this doesn’t necessarily imply AK is guilty, it definitely implies a probability higher than the original prior (which itself would be much, much higher than the probability you assign of 1 in 100,000, given the proximity of the persons).
Basically, you are overconfident if you assign less than 10% chance of guilt. And the fact that your opinion is much more extreme than anyone else’s doesn’t show that you are more rational, but is very strong Bayesian evidence of overconfidence bias on your part, since it is well known that humans are naturally overconfident, not underconfident.
Of course, the AI realizes that its programmers did not want it to do what the programmers intended, but rather what the CEV intended, so this response fails completely.
I don’t see how this study does any good unless first they measure the rate at which people actually match the stereotypical preconceptions and then compare this with the two average ratings. Otherwise it is possible the people were becoming less biased, not more.
It is clear that if enough people use this method, prices published for a two part flight will stop being cheaper than prices published for one part of that flight. It isn’t clear one way or another (at least to me) whether or not the overall consequences would be worse.
A more common example of the same thing: it is frequently the case that a round trip ticket is cheaper than a one-way ticket, and for this reason it is not uncommon for people to buy a round trip ticket and only use the first half. If people did this frequently enough, you would no longer see round trip tickets that are cheaper than one-way tickets. It is not obvious at all that things would be overall worse for people, rather than better.
I assign a 99.999999% probability to the same thing, i.e. that there are more male readers in the world than there are male readers of LW in the world.
Scott Alexander.