This does not seem like an evolutionarily stable strategy.
pnrjulius
Do not copy the Blind Idiot God, for it lives much longer than you, and is a blind idiot.
The point is that they are giving a wrong answer to confuse you, to see if you really believe in Bayes’s Theorem or if instead you will just capitulate to the word of an authority.
I can’t see any flaws in the argument, but the conclusion is far more radical than most of us would be willing to admit.
Am I the sort of person who would value my computer over another human being’s life? I hope not; that makes me sound like the most horrible sort of psychopath—it is basically the morality of Stalin. But at the same time, did I sell my computer to feed kids in Africa? I did not. Nor did any of you, unless you are reading this at a library computer (in which case I’m sure I can find something you could have given up that would have allowed you to give just a little bit more to some worthy charitable cause).
It gets worse: Is my college education worth the lives of fifty starving children? Because I surely paid more than that. Is this house I’m living in worth eight hundred life-saving mosquito nets? Because that’s how much it cost.
Our entire economic system is based on purchases that would be “unjustified”—even immoral—on the view that every single purchase must be made on this kind of metric. And so if we all stopped doing that, our economy would collapse and we would be starving instead.
I think it comes down to this: Consequentialism is a lot harder than it looks. It’s not enough to use the simple heuristic, “Is this purchase worth a child’s life?”; no, you’ve got to carry out the full system of consequences—in principle, propagated to our whole future light cone. (In fact, there’s a very good reason not to ask that question: Because of our socialization, we have a taboo in our brains about never saying that something is worth more than a child—even when it obviously is.) You’ve got to note that once the kid survives malaria, he’ll probably die of something else, like malnutrition, or HIV, or a parasite infection. You’ve got to note that if people didn’t go to college and become scientific researchers, we wouldn’t even know about HIV or malaria or anything else. You’ve got to keep in mind the whole system of modified capitalism and the social democratic welfare state that makes your massive wealth possible—and really, I think you should be trying to figure out how to export it to places that don’t have it, not skimming off the income that drives it to save one child’s life at a time.
And if you think, “Ah ha! We’ll just work for the Singularity then!” well, that’s a start—and you should, in fact, devote some of your time, energy, and money to the Singularity—but it’s not a solution by itself. How much time should you spend trying to make yourself happy? How much effort should you devote to your family, your friends? How important is love compared to what you might be doing—and how much will your effectiveness depend on you being loved? We might even ask: Would we even want to make a Singularity if it meant that no one ever fell in love?
This is why I’m not quite a gung-ho consequentialist. Ultimately consequentialism is right, there can be no doubt about that; but in practical terms, I don’t think most people are smart enough for it. (I’m not sure I’m smart enough for it.) It might be better, actually, to make people follow simple rules like “Don’t cheat, don’t lie, don’t kill, don’t steal”; if everyone followed those rules, we’d be doing all right. (Most of the really horrible things in this world are deontic violations, like tyranny and genocide.) At the very least, the standard deontic rules are better heuristics than asking, “Is it worth the life of a child?”
[citation needed]
Actually I think I tend to do the opposite. I undervalue subgoals and then become unmotivated when I can’t reach the ultimate goal directly.
E.g. I’m trying to get published. Book written, check. Query letters written, check. Queries sent to agents, check. All these are valuable subgoals. But they don’t feel like progress, because I can’t check off the box that says “book published”.
The mutilation of male genitals in question is ridiculous in itself but hardly equivalent to the kind of mutilation done to female genitals.
Granted. Female mutilation is often far more severe.
But I think it’s interesting that when the American Academy of Pediatrics proposed allowing a female circumcision that really was just circumcision, i.e. cutting of the clitoral hood, people were still outraged. And so we see that even when the situation is made symmetrical, there persists what we can only call female privilege in this circumstance.
Feynman: FUCK THAT.
The probability of generating THAT SEQUENCE is enormously, nigh-incomprehensibly tiny.
The probability of generating A SEQUENCE LIKE THAT (one which appears patternless, which contains no useful information, which has a very high information entropy) is virtually 1.
If I generated another sequence and it turned out exactly identical to yours, that would indeed be compelling (indeed, almost incontrovertible) evidence that something other than random chance was at work.
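The distinction above can be sketched numerically. This is a toy simulation; the 100-flip length, the seed, and the "longest run" proxy for looking patternless are all my own assumptions, chosen just to illustrate the asymmetry between one exact sequence and the class of random-looking ones.

```python
import random

random.seed(0)
n = 100  # coin flips per sequence

# Probability of reproducing one SPECIFIC 100-flip sequence exactly:
p_exact = 2.0 ** -n  # astronomically small

# A crude proxy for "looks patternless": the longest run of identical
# flips stays short (truly random 100-flip sequences almost always
# have a longest run well under 12).
def longest_run(bits):
    best = run = 1
    for a, b in zip(bits, bits[1:]):
        run = run + 1 if a == b else 1
        best = max(best, run)
    return best

trials = 10_000
patternless = sum(
    longest_run([random.getrandbits(1) for _ in range(n)]) < 12
    for _ in range(trials)
)

print(p_exact)               # on the order of 10**-31
print(patternless / trials)  # close to 1
```

The same evidence (one particular jumble of flips) is thus nearly impossible as a target and nearly guaranteed as a type, which is why matching someone else’s exact sequence would be so striking.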
Right, and that’s exactly the point. She is your best possible partner—including being sentient, being intelligent, etc. I honestly have trouble seeing what’s wrong with that.
We seem to go to pretty dark places pretty fast once we tell ourselves it’s all right to lie to our children.
Also, while most people do grow out of Santa, they don’t seem to grow out of God; so the dress rehearsal apparently doesn’t ever become a performance.
The point is not that we will know everything someday; we probably won’t. (Indeed, on a certain definition, we already know we can’t, see also Uncertainty Principle, Halting Problem, etc.)
The point is that being unknowable is not a good thing. It’s a very, very bad thing, in fact, because we can’t control what we can’t understand. If we never understand cancer, cancer will keep killing us. If we didn’t understand astronomy, an asteroid could hit us at any time. If we never understand consciousness, we’ll never invent AI.
(Also, your specific example is awful. We know that homosexuality is not unique to humans; in fact it is found in over a thousand species and counting. If it’s not adaptive, it’s got to be vestigial; and in fact it’s probably adaptive. This is also morally irrelevant, but it’s something we do in fact know.)
We don’t need to imagine. We are in exactly this position with respect to consciousness.
In fact, I think that our laws are made precisely by people who don’t want us to go around optimizing our behavior to conform to the laws. Why? Because that prevents them from inserting hidden advantages for the people they like (or more specifically the people who pay them campaign contributions).
There’s simply no way to look at, say, the US tax code, or Dodd-Frank, and think, “These are laws designed to be sensible and consistently followed.” It’s much more obvious from trudging through their verbal muck that these are laws designed to be incomprehensible and strategically broken.
I assume that what you are referring to are some of the laws encountered in the Old Testament, which were part of a legal structure designed to apply to the Israelite nation (and no one else, point of interest). From a Judaism perspective, the law is supposed to apply only to Jews—those who are part of the religion and the race.
Yes, because murder and genocide make perfect sense as long as you restrict them to a particular place and time! And there are such things as “races” and it makes sense for them to be units of moral analysis. And obviously “she must marry her rapist” (Deuteronomy 22:28-29) is a totally sensible rule for an ancient culture, and neither the Greeks nor the Chinese had figured out anything even remotely better by that time period. Yes, obviously, it was totally fair for Moses to be talking about slaughtering the Amalekites (and their children, and their cattle; Deuteronomy 20:16-17) at the same time in history when Demosthenes and Epicurus were debating about the proper form of democratic government. And no one today takes those ideas seriously, and certainly there aren’t millions of Americans who use passages from Leviticus (18:22 and 20:13) to argue against gay marriage.
And of course Jesus came to change the rules; that’s why he put it so plainly in Matthew 5:17-19:
Think not that I am come to destroy the law, or the prophets: I am not come to destroy, but to fulfil. For verily I say unto you, Till heaven and earth pass, one jot or one tittle shall in no wise pass from the law, till all be fulfilled. Whosoever therefore shall break one of these least commandments, and shall teach men so, he shall be called the least in the kingdom of heaven: but whosoever shall do and teach them, the same shall be called great in the kingdom of heaven.
\end{sarcasm}
Witness what religion does to a human mind; it makes an otherwise intelligent and reasonable person defend the obviously indefensible, because they cannot bear to accept the obvious fact that what they were told isn’t true. Suddenly genocide becomes “a different time” and rape becomes “their culture”, because the thought that so many people’s precious beliefs are false is simply too much to bear. Contradictions in holy books are somehow seen as a good thing, because they let you take whatever meaning you want and declare the result infallible (when it’s obvious from basic logic that contradictions in beliefs are always bad).
Of course no religion is harmless. Delusions are never a good thing. Some religions are more harmful than others, I’ll grant you that; but if you want to know why Islam is particularly bad, it’s because it actually follows the book. Jews and Christians have largely given up on the crazy evil books, and so they can behave (mostly) like reasonable human beings. Muslims haven’t, and that’s why they do things like hang gay people and keep women covered head to toe. Confucians are an interesting case, in that their books contain falsehoods, but are not genocidally insane, so that counts for something. Jains are also crazy, but crazy in a way that makes them relatively harmless—like the Amish. So if I could make every Muslim in the world suddenly turn Jain, I would; but I’d rather turn them atheist. What’s more, I find it’s easier to make people atheist, because the rational part of their brains already wants to.
As for what evil means, no, it has nothing to do with religion (other than the obvious fact that religion makes assertions about it, just as religion makes unfounded assertions about literally everything). Evil is found in human suffering, particularly when it could be easily prevented. It is found in death and destruction, especially when we are in a position to avoid them. Am I a utilitarian? Yes, I suppose I am—if you are not, you must be saying that your decisions can’t be made to fit a von Neumann-Morgenstern utility function… and isn’t that a lot like saying your decisions are irrational? If you meant to say that human beings rarely engage in intentional evil (accidental and negligent evil is far more common), that’s actually a very good point; but then, this is just one more problem with religion, because religion often asserts that our enemies are servants of demons whose only goals are pure evil.
The net amount of human suffering would be decreased if people abandoned religion altogether. If they continued to believe in religion and stopped being hypocrites, no… I stand by my previous claim. They would burn people like me and most of the rest of Less Wrong at the stake. The war between Catholics and Protestants in Ireland would flare up once again, and really, if theology is as important as people say, even Baptists and Methodists should be torturing each other over doctrinal differences. It is only a lack of religious fervor that defends civilization as we know it; and if given the choice between fanaticism and hypocrisy, I wholeheartedly express my preference for hypocrisy.
Am I the only one who didn’t fall for it, and actually said “Wait, they didn’t say it was THE Frodo, so really it’s probably just some dude named Frodo and he probably wears a size 32/30”? I think it’s actually a result of what we might call the “Ackbar effect” (it’s a trap!): when presented with something that you expect to be an optical illusion, lateral-thinking puzzle, evidence of bias, etc., you immediately question your intuitive response or even force yourself into answering differently from your intuition. “I know those lines don’t look parallel… but it’s an optical illusion, so they must be.” (You could then fool such people by giving them “illusions” that aren’t; e.g. the lines don’t look parallel because they really aren’t parallel.)
I agree completely. If you didn’t read the references or notice the date, the article seems completely legitimate. It makes a couple weird claims (fictional drugs?), but if you didn’t know the literature they wouldn’t necessarily seem any stranger than the actual things people do (like anchoring their estimate of a car’s value to their social security number). Remember that the absurdity heuristic is not a very good mode of reasoning!
So this means that while people who know Less Wrong can have a little inside joke, people who are new to rationalism and the behavioral sciences could easily be fooled.
I’m not sure I would call it “oppression”, but it’s clearly true that heterosexual men are by far the MOST controlled by restrictive gender norms. It is straight men who are most intensely shoehorned into this concept of “masculinity” that may or may not suit them, and their status is severely downgraded if they deviate in any way.
If you doubt this, imagine a straight man wearing eye shadow and a mini-skirt. Compare to a straight woman wearing a tuxedo.
See the difference?
But we know that he was unusual: he had a very high IQ. This by itself raises the probability of his being a math crank (it also raises the probability of his being a mathematician, of course).
It’s similar to how our LW!Harry Potter has increased chances of being both hero and Dark Lord.
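The point that one piece of evidence can raise two rival hypotheses at once falls straight out of Bayes’s Theorem. Here is a toy update; every number below is a made-up assumption for illustration, not a real estimate of crank or mathematician base rates.

```python
# Hypothetical priors and likelihoods (illustrative only):
# E = "has a very high IQ" is predicted more strongly by both rival
# hypotheses than by the background population.
priors = {"crank": 0.001, "mathematician": 0.001, "neither": 0.998}
likelihood = {"crank": 0.5, "mathematician": 0.8, "neither": 0.02}

# P(E) = sum over hypotheses of P(H) * P(E | H)
evidence = sum(priors[h] * likelihood[h] for h in priors)

# Bayes: P(H | E) = P(H) * P(E | H) / P(E)
posteriors = {h: priors[h] * likelihood[h] / evidence for h in priors}

for h in posteriors:
    print(h, round(posteriors[h], 4))
```

Both the “crank” and “mathematician” posteriors end up well above their priors, even though the two hypotheses compete with each other; all the probability mass they gain comes out of “neither”.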
I really like “The Eraser of Mistakes Undone” for some reason.
We should name all our mundane magic this way. “The Car of Traveling” “The Airplane of Flying Metal” “The Laptop of Encapsulated Thought”