The series on free will, probably, because it short-circuits a lot of fruitless contemplation of the meaning of life and such.
Nominull
The stuff on quantum mechanics. I don’t know that I’m a representative sample, though, because I know physicists.
We can easily construct a hypothetical truel worth fighting: consider the scenario of Three Worlds Collide, except with the three worlds at more comparable technology levels.
Poor guy. It was a reasonable comment… I upvoted you out of sympathy.
Sometimes you buy more than one thing at a time. The heuristic might be something along the lines of, if that store is 33% cheaper on that calculator it might be 33% cheaper on other things, and you might end up saving a lot more than $5. If the calculator is only 4% cheaper, your savings will only work out to near $5, which might not be worth your 20 minutes.
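The heuristic above can be sketched as a bit of arithmetic: project the one observed percentage discount across everything else you planned to buy. This is a toy illustration only; the function name and the dollar figures are my own assumptions, not from the comment.

```python
def projected_savings(item_price: float, discount_price: float,
                      planned_basket_total: float) -> float:
    """If one item is X% cheaper at the other store, project X%
    savings across the whole basket you planned to buy there."""
    discount_rate = (item_price - discount_price) / item_price
    return discount_rate * planned_basket_total

# A $15 calculator selling for $10 elsewhere is 33% cheaper; if you
# planned to spend $100 total, that projects to about $33 in savings,
# which may well be worth the 20-minute trip.
print(round(projected_savings(15, 10, 100), 2))

# A $125 calculator for $120 is only 4% cheaper, projecting about $4
# in savings over the same basket.
print(round(projected_savings(125, 120, 100), 2))
```

The point of the sketch is just that the same $5 absolute saving generalizes very differently depending on the percentage it represents.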
How related is this to the skill of “biting bullets”, of accepting the unintuitive consequences of one’s theories? (Robin Hanson is good at this; most memorably when he came out in support of a height tax as a consequence of optimal taxation theory) I had thought that a good rationalist should eat bullets for breakfast, but it seems to me now that this phenomenon is closer to what we should seek out. It occurs to me that sometimes your intuition is telling you important things.
If you know that the elect tend to live lives of virtue for various reasons, then the play is to live a life of virtue. If instead you know that the elect tend to live lives of virtue because they naturally want to, you may as well sin. I'm not familiar enough with Calvinism to say which of these is the case. (Well, neither is actually the case, of course; I mean, to say which of these they believe.)
The thing is, I could just as easily be one of the ten as the eleventh (actually, ten times as easily), so it’s in my interests to support a norm where the eleventh sacrifices for the good of the ten. I am in very little danger of starving to death in Africa.
It’s not pleasant, but it is true.
The absurdity heuristic does work well. Almost every possible absurd claim is false. Like most heuristics, it only becomes a problem when you continue using it outside its realm of usefulness.
A member of the biased sample embarrassedly raising his hand here. I'll go post in the other comments section, please don't yell at me ;_;
EDIT: Reading good science fiction, where the heroes win by being rational, might help.
When I was a little kid we would take car trips to visit my grandparents, and my father would borrow books on tape from the library. He borrowed Asimov’s “I, Robot”, which if you haven’t read it is basically “House, M.D.” except that instead of people you have robots and instead of Dr. House you have a pair of underpaid robot repairmen. It didn’t introduce any concepts of rationality directly, but in the book the heroes won by figuring things out, rather than by being strong or passionate or morally correct. It made figuring things out cool, and it turns out that if you want to figure things out, you use rationality.
Learning about the halo and horns biases has helped me make more accurate predictions about people's actions, and realize that my friends are terrible people and my enemies are pretty cool.
Hearing Eliezer’s solutions to philosophical problems has made me stop wasting so much time on those philosophical problems, which is an advantage I’ve gained from Overcoming Bias, but not really from increased rationality.
Are you so sure that good art isn’t destructive? It makes the rest of the world seem bland in comparison. When I started reading, at Eliezer’s recommendation, the classic work of literature Fate/Stay Night, I found I had trouble reading other, lesser books, because, who cares?
An analogy might be to the Nymphs of Dungeons and Dragons—people who gazed upon their beauty would go blind or even die from despair that they would never see something so beautiful again.
By the early teens, I would think, most of the battle has been won or lost. I recall that in my early teens I picked up Ayn Rand's The Fountainhead. My father wanted to stop me, but his concern was unnecessary: I put it down again soon. My mind had already built up defenses against poisoned data.
Encyclopedia Brown is an especially bad example. Most of the mysteries he solves, he solves by knowing some piece of minor trivia which contradicts some off-hand statement of the criminal. This promotes “rationality” as “knowing a lot of facts”, which is absolutely not what we’re trying to promote here, and provides the wrong model of problem solving. Encyclopedia Brown is based on formal logic, not Bayesian probability.
We’re running up against the equivocation at the core of this community, between rationalists as people who make optimal plays versus rationalists as people who love truth and hate lies.
I agree!
Heh.
EDIT: Okay, reversed stupidity is not intelligence, I have no right to feel self-satisfied at how incoherent this posting is. Damn, this stuff is hard.
Maybe you should read something written by somebody else sometime.
It’s hard to answer this question, given how much of your philosophy I have incorporated wholesale into my own, but I think it’s the fundamental idea that there are Iron Laws of evidence, that they constrain exactly what it is reasonable to believe, and that no mere silly human conceit such as “argument” or “faith” can change them even in the millionth decimal place.