I suspect literary labels do have something to do with the contents of a book, no matter how much nonsense might be attached to them.
I think Eliezer’s point was about the student. “Wulky Wilkinsen is a ‘post-utopian’” could be meaningful, if you know what a post-utopian is and is not (I don’t, and don’t care). The student who learns just the statement, however, has formed a floating belief.
We might even initially use propositional beliefs as stand-ins for meaningful beliefs about the world. But if we then discuss these highly compressed beliefs without referencing their meaning, we can feel as though we are reasoning when really we have ceased to speak about the world. That is, grounded beliefs can become “floaty” and spawn further “floaty” beliefs.
In my sociology class, we talk about how “Man in his natural state has liberty because everyone is equal”. “Natural state”, “liberty”, and “equal” could conceivably be linked to descriptions of social interaction or something. However, class after class we refrain from talking about specific behaviors. Concepts float away from their referents without much resistance—it’s all the same to the student, who only needs to make a few unremarkable remarks to earn his B+ for class participation. Compare:
“Man in his natural state has liberty because everyone is equal”
“Man in his natural state is equal because everyone has liberty”
“When everyone has liberty and is equal, man is in his natural state”
These statements should express very different beliefs about the world, but to the student they sound equally clever coming out of the professor’s mouth.
Shouldn’t the presentation suggest what sort of reasoning is actually going on in someone’s head while they are committing the sunk costs fallacy? For example, (skill 1, problem 12) Peggy is at a baseball game with her son Tim, who is bored and wants to leave. Peggy says, “You want to leave? Those tickets were expensive!” Her expressed thoughts are a fine example of the sunk costs fallacy. But she may really be thinking: “This event is the sort of thing normal people pay a lot to attend—Tim must not be thinking clearly. We should stay because he will enjoy the game, even though he says he’s bored.” If the presentation pointed out that (unconsciously) applying the scarcity heuristic can lead to or sound like the sunk costs fallacy, rationalists-in-training could more easily catch themselves committing the fallacy.
Also, learning to distinguish solid from fallacious reasoning, even when they produce the same conclusion, helps one correctly cash out the Teleporting Alien viewpoint. Tim responds to his mother, “We’ve already spent the money on the baseball tickets. I’m bored, so there’s no point in sitting through the game.” Peggy realizes that she is committing the sunk costs fallacy. She tells Tim, “You’re right. But this baseball game is expensive—which makes it likely that it is entertaining in some way. Let’s stay a little longer and see if something exciting happens.”
The obvious problem with trying to fully explain the reasoning in sunk costs situations is that you then have to explain those other biases and heuristics. But I think the exercises can briefly mention what to look out for and reference other kata while remaining focused on the particular skill at hand. And indeed, rationality skills are interdependent, which should be reflected even in a training course with a narrow scope.