I went to a martial arts class, jiu jitsu, and before they taught me anything else they taught me how to break falls safely. Same with parkour class. You’re going to fall, they said. You need a way to catch yourself without fucking up your arms or back. It’s not just from mistakes when you’re learning a new move, either, though those will certainly happen more often then. You’re throwing yourself all over the place, tripping each other; you’re going to hit the ground with momentum. You need to know how to handle yourself when that happens, how to roll with it and get up right after, safe and sound. Every class, the first thing we do is drill break falls.
I don’t think The Art of Rationality has that.
Yes, we notice the skulls. It seems like I see a new treatise pointing out the valley of bad rationality every few months. And yet...
When you share what you know, do you share safety skills and warnings with it?
Do you have a sense of how likely you are to injure yourself in your practice?
What specific actions do you take when you notice you’re taking epistemic damage?
How strong are your skills in harm-minimization? Do you have it down to an ingrained reaction or habit?
Do you practice locating your individual abilities and limits, using the distribution of expected human traits as a guide, or are you fitting your strategies to a population-level statistic?
I have some ideas.
I wanna hear yours.
I think we most certainly do. A lot of the early posts by Eliezer contain such warnings, justifiably so if you look at the comment sections of those early Overcoming Bias articles. There are a lot of warnings against using what you’ve learned as a fully general excuse in argumentation, for example.
To continue with the martial arts analogy, which is apt: people who have read the sequences and share that background knowledge are black belt rationalists. Now, the black belt isn’t an award for mastery; it’s a signal to your training partners that you have enough background that the gloves can come off without risk of anyone getting hurt. It’s when the real training begins.
LessWrong, today, is a club for black belt rationalists. We don’t need warnings and disclaimers because there is an assumed level of competence. But to someone new? We point them to the sequences and ask them to come back after they’ve absorbed that. Without that background we would absolutely need more warnings and liability waivers.
Edit: I should probably mention that I don’t think rationality is like martial arts by analogy. Rather, rationality IS a martial art. They’re both training the neural nets inside our brains to take action based on the available evidence, with an intent to win. There’s a reason Eliezer often quotes Miyamoto Musashi. The only difference is what that “winning” represents: combat vs. life in general. There’s a lot one can learn by cross-training in both arts. I was very fortunate that my instructor in the martial arts was himself an accomplished rationalist. We talked as much about heuristics and biases as we did muscles and pressure points.