(Blinks.)
I wonder if this idea comes as a shock because everyone was planning on becoming rationality instructors, i.e., I should have warned everyone about this much earlier?
Is it off-putting on some other level?
But I must also consider that it might really be that stupid. Damn, now I wish I knew the actual number of upvotes and downvotes!
I don’t think too many people are actually considering “rationality instructor” as a career path at this point—which reminds me—what exactly are your plans for this rationality dojo thing anyway? Is it just something you like to talk about, or something you plan to one day set up? Are you hoping people from Less Wrong will start the first ones, or that people from Less Wrong will be students in ones set up in some other way?
When you (or anyone) says “rationality dojo”, how literally is it meant? Is it specifically a physical meeting, rather than a web community? More literally, is this meant as a meeting of equals or an instructor with pupils? How much of the formalism of the dojo would you import? How would you change the relationship of sensei and pupil? I’m not so sure about wearing robes, and I draw the line at getting thwacked on the head with a stick.
I am keen to increase the number of rational people, but there are a great many means by which a thing passes from mind to mind, and I’m not sure a dojo would be the first model I’d reach for—have I missed a post where this model is set out in more detail?
I don’t think I can afford to divert my attention into setting one up, but I’ve heard others already discussing it, so it’s worth placing some Go stones around it.
Really? If it’s not too private, who’s been discussing it?
I don’t know if I’m part of who Eliezer heard, but I’m planning on trying to start a rationality training group on Saturdays in the SF bay area, for middle and high school students with exceptional mathematical ability. I want to create a community that thinks about thinking, considers which kinds of thinking work for particular tasks (e.g., scientific progress; making friends), and learns to think in ways that work. The reason I’m focusing on kids with exceptional mathematical ability is that I’m hoping some of them will go on to do the kind of careful science humanity needs, with the rationality to see what actually helps. The aim is not so much to teach rationality knowledge, since AFAICT the “art of human rationality” is mostly a network of plausible guesswork at this point, but to get people aiming, experimenting, measuring, and practicing in a community, sharing results, trying to figure out what works and actually trying the best ideas (real practice; community resistance to akrasia). With some mundane math teaching mixed in.
As to “day job” credentials, I’ve had unusual success teaching mathematical thinking (does this count as “day job”? at least math teaching success is measurable by, say, the students’ performance on calculus exams), bachelor’s degrees in math and “great books”, and two or three years’ experience doing scientific research in various contexts. I don’t know if this would put me above or below Eliezer’s suggested bar to a stranger.
How has this project been going?
You’re focusing on easy-to-verify credentials of the sort you’d list on a resume to be hired by some skeptical HR person. You have a secret identity.
My secret identity just says that some combination of you and Michael Vassar thought I was worth taking a chance on. I was trying to do some analog of cross-validation, where we ask whether someone who was basically following your procedure but who didn’t know me or have particular faith in your or Michael Vassar’s judgment, would think it okay for me to try teaching. I was figuring that your focus on day job impressiveness was an attempt to get away from handed-down lineages of legitimacy / “true to the Real Teachings”ness, which Objectivism or martial arts traditions or religions sometimes degenerate into.
More of an attempt to make sure that people write instead of just doing literary criticism.
Got it. Sorry; I think I rounded you to the nearest cliche, maybe because of the emotional reaction you suggested some of us might be having.
FWIW, part of my own emotional reaction to it did come from that, though I noticed it and have tagged that reaction as an emotionally contaminated thing to be wary of.
I had hoped to become a rationality instructor of some stripe, but with an apprentice period as an experimental physicist, in order to give concreteness to my teaching.
So, no particular degree of shock here.
Speaking only for myself—I am here, consciously and explicitly, to learn rationality for its own benefits. I have no overwhelming interest in teaching others and, all else equal, have other things I would prefer to be doing with my life.
I didn’t vote either way on the post because I am ambivalent about it. It felt underdeveloped compared to your usual material, and to some extent it seems like you’re getting ahead of yourself on this “teaching rationality” thing—the current understanding of applied rationality in this community doesn’t seem to justify raising the concern yet.
Perhaps the idea would have been better presented in the context of one of your parables/short stories/&c.?
Thales, so the story goes, because of his poverty was taunted with the uselessness of philosophy; but from his knowledge of astronomy he had observed while it was still winter that there was going to be a large crop of olives, so he raised a small sum of money and paid round deposits for the whole of the olive-presses in Miletus and Chios, which he hired at a low rent as nobody was running him up; and when the season arrived, there was a sudden demand for a number of presses at the same time, and by letting them out on what terms he liked he realized a large sum of money, so proving that it is easy for philosophers to be rich if they choose, but this is not what they care about.
Huh? There is no way that knowledge of astronomy could possibly have told him about the olive crop. It seems more likely that his useful knowledge was of economics and business, but that he made up a story about astronomy to impress his peers.
This is a good example of what I meant over in the evolutionary psychology thread; coming up with evolutionary psychology explanations is good practice for avoiding succumbing to ‘arguments from incredulity’, as I like to call this sort of comment.
“Oh, I couldn’t think of how astronomy could possibly be useful in weather or crop forecasting, so I’ll just assume the stories about Thales are a lie.”
I’ll leave this here for you.
“Forecasting Andean rainfall and crop yield from the influence of El Niño on Pleiades visibility”, Nature 403, 68-71 (6 January 2000):
“Farmers in drought-prone regions of Andean South America have historically made observations of changes in the apparent brightness of stars in the Pleiades around the time of the southern winter solstice in order to forecast interannual variations in summer rainfall and in autumn harvests. They moderate the effect of reduced rainfall by adjusting the planting dates of potatoes, their most important crop. Here we use data on cloud cover and water vapour from satellite imagery, agronomic data from the Andean altiplano and an index of El Niño variability to analyse this forecasting method. We find that poor visibility of the Pleiades in June—caused by an increase in subvisual high cirrus clouds—is indicative of an El Niño year, which is usually linked to reduced rainfall during the growing season several months later. Our results suggest that this centuries-old method of seasonal rainfall forecasting may be based on a simple indicator of El Niño variability.”
I read it as an injunction to focus on fixing my own rationality and making best use of it, and not to think about how to help other people be more rational. That runs entirely contrary to my own hopes for making the world a better place. If all you mean is “spread rationality, but keep the day job” then absolutely, I’m keeping the day job, it pays better.
The idea of a rationality pressure group has crossed my mind, but if I were to work for such a thing it would not be in the role of instructor, and I could probably do more for such an organisation by keeping the day job and giving it money in any case.
It’s an idea that is common among writers (with respect to writing instructors). Not the secret identity part, though.
Eliezer’s idea is a bit different, because success in any area of life should indicate rationality.
I don’t understand the secret identity part. If one identity is secret, how are students supposed to know whether to respect the instructor for accomplishments under his/her non-instructor identity?
(If you’re a rationality instructor or practitioner, having a secret identity is probably a good idea anyway, so you’re not the first against the wall when the religious-Luddite anti-transhuman pogrom begins.)
He’s joking about the secret part—think “day job”.
The idea never occurred to me—not when I was sincerely involved in martial arts, and not since becoming sincerely dedicated to rationality. I’d be quite surprised if it has occurred to more than a few people here.
Perhaps few readers are thinking about becoming rationality instructors, so they feel it doesn’t apply to them. That would likely diminish their estimation of its importance.