I have felt for a long time that LW is short of discussion of what you might call “collective rationality”: the art of effective collaborative truth-seeking. Of course LW is itself an attempt at collective rationality; but most of us, much of the time, are engaged in activities that (1) involve multiple people, (2) would benefit from better truth-finding techniques, and (3) are not Less Wrong.
It seems to me that industrial organization and industrial psychology have put quite a bit of effort into asking how to get committees and groups to think together effectively. Perhaps someone could do a literature survey / find some good books to review for LW?
If Mercier and Sperber’s theory is correct, people are already optimized for arguing things out in groups, which would mean that rationality training is really solo rationality training, and perhaps not that useful for many people.
Not really, no. People are optimized for winning arguments against untrained humans. The point of group rationality training is figuring out what norms / individual training / etc. make it so that the best ideas (by some external metric) are most likely to win in a group discussion, rather than the best-championed ideas. Even if, say, I can identify why someone’s argument is not helping push towards truth, there needs to be a group norm under which I can call them out on that, and the call-out needs to be effective. (Think of “Objection!” or pointing out fallacies in debate club; both of those rest on common acceptance of what things are worth objecting to or calling fallacious.)
The average person isn’t as well optimized for group debate as the best debaters are, but people are still better optimized for group debate than for individual pondering.
Definitely agreed.
In particular, it seems like it is a remarkably unexamined, unplanned attempt. Surely we’ve learned some ways to improve it. Surely there are better approaches out there than “hey, Reddit seems to work ok, let’s modify a couple things, call it good, and leave it alone for a while”.
Not that I know how to improve it. Predictably, I have a few complaints and a few minor tweaks to suggest, but I’d really prefer a more evidence-based approach than that. Actually, I don’t even really know what process I would advocate for improving LW, let alone what the actual improvements would be that would come from that process.
As far as I can see, we do have plenty of meta discussion examining LW.
There is plenty of talk, less data, and only a very small number of tested changes. Surely the rationalist approach to solving a problem like this should involve empirical examination, not just armchair discussion.
LW isn’t very big, so it’s not clear whether there are strong returns to experimenting with software changes.
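To make “tested changes” concrete: here is a minimal sketch of how one might evaluate a single site change with an A/B test. The metric (fraction of visitors who post a comment) and all of the numbers are hypothetical, and this uses only a plain two-proportion z-test; a real evaluation would need to pick the metric and sample sizes up front.

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two proportions.

    E.g. fraction of visitors who post a comment under the old
    vs. the new site design (all numbers here are hypothetical).
    """
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pooled proportion under the null hypothesis of no difference.
    p = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical A/B test: 120 of 2000 visitors commented on the old
# design, 165 of 2000 on the new one.
z, p = two_proportion_z(120, 2000, 165, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The point of the sketch is only that a concrete metric plus a pre-chosen test turns “does this change help?” from an armchair question into a measurable one; with a small site, the required sample sizes may themselves be the binding constraint, which is the concern raised in the reply above.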