@PhilGoetz’s Reason as memetic immune disorder seems relevant here. It has been noted many times that engineers are disproportionately involved in terrorism, in ways that the mere usefulness of their engineering skills can’t explain.
Teaching rationality the shallow way—nope; knowing about biases can hurt people
Teaching rationality the deep way—nope; reason as a memetic immune disorder
:(
Perhaps there should be some “pre-rationality” lessons. Something stabilizing you need to learn first, so that learning about rationality does not make you crazy.
There are some materials that already seem to point in that direction: adding up to normality, ethical injunctions. Perhaps the CFAR workshops should start by focusing on these things in a serious way (like, spend at least one full day on just this, check that the participants actually understood the lesson, and maybe kick out those who didn’t?).
Because, although some people get damaged by learning about rationality, it seems to me that many people don’t (some only because they don’t change in any significant way, but some because they internalize the lessons in a good way). If we could predict who would end up which way, we could reduce the damage while still delivering the value.
Of course, this only applies to the workshops; online communication is a different question. But it seems to me that the bad things mostly happen offline.