I remember one conversation at a LessWrong community weekend where I made a contrarian argument. The other person responded with something like “I don’t know the subject matter well enough to judge your arguments, so I’d rather stick with the status quo belief. The topic isn’t relevant enough for me to invest time into it.”
That’s the kind of answer you can get when speaking with rationalists but rarely get when talking to non-rationalists. That person wasn’t “glad to learn that they were wrong,” but they were far from irrational. They had a model of their own beliefs, and of when it makes sense to change them, that was the result of reasoning in a way that non-rationalists don’t tend to do.
Adam sounds to me naive about what goes into actually changing your mind. He seems to treat “learning that you were wrong” as a goal in itself. The person in the example above didn’t have the goal of developing a sophisticated understanding of the domain I was talking about, and that was probably completely in line with their utility function.
When it comes to issues where it’s actually important to change your mind, things are complex in another way. Someone might give you a convincing rational argument, but in the back of your mind there’s a part of you that feels wary. While you could ignore that part and simply update your belief, it’s not clear that doing so is always the best idea.
There are a few people who were faced with pretty convincing arguments about the central importance of AI safety and the need to do everything they can to fight for it. A year later, they burned out because they invested all their energy into AI safety. They ignored a part of themselves, and their ability to change their mind turned to their detriment. A lot of what CFAR did with Focusing and internal double crux is about listening to more internal information instead of suppressing it.
Another problem with teaching rationality is that even if someone does the right thing 99% of the time, doing the wrong thing in the 1% of cases that actually matter still results in failure. Just because someone can perform in the dojo where they train katas doesn’t mean they can perform when it actually counts.
Julia Galef offered the scout vs. soldier mindset as one alternative to the paradigm of teaching individual skills. The idea is that the problem often isn’t that people lack the skills, but that they are in soldier mindset and thus don’t use the skills they have.