That’s a good lesson to internalize, but how do you get someone to internalize it? How do you explain it (in five minutes or less) in such a way that someone can actually use it?
I’m not saying that there’s no easy way to explain it; I just don’t know what that way would be. When I argue with someone who acts like their intuitions are magic, I usually go back to basic epistemology: define concisely what it means to be right about whatever we’re discussing, and show that their intuitions here aren’t magic. If there’s a simple way to explain in general that intuition isn’t magic, I’d really love to hear it. Any ideas?
Given that we haven’t constructed a decent AI, and don’t know how those intuitions actually work, we only really believe they’re not magic on the grounds that we don’t believe in magic generally, and don’t see any reason why intuitions should be an exception to the rule that all things can be explained.
Perhaps an easier lesson is that intuitions can sometimes be wrong, and it’s useful to know when that happens so we can correct for it. For example, most people are intuitively much more afraid of dying in dramatic and unusual ways (like air crashes or psychotic killers) than in more mundane ways like driving a car or eating unhealthy foods. Once it’s established that intuitions are sometimes wrong, the fact that we don’t exactly know how they work isn’t so dangerous to one’s thinking.
Well, I thought Kaj_Sotala’s explanation was good, but the five-minute constraint makes things very difficult. I tend to be so long-winded that I’m not sure I could get across any insight in five minutes, honestly, but you’re right that “Your intuitions are not magic” is likely to be harder than many.