His metaphors serve as useful intuition pumps for ideas that are highly abstract and far removed from what the average layperson has thought about before. When I started reading about AI risk, many of the ideas I was introduced to, like optimization and orthogonality, were completely novel, and I had a hard time understanding them. The various metaphors, analogies, and parables within the AI-risk discussion were of significant benefit to me in building the intuitions needed to understand the problem.
Thank you for the perspective. I mostly agree with your points, but I still feel the abstract metaphors unnecessarily detract from the core arguments. While this style may help more people grasp that "this group," represented by Yudkowsky, sees AGI as an existential threat, I believe the excessive use of metaphor weakens the clarity and force of the underlying reasoning.
To draw an analogy :-) it feels a bit like teaching Sunday school Christianity to adults: offering simplified narratives while obscuring the more nuanced structures behind the beliefs. It might serve as a useful entry point for some, but for many, it comes across as unrealistic and simply too far removed from their lived experience to be taken seriously. (For the record, I'm not a Christian.)
That said, I do see your point, and I may be mistaken in my intuition. My observation is anecdotal, and you’ve just provided a counterexample to the ones I’ve encountered.