I’m afraid the evolution analogy isn’t as convincing an argument for everyone as Eliezer seems to think. For me, for instance, it’s quite persuasive because evolution has long been a central part of my world model. However, I’m aware that for most “normal people” this isn’t the case; evolution is a kind of dormant knowledge for them, not part of the lens through which they see the world. I think this is why they can’t intuitively grasp, the way most rationalist and rationalist-adjacent people do, how powerful optimization processes (like gradient descent or evolution) can lead to mesa-optimization, and what the consequences of that might be: the inferential distance is simply too large.
I think Eliezer has made great strides recently in appealing to a broader audience. But if we want to convince more people, we need to find rhetorical tools other than the evolution analogy and assume less scientific intuition.