So at last, I’m ready to explain what I think the broader nerd failure mode here is: they have a tendency to notice that people are failing to behave optimally and then propose, as a solution to this problem, that people switch to behaving optimally.
Mild objection: It depends on who is being addressed. If I’m addressing you, it makes sense to simply suggest that you behave more optimally. When I am telling you how to change the behavior of others, it’s a different story. Eliezer’s post is a direct appeal to the reader to change their voting strategy, not an instruction manual on how to change a nation’s voting structure.
1st order: Act (behave optimally yourself).
2nd order: Influence the acts of others (“Please behave optimally! You get candy if you do.”).
3rd order: Alter prevailing methods of influencing the acts of others (“This is how you lay out the incentive structure. This is the message you broadcast.”).
What you call “nerd failure mode” was simply a 2nd order action. What you are doing now is a 3rd order action. It doesn’t make sense to conceive of them in opposition (unless you think E.Y.’s solution of explaining his point of view was unhelpful in achieving his end, taking into account how little it cost).
In this specific case, if you think strategic voting is sub-optimal, you can’t go up to a voter who is voting strategically and tell them “Hey, let’s figure out why people vote strategically so we can change it,” because you haven’t yet convinced them of the premise that voting strategically is bad in the first place. A 2nd order strategy is necessary in this scenario.
So if this is in fact a failure mode, then the 3rd order phrasing of this “failure mode” is “Attempt to convince people, using reason, that there is a more optimal way to behave”.
But I don’t think that you are claiming that this is ineffective, right? Instead I gather that your point was more that strategic voting really isn’t the problem here. But if that’s the issue, then the failure wasn’t “believing that telling people to be optimal works”—rather, the failure is “being mistaken about the cause of nincompoops in office”. Which is a rather different sort of failure.
I admit I may not have phrased that the best way; I was going out of my way to make the initial description of the nerd failure mode sound superficially reasonable. While doing so, I tried to hint at a cluster of subtle mistakes without spelling them out. You write:
If I’m addressing you, it makes sense to simply suggest that you behave more optimally.
That might be true if you’re addressing me, but not true if you’re addressing someone else. One issue is that what makes sense for talking to rationalists who know how to take the kind of advice you’re giving, or are merely a short inferential distance from you, may not make sense for talking to the general public.
What I was really getting at, though, was the mistake of not realizing what the hard part of the problem is, not even asking yourself that question, and acting as if noticing sub-optimal behavior was the hard part. But since reversed stupidity is not intelligence, realizing something is sub-optimal is often not enough to identify a better alternative. And other times, identifying the better alternative is easy—so easy, in fact, that the only reason there is a problem at all is because of the difficulty of getting people to follow it.