Is the Iron Law of Oligarchy essentially Goodhart’s Law applied to humans? Like this: you want a group of humans to accomplish something useful, so you create a system to resolve conflicts, e.g. a democratic majority vote. Sooner or later, people learn to win the majority vote by optimizing for winning the majority vote, without accomplishing much of what you originally wanted them to do. And if you try to fix this by adding some safety mechanism X to the democratic vote, people will simply optimize for the majority vote plus X. For example, alongside elected politicians known to optimize for popularity, you add unelected bureaucrats who are supposed to be the experts, but somehow those just entrench themselves in the bureaucratic system regardless of their level of expertise.
If so, then there is essentially no safe way to solve this. If we measure something, Goodhart’s Law attacks. If we don’t measure it, then… well, just because you are not looking at something doesn’t mean it isn’t there. In the absence of explicit rules, the implicit rules will decide; the most popular people will simply be the most popular people.
All we can do is use some heuristics, and remember the nameless virtue; i.e. to change or abandon the heuristics when they stop being reasonable. We must keep thinking and updating, thinking and updating, again and again.
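The Goodhart dynamic above can be made concrete with a toy model. This is my own sketch, not anything from the thread: every contributor splits a fixed effort budget between doing real work and gaming the measurement, and the payoff functions are invented for illustration. The point is only that once the proxy rewards gaming at all, selecting on the proxy selects for the wrong thing.

```python
# Toy Goodhart's Law sketch (hypothetical payoff functions):
# each contributor splits a unit effort budget between real work
# and gaming the metric.

def usefulness(work, gaming):
    return work                       # true value comes only from real work

def proxy_score(work, gaming):
    return 0.5 * work + 1.0 * gaming  # the measurement rewards gaming more

# Gaming fraction ranges from 0 (pure work) to 1 (pure metric-chasing).
gaming_fractions = [i / 100 for i in range(101)]

top_by_proxy = max(gaming_fractions, key=lambda g: proxy_score(1 - g, g))
top_by_truth = max(gaming_fractions, key=lambda g: usefulness(1 - g, g))

print(top_by_proxy)  # 1.0 -- the proxy crowns the pure metric-chaser
print(top_by_truth)  # 0.0 -- the true measure crowns the pure worker
```

Adding a correction term X to the proxy just changes which mixture of gaming maximizes it; as long as the proxy and the true value diverge anywhere, optimizers will find that gap.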
Specifically, I have already noticed Goodhart’s Law in action on StackExchange. Instead of helping other people, it’s more and more about getting more points than other contributors. For example, you start writing your answer before you have even thought it through completely, because posting a partial answer and editing it later beats thinking it through, posting it, and finding that someone else posted a very similar answer one minute sooner. So it becomes a “use google, post the first information found, use google more, edit your answer to include the additional information” cycle. And if the question cannot be solved by googling, nominate it for deletion, as off-topic or something. If no one can answer the question fully, then criticize other people’s partial answers as incomplete and downvote them; if you couldn’t get points with your strategy, they don’t deserve any either.
Improving measurements is one of the boring but massive levers we have at our disposal, e.g. GiveWell, the technical details of how voting schemes capture preferences, etc.
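On the voting-schemes point: different rules can extract different winners from the exact same preferences. A minimal sketch, using a classic-style hypothetical profile I made up for illustration (candidates A, B, C are not from the thread):

```python
from collections import Counter

# Hypothetical preference profile: each tuple is a strict ranking
# (best first) plus the number of voters who hold it.
ballots = [
    (("A", "B", "C"), 4),
    (("B", "C", "A"), 3),
    (("C", "B", "A"), 2),
]

def plurality(ballots):
    tally = Counter()
    for ranking, n in ballots:
        tally[ranking[0]] += n  # only the top choice counts
    return tally.most_common(1)[0][0]

def borda(ballots):
    tally = Counter()
    for ranking, n in ballots:
        m = len(ranking)
        for place, cand in enumerate(ranking):
            tally[cand] += n * (m - 1 - place)  # points: 2, 1, 0
    return tally.most_common(1)[0][0]

print(plurality(ballots))  # A -- most first-place votes
print(borda(ballots))      # B -- broader support once full rankings count
```

Here plurality elects A even though a majority (5 of 9 voters) ranks A last, while Borda elects B. Which measurement you choose *is* the outcome, which is exactly why the technical details are a lever.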
Stack Overflow seems to keep growing despite the issues that Michael brings up. Yes, it’s not the place it was three years ago, but it’s okay that communities change.
This write-up: http://michael.richter.name/blogs/why-i-no-longer-contribute-to-stackoverflow/
And also one of the main issues he discusses:
http://en.wikipedia.org/wiki/Iron_law_of_oligarchy
seem relevant to LessWrong as well, to some degree. How can we avoid the problem of ‘Creeping Authoritarianism’?