Sometimes, success is the first step towards a specific kind of failure.
I have heard that the most difficult moment for a company is the moment it starts making decent money. Until then, the partners share a common dream and work together against the rest of the world. Suddenly the profit is getting close to a million, and each partner becomes convinced that he made the most important contributions, while the others did less critical things which technically could be done by employees, so having to share the whole million with them equally seems completely stupid. At this moment the company often falls apart.
When a group of people becomes very successful, fighting other people within the group can bring higher profit than cooperating against the environment. It is like playing a variant of the Prisoner’s Dilemma where the game ends at the first defection and the rewards for defection grow each turn. It’s only semi-iterated: if you cooperate, you can continue to cooperate in the next turn, but if you manage to defect successfully, there may be no revenge, because the other person will be out.
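To make that dynamic concrete, here is a minimal sketch of the semi-iterated game in Python. All the payoff numbers (`coop_payoff`, `defect_base`, `defect_growth`, `horizon`) are illustrative assumptions of mine, not anything from the discussion; the point is only that once the one-shot defection prize grows geometrically, even a patient payoff-maximizer eventually defects.

```python
# A toy model of the "semi-iterated" dilemma described above.
# Every turn of mutual cooperation pays a small fixed amount; defecting
# on turn t ends the game and pays a prize that grows each turn.
# All numbers are invented for illustration, not taken from the post.

def first_profitable_defection(coop_payoff=1.0, defect_base=5.0,
                               defect_growth=1.5, horizon=20):
    """Return the first turn where defecting beats cooperating to the end."""
    for t in range(horizon):
        remaining_cooperation = coop_payoff * (horizon - t)
        defection_prize = defect_base * defect_growth ** t
        if defection_prize > remaining_cooperation:
            return t, defection_prize
    return None, 0.0  # cooperation dominates over the whole horizon

turn, prize = first_profitable_defection()
print(f"Defection first pays off on turn {turn} (prize {prize:.2f}).")
```

With these toy numbers the crossover comes on turn 4; the faster the defection prize grows relative to the value of staying, the earlier the “company falls apart”.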
Will something like this happen to the rationalist community one day (assuming the Singularity does not happen soon)? At this moment, there are small islands of sanity in the vast oceans of irrationality. But what if some day LW-style rationality becomes popular? What are the risks of success, analogous to a successful company falling apart?
I can imagine that many charismatic leaders will try to become known as the most rational individual on the planet. (If rationality becomes 1000× more popular than it is today, imagine the possible temptations: people sending you millions of dollars to support your mission, hundreds of willing attractive poly partners, millions of fans...) There will be honest competition, which is good, but there will also be backstabbing. Some groups will experiment with mixing 99% rationality and 1% applause lights (or maybe 90% rationality and 10% applause lights), where the “applause lights” will be different for different groups; it could be religion, Marxism, feminism, libertarianism, racism, whatever. Or perhaps they will just remove the controversial parts, starting with the many-worlds interpretation. Groups which optimize for popularity could spread faster; the question is how quickly they would diverge from rationality.
Do you think an outcome like this is likely? Do you think it is good or bad? (Maybe it is better to have a million people with 90% rationality than only a thousand with 99%.) When will it happen? How could we prevent it?
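As an aside, the 90%-of-a-million versus 99%-of-a-thousand tradeoff can be made concrete with a toy calculation. The sketch below is my own illustrative model, not anything from the post: the growth and decay rates are invented, and it simply tracks members × average fidelity for a group that trades a little rationality for faster recruitment each period.

```python
# Toy growth-versus-fidelity comparison. All rates are invented
# assumptions for illustration, not claims about real communities.

def rationality_mass(members=1000.0, growth=0.05, fidelity=0.99,
                     fidelity_decay=0.0, periods=30):
    """Return members * average fidelity after `periods` of growth,
    where fidelity drifts downward as the group optimizes for popularity."""
    for _ in range(periods):
        members *= 1 + growth            # recruitment
        fidelity *= 1 - fidelity_decay   # drift toward applause lights
    return members * fidelity

strict = rationality_mass(growth=0.05, fidelity=0.99, fidelity_decay=0.0)
popular = rationality_mass(growth=0.20, fidelity=0.90, fidelity_decay=0.02)
print(f"strict: {strict:,.0f}  popular: {popular:,.0f}")
```

Under these arbitrary numbers the popularity-optimized group ends up with far more total “rationality mass” despite its decay; with a sufficiently fast fidelity decay the ordering flips, which is why the question is not rhetorical.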
People competing to be known as the most rational?
Er… what’s the downside again?
It’s much easier to signal rationality than to actually be rational.
True. It’s harder to fake rationality than it is to fake the things that matter today, however (say, piety). And given that the sanity waterline has increased enough that “rational” is one of the most desirable traits for somebody to have, fake signaling should be much harder to execute. (Somebody who views rationality as such a positive trait is likely to be trying to hone their own rationality skills, after all, and should be harder to fool than the same person without any such respect for rationality or desire to improve their own.)
Faking rationality would be rather easy: criticize everything which is not generally accepted, and always find biases in the people you disagree with (since they are human, you will always find some). When “rationality” becomes a popular word, you can gain many followers by doing this.
Here I assume that the popularity of the word “rationality” will come before there are millions of x-rationalists to provide feedback against wannabe rationalists. It would be enough if some political movement decided to use this word as their applause light.
Do you see any popular people here you’d describe as faking rationality? Do we seem to have good detectors for such behavior?
We’re a pretty good test case for whether this is viable or not, after all. (Less so for somebody co-opting words, granted...)
The community here is heavily centered around Eliezer. I guess if someone started promoting some kind of fake rationality here, sooner or later they would get into conflict with Eliezer, and then most likely lose the support of the community.
For another wannabe rationalist guru it would be better to start their own website, not interact with people on LW, and recruit somewhere else, until they have a greater user base than LW. The moment their users notice LW, all they have to do is: 1) publish a few articles about cults and mindkilling, to prime their readers, and 2) publish a critique of LW with hyperlinks to all currently existing critical sources. The proper framing would be that LW is a fringe group which uses “rationality” as applause lights but fails horribly (insert a lot of quotations and hyperlinks here), and that discussing them is really low-status.
It would help if the new rationalist website had a more professional design, and emphasised its compatibility with mainstream science, e.g. by linking to high-status scientific institutions, and sometimes writing completely uncontroversial articles about what those institutions do. In other words, the new website should be optimized to get 100% approval of the RationalWiki community. (For someone trying to do this, becoming a trusted member of RationalWiki community could be a good starting point.)
I’m busy having pretty much every function of RW come my way, in a Ponder Stibbons-like manner, so if you can tell me where the money is in this I’ll see what I can come up with. (So far I’ve started a blog with no ads. This may not be the way to fame and fortune.)
The money or lack thereof doesn’t matter, since RW is obviously not an implementation of Viliam’s proposed strategy: it fails on the ugliness with its stock MediaWiki appearance, has too broad a remit, and like El Reg it shoots itself in the foot with its oh-so-hilarious-not! sense of humor (I dislike reading it even on pages completely unrelated to LW). It may be successful in its niche, but its niche is essentially the same as /r/atheism or Richard Dawkins: mockery of the enemy leavened with some facts and references.
If (purely hypothetically speaking here, of course) one wished to discredit LW by making the respective RW article as negative as possible, I would expect it to do real damage, but not to be any sort of fatal takedown that sets a mainstream tone or gives the general population its marching orders, along the lines of Shermer’s ‘cryonics is a scam because frozen strawberries’ or Gould’s Mismeasure of Man’s ‘IQ is racist, involved researchers like Morton faked the data because they are racist, and it caused the Holocaust too’.
So … RationalWiki, then.
Accomplishment is a start. Do the claims match the observable results?
Yeah, because true rationality is going to be supporting something like cryonics that you personally believe in.
I can’t see any good general solutions. People are limited to their own judgement about whether something which purports to be selling rationality actually makes sense.
You take your chances with whether martial arts and yoga classes are useful and safe.
LW et al. does have first-mover advantage and hopefully some prestige as a result, and I’m hoping that resources for the general public will be developed here. On the other hand, taking sufficient care to develop workshops which actually work takes time, and that’s workshops for people whose intelligence level is similar to that of the people putting on the workshops.
If we assume that rationalists should win, even over fake rationalists, then maybe we should leave open the possibility that rationalists who actually find themselves competing with fake rationalists will be in a better position to find solutions, because they’ll know more than we do now.
I also don’t have a solution, besides reminding the rationalists that we run on corrupted hardware, and that the strong feeling of “these people around me are idiots, I could do it a hundred times better” is an evolutionary adaptation for situations where there are many resources and no significant external enemy. (By the way, this could explain a lot of the individualism our society has these days.) We had a few people here who got offended, e.g. by Eliezer’s certainty about quantum physics, and tried to split, and failed.
So perhaps the risk is actually small. Fake rationalists may be prone to self-sabotage. The proverbial valley of bad rationality surrounding the castle of rationality can make being a half-rationalist even worse than being a non-rationalist. So the rationalists may have a hard time fighting pure superstition, but the half-rationalists will just conveniently destroy themselves.
The first-mover advantage works best if all players are using the same strategy. But sometimes a new player can learn from the older players’ mistakes without having to pay the costs. (Google wasn’t the first search engine; Facebook wasn’t the first social network; MS Windows wasn’t the first operating system with a graphical interface.) The second player could learn from LW’s bad PR. But if profit were the main goal, it is likely that being completely irrational would be even more profitable for them.