The reputational damage to Less Wrong has been done. Is there really anything to be gained by flipping moderation policy?
There’s now the impression that a community of aspiring rationalists — or, at least, its de-facto leaders — is experiencing an ongoing lack of clue about how censorship actually affects online PR.
The “reputational damage” is not just “Eliezer or LW have this kooky idea.”
It is “… and they think there is something to be gained by shutting down discussion of this kooky idea, when others’ experience (Streisand Effect, DeCSS, etc.) and their own (this very thread) are strong evidence to the contrary.”
It is the apparent failure to update — or to engage with widely-recognized reality at all — that is the larger reputational damage.
It is, for that matter, the apparent failure to realize that saying “Don’t talk about this because it is bad PR” is itself horrible PR.
The idea that LW or its leadership dedicate nontrivial attention to encircling and defending against this kooky idea makes it appear that the idea is central to LW. Some folks on the thread on Stross’s forum seem to think that Roko discovered the hidden secret motivating MIRI! That’s bogus … but there’s a whole trope of “cults” suppressing knowledge of their secret teachings; someone who’s pattern-matched LW or transhumanism onto “cult” will predictably jump right there.
At this point, let’s not taunt people with the right kind of mental pathology to be made very uncomfortable by the basilisk or meta-set of basilisks.
My own take on the whole subject is that basilisk-fear is a humongous case of privileging the hypothesis coupled to an anxiety loop. But … I’m rather prone to anxiety loops myself, albeit over matters a little more personal and less abstract. The reason not to poke people with Roko’s basilisk is that doing so is a form of aggression — it makes (some) people unhappy.
But as far as I can tell, it’s no worse in that regard than a typical Iain M. Banks novel, or some of Stross’s own ideas for that matter … which are considered entertainment. Which means … humans eat “basilisks” like this for dessert. In one of Banks’s novels, multiple galactic civilizations invent uploading, and use it to implement their religions’ visions of Hell, to punish the dead and provide an incentive to the living to conform to moral standards.
(But then, I read Stross and Banks. I don’t watch gore-filled horror movies, though, and I would consider someone forcing me to watch such a movie to be committing aggression against me. So I empathize with those who are actually distressed by the basilisk idea, or the “basilisk” idea for that matter.)
I have to say, I find myself feeling worse for Eliezer than for anyone else in this whole affair. Whatever else may be going on here, having one’s work cruelly mischaracterized and held up to ridicule is a whole bunch of no fun.
having one’s work cruelly mischaracterized and held up to ridicule is a whole bunch of no fun.
Thank you for appreciating this. I expected it before I got started on my life, I’m already accustomed to it by now, I’m sure it doesn’t compare to the pain of starving to death. Since I’m not in any real trouble, I don’t intend to angst about it.
Glad to hear it.