It’s hard to polish a turd. And I think all the people who have responded by saying that Eliezer’s PR needs to be better are suggesting that he polish a turd. The basilisk, and the way the basilisk was treated, have implications for LW that are inherently negative, to the point where no amount of PR can fix it. The only way to fix it is for LW to treat the Basilisk differently.
I think that if Eliezer were to
(1) allow free discussion of the basilisk, and
(2) deny that the basilisk or anything like it could actually put one in danger from advanced future intelligences,
people would stop seeing the basilisk as reflecting badly on LW. It might take some time to fade, but it would eventually go away. But Eliezer can’t do that, because he does think that basilisk-like ideas can be dangerous, and this belief of his is feeding his inability to really deny the Basilisk.
And (3) explain why other potential info hazards, not the basilisk itself but very different configurations of acausal negotiation (ones that either have not yet been discovered, or were discovered but not made public), should not be discussed.
The basilisk, and the way the basilisk was treated, have implications for LW that are inherently negative, to the point where no amount of PR can fix it.
This is true; nevertheless, good PR should still make things as least-bad as possible. And indeed, you go on to make a suggestion as to how to do that (not even a bad one, in my opinion).
But Eliezer can’t do that, because he does think that basilisk-like ideas can be dangerous, and this belief of his is feeding his inability to really deny the Basilisk.
In other words, he disagrees with you and that is preventing him from agreeing with you.
Yes, except that agreeing with me is what a lot of people take Eliezer to be saying. There’s this widespread belief that Eliezer just denied the Basilisk. And that’s not really true; he denied the exact version of the Basilisk that was causing trouble, but he accepts the Basilisk in principle.
Eliezer has done (2) many times.
Doing 2 without doing 1 looks insincere.
This post is still here, isn’t it?
If I remember right, earlier this year a few posts did disappear.
I’m also not aware of any explicit withdrawal of the previous policy.
We conclude that free discussion is now allowed, so maybe all that’s really missing is putting that up explicitly somewhere that can be linked to?
Not especially. This post is still here because I’m feeling too lethargic to delete it, but the /r/xkcd moderator deleted most of the basilisk discussion on their recent thread because it violated their Rule 3, “Be Nice”. This is a fine upstanding policy, and I fully agree with it. If there’s one thing we can deduce about the motives of future superintelligences, it’s that they simulate people who talk about Roko’s Basilisk and condemn them to an eternity of forum posts about Roko’s Basilisk. So far as official policy goes, go talk about it somewhere else. But in this special case I won’t ban any RB discussion such that /r/xkcd would allow it to occur there. Sounds fair to me.
Are you implying that the basilisk discussion is somehow censored on this forum?
It doesn’t appear to be censored in this thread, but it was historically censored on LessWrong. Maybe EY finally understood the Streisand effect.
He might do it less for the “danger” and more for “bad discussion”. The threads I see on /sci/ raising questions about high IQ come to mind.
Well, most threads I see on /sci/ come to mind.
I don’t read /sci/, so I don’t know what you mean.
Do you know of it?
No, I’ve just found out that it is a board on 4chan.
Typical low-moderation problems. Repeated discussions of contentious but played-out issues like religion, IQ, status of various fields, etc. The basilisk is an infohazard in that sense at this point, IMO. It’s fun to argue about, to the point of displacing other worthwhile discussion.
LessWrong also has low moderation. Why would the basilisk generate more trivial discussion than other topics?
Eliezer has denied that the exact Basilisk scenario is a danger, but not that anything like it can be a danger. He seems to think that discussing acausal trade with future AIs can be dangerous enough that we shouldn’t talk about the details.