I haven’t followed the whole thing, because I couldn’t. How can I decide whether he is right or not? I don’t know what was censored, or why. The thread on academic careers just had some big holes where, presumably, things had been deleted, and I couldn’t reconstruct why.
Other forums have some kind of policy where they explicitly say what kinds of posts will be censored. I’m not against censoring stuff, but knowing what is considered worth censoring and what isn’t would be nice.
With the knowledge I currently have about this whole thing, I still feel slightly sympathetic to waitingforgodel’s cause. The “Free Speech is important” heuristic that Vladimir Nesov mentioned in the other thread is pretty useful, in my opinion, and without knowing the reason for the posts being deleted, I can’t decide for myself whether it made sense or not.
I intend to stick around anyway, since I don’t feel very strongly about this issue, so I hope I won’t frustrate anybody. But an answer would still be nice.
I do know what was censored and why, and I think Eliezer was wrong to delete the material in question.
That’s a separate issue from whether waitingforgodel’s method of expressing his (correct) disagreement with the censorship is sane or reasonable—of course it isn’t.
Though, I can see a strong argument for “blow up whenever your rights are threatened,” especially if you expect that you will only be able to raise awareness, not effect change. It also means those of us who internalized the sequences have our evaporative cooling alarms triggering. Is disagreeing with the existence of Langford basilisks, and caring enough to make a stink about it instead of just scoff, really enough to show someone the door?
It’s true that the basilisk in question is a wild fantasy even by Singularitarian standards, and the fact that people took it seriously enough to get upset about it could well be considered cause for alarm.
But that’s not why people are telling waitingforgodel they’d rather he left. People are telling him that because he took action he sincerely (perhaps wrongly, but sincerely) believed would reduce humanity’s chances of survival. That’s a lot crazier than believing in basilisks!
And the pity is, it’s not true he couldn’t effect change. The right thing to do in a scenario like this is propose reasonable compromises (like the idea of rot13’ing posts on topics people find upsetting) and if those fail then, with the moral high ground under your feet, find or create an alternative site for discussion of the banned topics. Not only would that be morally better than this nutty blackmail scheme, it would also be more effective.
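As a rough illustration of the compromise being proposed (sketched in Python, since the thread itself names no language), rot13 is a trivially reversible letter substitution: the text stays one deliberate step away for anyone who wants to read it, but it can’t be read by accident.

```python
import codecs

def rot13(text: str) -> str:
    """Shift each ASCII letter 13 places; applying it twice restores the original."""
    return codecs.encode(text, "rot_13")

spoiler = rot13("the basilisk post")
print(spoiler)          # gur onfvyvfx cbfg  (unreadable at a glance)
print(rot13(spoiler))   # the basilisk post  (rot13 is its own inverse)
```

Because encoding and decoding are the same operation, readers opt in by running the text through rot13 themselves, which is exactly why it works as a spoiler convention rather than as real secrecy.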
This is a great example of the general rule that if you think you need to do something crazy or evil for the greater good, you are probably wrong—keep looking for a better solution instead.
I am not entirely clear on the timeline (I haven’t researched his precommitment and whether or not EY saw it), but at some point EY commented in his Mod Voice that undeleting comments was a bannable offense, and that is the part where most people seem to agree that wfg went crazy.
So it’s not “wow, you’re murdering people to make a point?” that started people saying “maybe you ought not be here,” but it certainly is what made that idea catch on.
I agree with the desirability of this hypothetical. I have no data on the probability of this hypothetical.
No, WFG committed to that before I said anything in Mod Voice.
Clarification: I meant that his response to the Mod Voice comment was where he started losing supporters. (For example, here.)
My main problem is just that he’s being a bit of a dick, and this is bad in social spaces.
No. Threatening to kill 6790 people and then claiming to have actually gone through with it, however, is.
By my math it’s an existential risk reduction. Your point was talked about already in the “precommitment” post linked to from this article.
Your math is wrong. It was always wrong, and it is even more wrong now that it is clear that you are failing to influence Eliezer’s behavior (for which I am thankful).
Why not share ‘the Basilisk’ with more people every time EY censors a post instead of raising existential risk?
Is this comment the forum’s first meta-basilisk?
Reminder: Exposure to the basilisk can cause and has caused immediate severe mental torment to people with OCD or strong OCD tendencies. Again, this has already happened (at least two reports that I know of). So that’s like posting a video that gives vulnerable people epileptic fits, like that infamous Pokemon episode.
“Please remember”, he said in a dryly sarcastic voice, “that not everyone’s mind is an invincible fortress like yours.”
The appropriate way of dealing with this issue is by posting some kind of trigger warning or posting in rot13. And then recommending therapy. Not censorship.
Sorry, I think this is lost on me, why did you post this in reply to my comment?
What I want to know is why Eliezer is still advertising the fact that members of SIAI are psychologically incapable of even considering the kinds of issues that come along with thinking about singularities. I can kind of understand him letting that slip in the heat of the moment, in the throes of his emotional outburst, but why is he still saying it now? Why wouldn’t he be trying to do whatever he can to convey that the important thing in his mind is the deep game-theoretic issue that nobody else is sophisticated enough to understand?
Sure, even if someone at the SIAI has a disability in one area they could well make valuable contributions in another. But that doesn’t make it something to boast about publicly without taking care to emphasise that not everyone else is so crippled.
If you are vulnerable to epileptic fits don’t work in a pokemon factory—even if your factory only creates ‘good’ pokemon!
wedrifid
My interpretation (which Eliezer’s above comment seems to have confirmed) was that Eliezer deleted Roko’s comment for the exact same reason he would have deleted an epileptic-fit-inducing animation: simply to protect some of the readers, many of whom might not even be aware of their own vulnerability, for this is not exactly a commonly triggered or recognized weakness.
I felt all the rest about ‘existential risk’ and ‘suppressed ideas’ was just added by people in the absence of real information. Like, someone saw ‘existential risk’ near (or in?) Roko’s comment, heard that Eliezer is worried about ‘existential risks’, and concluded that must have been the reason the post was deleted. This sort of thing tends to happen, especially with people who were already critical, such as timtyler, who was taking potshots at Eliezer and the SIAI even before Roko’s post was deleted (top 2 comments). (Yes, I mention timtyler because I know his opinion could have affected yours.)
My big problem with this theory is that it requires you to have been making a basic mistake, which is always suspect, since you have shown yourself to be a smart and competent poster. (That some other posters, such as WFG, were foolish is a given, I’m afraid.) So the simplest way to resolve my confusion is to ask you directly, hence this comment.
Why do you dismiss the above interpretation? What do you see that I don’t?
Yes, whole rafts of stuff are being made-up here.
Since you have already replied to the grandparent with a partial affirmation could you please confirm or (I hope) deny the primary contention of said comment?
That is another idiot ball which I have assumed you are not guilty of bearing. But if you are giving support to a comment which presents such an interpretation it warrants clarification.
Depends what you mean by “exact same”. I deleted the basilisk strictly to protect readers, yes. I didn’t realize at the time that there was also an immediate damage mode for unusually vulnerable readers.