I’ll go ahead and reveal my answer now: Robin Hanson was correct, I do think that TORTURE is the obvious option, and I think the main instinct behind SPECKS is scope insensitivity. -- Eliezer Yudkowsky
Uhm, it’s seriously egregious and needlessly harmful to suggest that SIAI supporters should maybe be engaging in terrorism. Seriously. I agree with Yvain. The example is poor and meant to be inflammatory, not to facilitate reasonable debate about what you think utilitarianism means.
Would you please rewrite it with a different example so this doesn’t just dissolve into a meaningless debate about x-risk and x-rationality, where half of your audience is already offended by what they believe to be a bad example and a flawed understanding of utilitarianism?
A lot of the comments on this post were really confusing until I got to this one.
I should make it explicit that the original post didn’t advocate terrorism in any way but was a hypothetical reductio ad absurdum against utilitarianism that was obviously meant for philosophical consideration only.
It was nothing as simple as a philosophical argument against anything.
It is a line of reasoning, working from premises that seem to be widely held, that I am unsure how to integrate into my worldview in a way that I (or most people?) would be comfortable with.
I don’t believe that you are honest in what you write here. If you would really vote against the bombing of Skynet before it tiles the universe with paperclips, then I don’t think you actually believe most of what is written on LW.
Terrorism is just a word to discredit acts that are deemed bad by those that oppose it.
If I were really sure that Al Qaeda was going to set free some superbug bioweapon stored in a school, and there was no other way to stop them from killing millions, then I would advocate using incendiary bombs on the school to destroy the weapons. I accept the position that killing even one person can’t be a means to an end, even to save the whole world, but I don’t see how that fits with what is believed in this community. See Torture vs. Dust Specks (The obvious answer is TORTURE, Robin Hanson).
You missed the point. He said it was bad to talk about, not that he agreed or disagreed with any particular statement.
Hush, hush! Of course I know it is bad to talk about it in this way. Same with what Roko wrote. The number of things we shouldn’t talk about, even though they are completely rational, seems to be rising. I just don’t have the list of forbidden topics at hand right now.
I don’t think this is a solution. You’d better come up with some story about why you people don’t think killing to prevent Skynet would be wrong, because the idea of AI going FOOM is quickly going mainstream, and people will draw this conclusion and act on it. Or you stand by what you believe and try to explain why it wouldn’t be terrorism but a far-seeing act to slow down AI research, or at least to watch over it and take out any dangerous research until FAI is guaranteed.
Done. The numbers don’t really make sense in this version, though…
Thanks. The slightly less sensible numbers might deaden the point of your argument a little bit, but I think the quality of discussion will be higher.
Somehow I doubt there will be much discussion, high quality or low :) It seems like it has gone below the threshold to be seen in the discussion section. It is −3 in case you are wondering.