That bit reads to me as just a heading of one section of the article—a paragraph later it lays out the argument which is described as being “pretty air-tight”. Which argument does assume one has a particular kind of ethical system, but that’s not really the same thing as making the confusion you describe, especially when it’s an ethical system shared and trumpeted by many in the community.
Which argument does assume one has a particular kind of ethical system, but that’s not really the same thing as making the confusion you describe
Under this logic I can easily say “the intellectual case for killing infidels is pretty air-tight” or “the intellectual case for torturing suspects is pretty air-tight” because hey, we abstracted the values away!
Well, yeah, if you have an essay about infidel-killing, having the subheading for the part where you lay out the case for doing so describe said case as “pretty air-tight” isn’t exactly a heinous offence.
And you’re kind of skipping over considerations of what values Less Wrong tends to have. There’s a lot of effective altruism material, and members of the community are disproportionately consequentialist. Are you expecting little asides throughout the article saying “of course, this doesn’t apply to the 10% of you who are egoists”?
describe said case as “pretty air-tight” isn’t exactly a heinous offence
The question isn’t about the offence; it’s whether you would agree with this thesis in the context of an essay about Islamic jihad.
There’s a lot of effective altruism material, members of the community are disproportionately consequentialist
Neither of these leads to vegetarianism. Consequentialism has nothing to do with it and EA means being rational (=effective) about helping others, but it certainly doesn’t tell you how wide the circle of those you should help must be.
I accept that neither of the things I listed logically leads to accepting the value claim made in the argument (though the effective altruism movement generally assumes one’s circle is at least as wide as “all humans”, considering the emphasis on charities working a continent away). But I still feel quite confident that LessWrongers are likely, and more likely than the general population, to accept said value claim. Unless you want to argue about expected values, the assumption made seems to be “the width of the reader’s circle extends to all (meaningfully) sentient beings”, which is probably a lot more likely in a community like ours that reads a lot of sci-fi.
Oh, sure, the surveys will tell you so directly.
But “more likely than the general population” is pretty far from “doesn’t apply to the 10% of you who are egoists”.