describe said case as “pretty air-tight” isn’t exactly a heinous offence
The question isn’t about the offence; the question is whether you would agree with this thesis in the context of an essay about Islamic jihad.
There’s a lot of effective altruism material, and members of the community are disproportionately consequentialist.
Neither of these leads to vegetarianism. Consequentialism has nothing to do with it, and EA means being rational (i.e. effective) about helping others, but it certainly doesn’t tell you how wide the circle of those you should help must be.
I accept that neither of the things I listed logically leads to accepting the value claim made in the argument (other than that the effective altruism movement generally assumes one’s circle is at least as wide as “all humans”, considering the emphasis on charities working a continent away). But I still feel quite confident that LessWrongers are likely, and more likely than the general population, to accept said value claim. Unless you want to argue about expected values, the assumption made seems to be “the width of the reader’s circle extends to all (meaningfully) sentient beings”, which is probably a lot more likely in a community like ours that reads a lot of sci-fi.
Oh, sure, the surveys will tell you so directly.
But “more likely than the general population” is pretty far from “doesn’t apply to the 10% of you who are egoists”.