Even if you try to elevate your family above everything else, it is commonly accepted that it is not moral to sacrifice all of society for just your family, or to threaten large-scale catastrophe.
This just means that “elevate your family above everything else” is not an approved-of moral principle, not that it somehow doesn’t work on its own terms. In any case this is not a problem with multi-tier morality, it’s just a disagreement on what the tiers should be.
Similarly, as you elevate the interests of your nation above other things, at a sufficient scale the interests of the rest of the world poke their way into your decision-making in substantial ways again.
This, on the other hand, is a matter of instrumental values, not terminal ones. There is once again no problem here with multi-tier morality.
Even if you try to do nothing but elevate the interests of animal life, we have still decided that it is not ethical to destroy even fully abiological ecosystems, and definitely not complicated plant-based ecosystems, for those interests, if the harm is sufficiently large.
Same reply as to the first point. (Also, who has ever advocated so weirdly drawn a moral principle as “do nothing but elevate the interests of animal life”…?)
It doesn’t matter how many shrimp it is.
That is false. The numbers are very big. There are numbers so big that the very act of specifying them would encode calculations capable of simulating universes full of healthy and happy humans. It absolutely matters how big this kind of number is.
It doesn’t matter how big the numbers are, because the moral value of shrimp does not aggregate like that. If it were 3^^^3 shrimp, it still wouldn’t matter.
Again, we are talking about so many shrimp that it would be exceedingly likely for this number of shrimp, if left under the auspices of gravity, to form their own planets and solar systems and galaxies in which life thrives and in which other non-shrimp intelligences form.
Now you’re just smuggling in additional hypothesized entities and concerns. Are we talking about shrimp, or about something else? This is basically a red herring.
That aside—no, the numbers really don’t matter, because that’s just not how moral value of shrimp works, in any remotely sensible moral system. A trillion shrimp do not have a million times the moral value of a million shrimp. If your morality says that they do, then your morality is broken.
A trillion shrimp do not have a million times the moral value of a million shrimp. If your morality says that they do, then your morality is broken.
Nobody was saying this! The author of the post in question also does not believe this!
I am not a hedonic utilitarian. I do not think that a trillion shrimp have a million times the moral value of a million shrimp. That is a much, much stronger statement than whether there exists any number of shrimp that might be worth more than a human. All you’ve done here is to set up a total strawman that nobody was arguing for and knocked it down.
Ok. Do you think that a trillion shrimp have:
… 1,000 times the moral value of a million shrimp?
… 10 times the moral value of a million shrimp?
… 1.1 times the moral value of a million shrimp?
… some other multiplicative factor, larger than 1, times the moral value of a million shrimp?
If the answer is “no” to all of these, then that seems like it would mean that you already agree with me, and your previous comments here wouldn’t make any sense. So it seems like the answer has to be “yes” to something in that list.
But then… my response stands, except with the relevant number changed.
On the other hand, you also say:
I am not a hedonic utilitarian.
I… don’t understand how you could be using this term in a way that would make it a meaningful or relevant thing to say in response to my comment. Ok, you’re not a hedonic utilitarian, and thus… what?
Is the point that your claim that saving 10^100 shrimp instead of one human isn’t insane… was actually not a moral claim at all, but some other kind of claim (prudential, for instance)? No, that doesn’t seem to work either, because you wrote:
No, being extremely overwhelmingly confident about morality such that even if you are given a choice to drastically alter 99.999999999999999999999% of the matter in the universe, you call the side of not destroying it “insane” for not wanting to give up a single human life, a thing we do routinely for much weaker considerations, is insane.
So clearly this is about morality…
… yeah, I can’t make any sense of what you’re saying here. What am I missing?
… 1,000 times the moral value of a million shrimp?
… 10 times the moral value of a million shrimp?
… 1.1 times the moral value of a million shrimp?
… some other multiplicative factor, larger than 1, times the moral value of a million shrimp?
I don’t know, it seems like a very hard question, and I think it will be quite sensitive to a bunch of details of the exact comparison. Like, how much cognitive diversity is there among the shrimp? Are the shrimp forming families and complicated social structures, or are they all in an isolated grid? Are they providing value to an extended ecosystem of other life? How rich is the life of these specific shrimp?
I would be surprised if the answer basically ever turned out to be less than 1.1, and surprised if it ever turned out to be more than 10,000.
But then… my response stands, except with the relevant number changed.
I don’t think your response said anything except to claim that a linear relationship between the number of shrimp and their value seems to quickly lead to absurd conclusions (or at least that is what I inferred from your claim that a trillion shrimp are not a million times more valuable than a million shrimp). I agree with that as a valid reductio ad absurdum, but given that I see no need for linearity here (simply any ratio, which could even differ with the scale and details of the scenario), I don’t see how your response stands.
… yeah, I can’t make any sense of what you’re saying here. What am I missing?
I have little to go on besides repeating myself, as you have given me little to work with besides repeated insistence that what I believe is wrong or absurd. My guess is my meaning is clearer (though probably still far from perfectly clear) to other readers.
I don’t know, it seems like a very hard question, and I think it will be quite sensitive to a bunch of details of the exact comparison. Like, how much cognitive diversity is there among the shrimp? Are the shrimp forming families and complicated social structures, or are they all in an isolated grid? Are they providing value to an extended ecosystem of other life? How rich is the life of these specific shrimp?
I mean… we know the answers to these questions, right? Like… shrimp are not some sort of… unstudied exotic form of life. (In any case it’s a moot point, see below.)
I would be surprised if the answer basically ever turned out to be less than 1.1, and surprised if it ever turned out to be more than 10,000.
Right, so, “some … multiplicative factor, larger than 1”. That’s what I assumed. Whether that factor is 1 million, or 1.1, really doesn’t make any difference to what I wrote earlier.
I don’t think your response said anything except to claim that a linear relationship between the number of shrimp and their value seems to quickly lead to absurd conclusions (or at least that is what I inferred from your claim that a trillion shrimp are not a million times more valuable than a million shrimp). I agree with that as a valid reductio ad absurdum, but given that I see no need for linearity here (simply any ratio, which could even differ with the scale and details of the scenario), I don’t see how your response stands.
No, my point is that any factor at all that is larger than 1, and remains larger than 1 as numbers increase, leads to absurd conclusions. (Like, for example, the conclusion that there is some number of shrimp such that that many shrimp are worth more than a human life.)
Given this correction, do you still think that I’m strawmanning or misunderstanding your views…? (I repeat that linearity is not the target of my objection!)
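(A side note to make the arithmetic of this step concrete: if each millionfold increase in the number of shrimp multiplies total value by at least some fixed factor greater than 1, then total value grows without bound and must eventually exceed any finite value assigned to a human life. The sketch below is illustrative only; the ratio r = 1.1, threshold h = 10**9, and starting value are placeholders that come from neither commenter.)

```python
# Hedged sketch: if scaling the shrimp count by 1,000,000 always multiplies total
# value by at least r > 1, the value diverges and eventually exceeds any finite
# threshold h (standing in for the value of a human life).
# r = 1.1, h = 10**9, and v0 = 1.0 are arbitrary placeholders, not claims.

def scaleups_needed(r: float, h: float, v0: float = 1.0) -> int:
    """Count how many millionfold scale-ups are needed before value reaches h."""
    value, steps = v0, 0
    while value < h:
        value *= r  # each millionfold increase multiplies value by at least r
        steps += 1
    return steps

if __name__ == "__main__":
    k = scaleups_needed(r=1.1, h=10**9)
    print(f"After {k} millionfold scale-ups (10**{6 * k} shrimp), value exceeds h.")
```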
No, my point is that any factor at all that is larger than 1, and remains larger than 1 as numbers increase
I mean, clearly you agree that two shrimp are more important than one shrimp, and that this continues to hold (at least for a while) as the numbers increase. So no, I don’t understand what you are saying, as nothing you have said appears sensitive to any of the numbers being different, and clearly for small numbers you agree that these comparisons must hold.
I agree there is a number big enough that eventually you approach 1; nothing I have said contradicts that. As in, my guess is that the value of n shrimp, as n goes to infinity, does not diverge but eventually converges on some finite number (though especially with considerations like Boltzmann brains, quantum uncertainty, and matter/energy density, it does seem confusing to think about).
It seems quite likely to me that this point of convergence is above the value of a human life, as the numbers can really get very big, there are a lot of humans, and shrimp are, all things considered, pretty cool and interesting, and a lot of shrimp seem like they would give rise to a lot of stuff.
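(A side note on the convergence claim above: one hypothetical way the value of n shrimp can keep increasing yet stay bounded is for each successive block of shrimp to contribute geometrically shrinking value. The sketch below is illustrative only; the discount d = 0.5 and per-block value of 1.0 are placeholders, not anything either commenter has asserted.)

```python
# Hedged sketch of a bounded (non-diverging) aggregation: if each successive
# "block" of shrimp adds value shrunk by a factor d < 1, the total approaches
# the finite limit first_block / (1 - d) rather than growing without bound.
# first_block = 1.0 and d = 0.5 are arbitrary placeholders, not claims.

def total_value(n_blocks: int, first_block: float = 1.0, d: float = 0.5) -> float:
    """Sum the value contributed by n_blocks successive blocks of shrimp."""
    return sum(first_block * d**i for i in range(n_blocks))

if __name__ == "__main__":
    limit = 1.0 / (1 - 0.5)  # closed-form limit of the geometric series
    for n in (1, 10, 100):
        print(f"{n:>3} blocks: {total_value(n):.6f}  (limit {limit})")
```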
I mean, clearly you agree that two shrimp are more important than one shrimp
Hm… no, I don’t think so. Enough shrimp to ensure that there keep being shrimp—that’s worth more than one shrimp. Fewer shrimp than that, though—nah.
I agree there is a number big enough that eventually you approach 1; nothing I have said contradicts that. As in, my guess is the series of the value of shrimp as n goes to infinity does not diverge but eventually converges on some finite number, though it does feel kind of confusing to think about.
Sure, this is all fine (and nothing that I have said contradicts you believing this; it seems like you took my objection to be much narrower than it actually was), but you’re saying that this number is much larger than the value of a human life. That’s the thing that I’m objecting to.
I’ll mostly bow out at this point, but one quick clarification:
but you’re saying that this number is much larger than the value of a human life
I didn’t say “much larger”! Like, IDK, my guess is there is some number of shrimp for which it’s worth sacrificing a thousand humans, which is larger, but not necessarily “much”.
My guess is there is no number, at least in the least convenient world where we are not talking about shrimp galaxies forming alternative life forms, for which it’s worth sacrificing 10 million humans, at least at current population levels and on the current human trajectory.
10 million is just a lot, and humanity has a lot of shit to deal with, and while I think it would be an atrocity to destroy this shrimp-gigaverse, it would also be an atrocity to kill 10 million people, especially intentionally.