A trillion shrimp do not have a million times the moral value of a million shrimp. If your morality says that they do, then your morality is broken.
Nobody was saying this! The author of the post in question also does not believe this!
I am not a hedonic utilitarian. I do not think that a trillion shrimp have a million times the moral value of a million shrimp. That is a much, much stronger statement than the claim that there exists some number of shrimp that might be worth more than a human. All you’ve done here is set up a total strawman that nobody was arguing for and knock it down.
Ok. Do you think that a trillion shrimp have:
… 1,000 times the moral value of a million shrimp?
… 10 times the moral value of a million shrimp?
… 1.1 times the moral value of a million shrimp?
… some other multiplicative factor, larger than 1, times the moral value of a million shrimp?
If the answer is “no” to all of these, then that seems like it would mean that you already agree with me, and your previous comments here wouldn’t make any sense. So it seems like the answer has to be “yes” to something in that list.
But then… my response stands, except with the relevant number changed.
On the other hand, you also say:
I am not a hedonic utilitarian.
I… don’t understand what usage of this term would make it a meaningful or relevant thing to say in response to my comment. Ok, you’re not a hedonic utilitarian, and thus… what?
Is the point that your claim (that saving 10^100 shrimp instead of one human isn’t insane) was actually not a moral claim at all, but some other kind of claim (prudential, for instance)? No, that doesn’t seem to work either, because you wrote:
No, being extremely overwhelmingly confident about morality such that even if you are given a choice to drastically alter 99.999999999999999999999% of the matter in the universe, you call the side of not destroying it “insane” for not wanting to give up a single human life, a thing we do routinely for much weaker considerations, is insane.
So clearly this is about morality…
… yeah, I can’t make any sense of what you’re saying here. What am I missing?
… 1,000 times the moral value of a million shrimp?
… 10 times the moral value of a million shrimp?
… 1.1 times the moral value of a million shrimp?
… some other multiplicative factor, larger than 1, times the moral value of a million shrimp?
I don’t know; it seems like a very hard question, and I think it will be quite sensitive to a bunch of details of the exact comparison. Like, how much cognitive diversity is there among the shrimp? Are the shrimp forming families and complicated social structures, or are they all in an isolated grid? Are they providing value to an extended ecosystem of other life? How rich is the life of these specific shrimp?
I would be surprised if the answer basically ever turned out to be less than 1.1, and surprised if it ever turned out to be more than 10,000.
But then… my response stands, except with the relevant number changed.
I don’t think your response said anything except to claim that a linear relationship between shrimp and value seems to quickly lead to absurd conclusions (or at least that is what I inferred from your claim that a trillion shrimp are not a million times more valuable than a million shrimp). I agree with that as a valid reductio ad absurdum, but given that I see no need for linearity here (simply any ratio, which could even differ with the scale and details of the scenario), I don’t see how your response stands.
… yeah, I can’t make any sense of what you’re saying here. What am I missing?
I have little to go on besides repeating myself, as you have given me little to work with besides repeated insistence that what I believe is wrong or absurd. My guess is my meaning is clearer (though probably still far from perfectly clear) to other readers.
I don’t know; it seems like a very hard question, and I think it will be quite sensitive to a bunch of details of the exact comparison. Like, how much cognitive diversity is there among the shrimp? Are the shrimp forming families and complicated social structures, or are they all in an isolated grid? Are they providing value to an extended ecosystem of other life? How rich is the life of these specific shrimp?
I mean… we know the answers to these questions, right? Like… shrimp are not some sort of… un-studied exotic form of life. (In any case it’s a moot point, see below.)
I would be surprised if the answer basically ever turned out to be less than 1.1, and surprised if it ever turned out to be more than 10,000.
Right, so, “some … multiplicative factor, larger than 1”. That’s what I assumed. Whether that factor is 1 million, or 1.1, really doesn’t make any difference to what I wrote earlier.
I don’t think your response said anything except to claim that a linear relationship between shrimp and value seems to quickly lead to absurd conclusions (or at least that is what I inferred from your claim that a trillion shrimp are not a million times more valuable than a million shrimp). I agree with that as a valid reductio ad absurdum, but given that I see no need for linearity here (simply any ratio, which could even differ with the scale and details of the scenario), I don’t see how your response stands.
No, my point is that any factor at all that is larger than 1, and remains larger than 1 as numbers increase, leads to absurd conclusions. (Like, for example, the conclusion that there is some number of shrimp such that that many shrimp are worth more than a human life.)
Given this correction, do you still think that I’m strawmanning or misunderstanding your views…? (I repeat that linearity is not the target of my objection!)
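To spell out the structure of the objection, here is a minimal sketch; v(n) (the moral value of n shrimp), H (the value of a human life), m, and c are illustrative symbols of mine, not anything proposed in the thread:

```latex
% Sketch only; v(n), H, m, c are illustrative symbols, not from the thread.
% Premise: at every scale, multiplying the shrimp count by m multiplies
% the value by at least some fixed c > 1:
\[
  v(m\,n) \;\ge\; c \cdot v(n) \quad \text{for all } n, \qquad c > 1 .
\]
% Iterating k times from a starting population n_0:
\[
  v(m^{k} n_0) \;\ge\; c^{k}\, v(n_0) \;\longrightarrow\; \infty
  \quad \text{as } k \to \infty .
\]
% So for any finite value H of a human life there is some k with
% c^k v(n_0) > H, i.e. some number of shrimp outweighs a human life.
% Rejecting that conclusion requires the factor to fall to 1 at some scale.
```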
No, my point is that any factor at all that is larger than 1, and remains larger than 1 as numbers increase
I mean, clearly you agree that two shrimp are more important than one shrimp, and continue to be more important (at least for a while) as the numbers increase. So no, I don’t understand what you are saying, as nothing you have said appears sensitive to any numbers being different, and clearly for small numbers you agree that these comparisons must hold.
I agree there is a number big enough that eventually you approach 1; nothing I have said contradicts that. As in, my guess is that the value of n shrimp, as n goes to infinity, does not diverge but eventually converges on some finite number (though with considerations like Boltzmann brains, quantum uncertainty, and matter/energy density, it does seem confusing to think about).
It seems quite likely to me that this point of convergence is above the value of a human life, as numbers can really get very big, there are a lot of humans, and shrimp are, all things considered, pretty cool and interesting, and a lot of shrimp seem like they would give rise to a lot of stuff.
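As an illustration of the shape I have in mind, here is one toy value function; V_max and tau are illustrative parameters only, not numbers anyone has proposed:

```latex
% Toy example only; V_max and tau are illustrative parameters.
% A value function that is strictly increasing (more shrimp are always
% worth more) yet converges to a finite limit:
\[
  v(n) \;=\; V_{\max}\left(1 - e^{-n/\tau}\right).
\]
% For every n, v(n+1) > v(n) and v(2n)/v(n) > 1, yet
\[
  \lim_{n\to\infty} v(n) \;=\; V_{\max} \;<\; \infty .
\]
% On this picture the disagreement reduces to whether V_max exceeds the
% value of a human life, not to whether the limit exists.
```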
I mean, clearly you agree that two shrimp are more important than one shrimp
Hm… no, I don’t think so. Enough shrimp to ensure that there keep being shrimp—that’s worth more than one shrimp. Fewer shrimp than that, though—nah.
I agree there is a number big enough that eventually you approach 1; nothing I have said contradicts that. As in, my guess is that the value of n shrimp, as n goes to infinity, does not diverge but eventually converges on some finite number, though it does feel kind of confusing to think about.
Sure, this is all fine (and nothing that I have said contradicts you believing this; it seems like you took my objection to be much narrower than it actually was), but you’re saying that this number is much larger than the value of a human life. That’s the thing that I’m objecting to.
I’ll mostly bow out at this point, but one quick clarification:
but you’re saying that this number is much larger than the value of a human life
I didn’t say “much larger”! Like, IDK, my guess is there is some number of shrimp for which it’s worth sacrificing a thousand humans, which is larger, but not necessarily “much”.
My guess is there is no number, at least in the least convenient world where we are not talking about shrimp galaxies forming alternative life forms, for which it’s worth sacrificing 10 million humans, at least at current population levels and on the current human trajectory.
10 million is just a lot, and humanity has a lot of shit to deal with, and while I think it would be an atrocity to destroy this shrimp-gigaverse, it would also be an atrocity to kill 10 million people, especially intentionally.