I don’t know; it seems like a very hard question, and I think it will be quite sensitive to a bunch of details of the exact comparison. Like, how much cognitive diversity is there among the shrimp? Are the shrimp forming families and complicated social structures, or are they all in an isolated grid? Are they providing value to an extended ecosystem of other life? How rich is the life of these specific shrimp?
I mean… we know the answers to these questions, right? Like… shrimp are not some sort of… un-studied exotic form of life. (In any case it’s a moot point, see below.)
I would be surprised if the answer basically ever turned out to be less than 1.1, and surprised if it ever turned out to be more than 10,000.
Right, so, “some … multiplicative factor, larger than 1”. That’s what I assumed. Whether that factor is 1 million, or 1.1, really doesn’t make any difference to what I wrote earlier.
I don’t think your response said anything except to claim that a linear relationship between the number of shrimp and their value seems to quickly lead to absurd conclusions (or at least that is what I inferred from your claim that a trillion shrimp are not a million times more valuable than a million shrimp). I agree that that is a valid reductio ad absurdum, but given that I see no need for linearity here (simply any ratio, which could even differ with the scale and details of the scenario), I don’t see how your response stands.
No, my point is that any factor at all that is larger than 1, and remains larger than 1 as numbers increase, leads to absurd conclusions. (Like, for example, the conclusion that there is some number of shrimp such that that many shrimp are worth more than a human life.)
Given this correction, do you still think that I’m strawmanning or misunderstanding your views…? (I repeat that linearity is not the target of my objection!)
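To spell the inference out (a sketch of the argument as stated here; $V$, $r$, and $H$ are illustrative symbols, not notation from this exchange): suppose multiplying the number of shrimp by 10 always multiplies their total value by some factor $r > 1$ that never decays to 1. Then

$$V(10^k \ \text{shrimp}) \;\ge\; r^k \cdot V(1 \ \text{shrimp}) \;\to\; \infty \quad \text{as } k \to \infty,$$

so for any finite value $H$ assigned to a human life, there is some finite number of shrimp whose value exceeds it.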
No, my point is that any factor at all that is larger than 1, and remains larger than 1 as numbers increase
I mean, clearly you agree that two shrimp are more important than one shrimp, and continue to be more important (at least for a while) as the numbers increase. So no, I don’t understand what you are saying, as nothing you have said appears sensitive to any numbers being different, and clearly for small numbers you agree that these comparisons must hold.
I agree there is a number big enough that the factor eventually approaches 1; nothing I have said contradicts that. As in, my guess is that the series of the value of shrimp as n goes to infinity does not diverge but eventually converges on some finite number (though especially with considerations like Boltzmann brains and quantum uncertainty and matter/energy density it does seem confusing to think about).
It seems quite likely to me that this point of convergence is above the value of a human life, as numbers can really get very big, there are a lot of humans, and shrimp are all things considered pretty cool and interesting and a lot of shrimp seem like they would give rise to a lot of stuff.
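To make the convergence picture concrete (an illustrative model only, not one proposed in the exchange): suppose the $n$-th shrimp adds marginal value $v_n = c/n^2$ for some constant $c > 0$. Every additional shrimp then adds strictly positive value, yet the total

$$\sum_{n=1}^{\infty} \frac{c}{n^2} \;=\; \frac{c\pi^2}{6}$$

is finite; whether that limit lies above or below the value of a human life depends entirely on the constant $c$, which is exactly where the two positions diverge.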
I mean, clearly you agree that two shrimp are more important than one shrimp
Hm… no, I don’t think so. Enough shrimp to ensure that there keep being shrimp—that’s worth more than one shrimp. Less shrimp than that, though—nah.
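In value-function terms (a sketch of this position; $n_{\min}$, an illustrative minimum self-sustaining population size, is not a number from the exchange):

$$V(n) \approx V(1) \quad \text{for } n < n_{\min}, \qquad V(n) > V(1) \quad \text{for } n \ge n_{\min},$$

i.e. additional shrimp below the persistence threshold add roughly nothing, so the premise that two shrimp beat one shrimp is rejected at small $n$.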
I agree there is a number big enough that the factor eventually approaches 1; nothing I have said contradicts that. As in, my guess is that the series of the value of shrimp as n goes to infinity does not diverge but eventually converges on some finite number, though it does feel kind of confusing to think about.
Sure, this is all fine (and nothing that I have said contradicts you believing this; it seems like you took my objection to be much narrower than it actually was), but you’re saying that this number is much larger than the value of a human life. That’s the thing that I’m objecting to.
I’ll mostly bow out at this point, but one quick clarification:
but you’re saying that this number is much larger than the value of a human life
I didn’t say “much larger”! Like, IDK, my guess is there is some number of shrimp for which it’s worth sacrificing a thousand humans, which is larger, but not necessarily “much”.
My guess is there is no number, at least in the least convenient world where we are not talking about shrimp galaxies forming alternative life forms, for which it’s worth sacrificing 10 million humans, at least at current population levels and on the current human trajectory.
10 million is just a lot, and humanity has a lot of shit to deal with, and while I think it would be an atrocity to destroy this shrimp-gigaverse, it would also be an atrocity to kill 10 million people, especially intentionally.