I think the term “conscious” is very overloaded and the source of endless confusion, and should be tabooed. I’ll be answering as if the numbers are not “probability(-given-uncertainty) of conscious” but “expected(-given-uncertainty) amount of moral patienthood”, calibrated with 1 meaning “as much as a human” (it could go higher — some whales have more neurons/synapses than humans and so they might plausibly be more of a moral patient than humans, in the sense that in a trolley problem you should prefer to save 1000 such whales over 1001 humans).
Besides the trivia I just mentioned about whales, I’m answering this mostly on intuition, without knowing off the top of my head (nor looking up) the number of neurons/synapses. Not to imply that moral patienthood is directly linear in the number of neurons/synapses, but I expect that that number probably matters to my notion of moral patienthood.
I’ll also assume that everyone has a “normal amount of realityfluid” flowing through them (rather than eg being simulated slower, or being fictional, or having “double-thick neurons made of gold” in case that matters).
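To make the “expected(-given-uncertainty) amount of moral patienthood” framing concrete, here is a minimal sketch of how I’d compute it: a probability-weighted average over hypotheses about how much of a moral patient something is, calibrated so a human = 1. The specific probabilities and weights below are made-up illustrations, not my actual credences.

```python
def expected_patienthood(hypotheses):
    """Probability-weighted moral patienthood.

    hypotheses: list of (probability, patienthood-if-that-hypothesis-is-true)
    pairs, with patienthood calibrated so a typical human = 1.
    """
    total_prob = sum(p for p, _ in hypotheses)
    assert abs(total_prob - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * w for p, w in hypotheses)

# Hypothetical credences for a large-brained whale: maybe its extra
# neurons/synapses make it a somewhat bigger moral patient than a human,
# maybe they don't.
whale = expected_patienthood([(0.5, 1.2), (0.5, 0.9)])

# The trolley comparison is then just multiplication: prefer saving
# 1000 such whales over 1001 humans iff 1000 * whale > 1001 * 1.
prefer_whales = 1000 * whale > 1001 * 1.0
```

With these illustrative numbers, `whale` comes out to 1.05, so 1000 whales would outweigh 1001 humans; any expected patienthood above 1.001 flips the lever the same way.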
First list: 1, 1, 1, .7, 10⁻², 10⁻³, 10⁻⁶, 10⁻⁶, 10⁻⁸, ε, ε, ε, ε, ε.
Second list: .6, .8, .7, .7, .6, .6, .5, ε, ε, ε, ε.
Edit: Thinking about it more, something feels weird here, like these numbers don’t track at all “how many of these would make me press the lever on the trolley problem vs 1 human” — for one, killing a sleeping person is about as bad as killing an awake person because like the sleeping person is a temporarily-paused-backup for an awake person. I guess I should be thinking about “the universe has budget for one more hour of (good-)experience just before heat death, but it needs to be all same species, how much do I value each?” or something.