I’d love to see folks here give their labels for where they think each of these objects are on the scale of:
A. so definitely not conscious that evidence otherwise is clearly in error, P<0.0000001
B. definitely not conscious in only the sense that you expect to see no evidence otherwise, P~=0.00001...0.01
C. unlikely to be conscious, P~=0.01...0.10
D. plausible but unclear, P~=0.10...0.98
E. as plausible as anything besides yourself, P>0.98
F. as definitely conscious as you are now, and any evidence otherwise is clearly in error, P>0.999999
Things to label:
You
Me
Arbitrary other awake humans
Sleeping humans
Insects
Nematode worms (C. elegans)
Single-celled organisms, bacteria
Cells in larger organisms
Viruses in the process of infecting a host
Thermostats
Keyboards
Two-sensor line following robots
Complex chemical reactions, eg the Belousov–Zhabotinsky reaction
Individual molecules
edit: some more from Richard’s “elephants” suggestion:
crows
elephants
dogs
cats
squirrels
mice
small lizards
beginning of a chemical reaction when the reactants are first mixed, while a significant energy gradient still remains
rock or glass (sitting stationary)
rock or glass (in the process of shattering)
rock or glass (in the process of colliding with something not hard enough to shatter)
Bonus points for also labeling each one with what you think it’s conscious of, as I think that’s a necessary addendum to discuss the easy problem. But the above is intended to cover the hard problem and easy problem combined.
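Read as probability bins, the A–F scale above can be sketched as a small function. This is only an illustrative sketch: the stated ranges share some endpoints (e.g. B and C both touch 0.01) and leave a gap between A (&lt;10⁻⁷) and B (~10⁻⁵), so the boundary handling here is my own choice.

```python
def scale_label(p):
    """Map an estimated probability of consciousness to the A-F scale above.

    Boundary choices are mine: shared endpoints are resolved toward the
    higher letter, and the gap between A's and B's stated ranges is folded
    into B.
    """
    if p < 1e-7:
        return "A"  # definitely not conscious; contrary evidence is in error
    if p < 0.01:
        return "B"  # expect to see no evidence of consciousness
    if p < 0.10:
        return "C"  # unlikely to be conscious
    if p <= 0.98:
        return "D"  # plausible but unclear
    if p <= 0.999999:
        return "E"  # as plausible as anything besides yourself
    return "F"      # as definitely conscious as you are now
```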
I think the term “conscious” is very overloaded and the source of endless confusion, and should be tabooed. I’ll be answering as if the numbers are not “probability(-given-uncertainty) of being conscious” but “expected(-given-uncertainty) amount of moral patienthood”, calibrated with 1 meaning “as much as a human” (it could go higher: some whales have more neurons/synapses than humans, and so they might plausibly be more of a moral patient than a human, in the sense that in a trolley problem you should prefer to save 1000 such whales over 1001 humans).
Besides the trivia I just mentioned about whales, I’m answering this mostly on intuition, without knowing off the top of my head (or looking up) the number of neurons/synapses involved. Not to imply that moral patienthood is directly linear in the number of neurons/synapses, but I expect that number probably matters to my notion of moral patienthood.
I’ll also assume that everyone has a “normal amount of realityfluid” flowing through them (rather than eg being simulated slower, or being fictional, or having “double-thick neurons made of gold” in case that matters).
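The whale calibration above amounts to a simple weighted comparison; here is a sketch, where the whale weight of 1.002 is a made-up illustrative number, not a claim about actual whales.

```python
# Weights on the moral-patienthood scale above (1.0 = one human).
# The whale weight is purely illustrative.
human_weight = 1.0
whale_weight = 1.002  # assumed: somewhat more neurons/synapses than a human

# The trolley comparison from the comment: 1000 such whales vs 1001 humans.
whales_total = 1000 * whale_weight
humans_total = 1001 * human_weight
prefer_whales = whales_total > humans_total
print(prefer_whales)  # True under these assumed weights
```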
First list: 1, 1, 1, .7, 10⁻², 10⁻³, 10⁻⁶, 10⁻⁶, 10⁻⁸, ε, ε, ε, ε, ε.
Second list: .6, .8, .7, .7, .6, .6, .5, ε, ε, ε, ε.
Edit: Thinking about it more, something feels weird here: these numbers don’t at all track “how many of these would make me press the lever on the trolley problem vs 1 human”. For one, killing a sleeping person is about as bad as killing an awake person, because the sleeping person is a temporarily paused backup of an awake person. I guess I should be thinking about something like “the universe has budget for one more hour of (good) experience just before heat death, but it needs to be all one species; how much do I value each?”
F, E, E, D, C, B, A, A, A, A, A, A, A, A
The As start at “Single-celled organisms, bacteria”.
There’s rather a jump in your list from sleeping humans to insects. How about, say, elephants? I’d give them a D.
ETA: For the second list: E or D for all the animals (I’m not sure where I’d switch, if at all), A for the rest. I’d go down to C for, say, earthworms, but what I would really mean is more of a D for there being anything like consciousness there, but it would only be a smidgeon.
To summarize my position (anyone who’s going to answer, please do so before reading this):
Please don’t click this next spoiler if you haven’t replied like Richard did, unless you’re sure you’re not going to reply!
The only A that seems easily defensible to me is the last one, and I would put C for it. Everything else seems to me like it could quite plausibly have alien consciousness (potentially at millionths or trillionths of human-brain level), B or better.
My answers:
first list (you answered): F, F, F, D, E, D, D, D, C, D, D, D, C, C
In fact, I left off one thing: a rock. This is, to my intuition, a special case of the “individual molecules” case, but it’s the only one where I’d put B rather than C, because how can a brain be conscious if individual molecules can’t be? Presumably whatever happens to the molecules in the brain that constitutes consciousness is a property that still exists at the atomic scale, having to do either with the bare fact of existence (hard problem of consciousness) or with the way the molecule is currently interacting (integrated-information-theory-type stuff). So I added a second list to cover this, and here are my answers for it:
second list (added in response to your suggestion): E, E, E, E, E, E, E, C, A, D, D
There’s a name for that one.
That goes in the other direction, claiming things about the whole because of a claim about a part. I’m claiming that something about the parts must somehow add up to the behavior of the whole.
Splitting hairs. If something true of each part is not true of the whole, then something true of the whole is not true of each part.
No part of a car is a car, yet there is the car. How this can be is not a deep problem.
All B. (Yes, I know that variants of eliminativism are not popular here, but I was instructed to answer before reading further.)
Discussions of consciousness without defining which meaning you’re using will usually cause confusion and arguments, as Tamsin Leake points out.
Consciousness is used to label many aspects of human cognition, which is complex. So “conscious” almost means “human-like”, but without specifying along which dimensions you’re drawing the analogy.