Chimera writes: I’m a bit worried about people getting angry at me for not “getting it”
You are what? Worried? Being worried is a conscious experience. A movie of you being worried does not show someone else being worried; it shows an unconscious image that looks like you being worried. An automaton built to duplicate your behavior when you are worried feels nothing; there is nothing (no consciousness) there to feel anything. But when you are the one worrying, people know, and more importantly, you know how you feel and what it means to feel worried.
Imagine a world filled with Disney animatronic robots, all programmed to behave like real people in our world behave. Unless you think all those singing ghosts in the Haunted Mansion at Disneyland are feeling happy and scared, you can grasp what is being discussed here by imagining the difference between what images of people feel (nothing) and what actual people feel.
Good luck with this.
I would argue that if someone constructed an automaton that behaved exactly as I would in any given real-world situation—including novel situations, which Disney automatons can’t handle—then that automaton would, for all intents and purposes, be as conscious as I am. This automaton would, in fact, be a copy of me.
Let’s imagine that tonight, while you sleep, evil aliens replace everyone else in your hometown (except for yourself, that is) with one of those perfect automatons. Would you be able to tell that this had occurred? If so, how would you determine this?
Perhaps I might not know the difference, but I am not the only observer here. Would the people replaced know the difference?
Fooling you by replacing me is one thing. Fooling me by replacing me is an entirely more difficult thing to do.
Well, presumably, the original people who were replaced would indeed know the difference, as they watch helplessly from within the bubbling storage tanks where the evil aliens / wizards / whoever had put them prior to replacing them with the automatons.
The more interesting question is, would the automatons believe that they were the originals? My claim is that, in order to emulate the originals perfectly with 100% accuracy—which is what this thought experiment requires—the automatons would have to believe that they were, in fact, the originals; and thus they would have to be conscious.
You could probably say, “ah-hah, sure, the automatons may believe that they are the originals, but they’re wrong! The originals are back on the mothership inside the storage vats!” This doesn’t sound like a very fruitful objection to me, however, since it doesn’t help you prove that the automatons are not conscious—merely that they aren’t composed of the same atoms as some other conscious beings (the ones inside the vats). So what? Everyone is made of different atoms, you and I included.