that anything we build capable of passing the (full) Turing Test would have to be conscious
I believe Eliezer doesn’t agree with that last one, and has talked about building an AI who isn’t conscious.
Also, consider the following hypothetical: I get really drunk and/or take Ambien and black out at 2 am. I have no conscious experience or memory of the time between 2 am and 3 am, but during that time you have a (loud and drunken) conversation with me. Or maybe in my drunken state I sit at my computer and manage to instant message without being conscious of it, and the person at the other end is convinced I’m human and not a computer program. Counterexample?
Well, I think we can all agree that it’s possible for a non-conscious person (or program or whatever) to be mistaken for a conscious being.
However, there are several objections I can make to this scenario being considered a counterexample:
(1) How do you know you’re not conscious? Just because you don’t remember it the next day doesn’t mean you don’t have any awareness at the time.
(2) In the Turing test the judge is supposed to be ‘on the look-out’ for which of the two subjects seems less able to respond adequately to the judge’s questions. And one of the subjects is presumed to be a healthy, sober human. So unless you think the judge would be unable to distinguish a drunken, unconscious conversation from a normal, sober one, you would presumably fail the Turing test.