Why should we care whether they believe a machine can manifest a property if they cannot even begin to explain what that property is or what it means?
“I believe that AIs can be fitzgoanth.” Well, so what?
Demanding that we produce an explanation for a thing is pointless if we cannot first show that the thing exists, or even give a coherent explanation for what properties the thing is supposed to have.