Would you agree that if it’s possible in principle to build a silicon replica of a brain at whatever the relevant level of abstraction for consciousness is (whether coarse-grained functional level, neuron-level, sub-neuron level or whatever) then the silicon replica would actually be conscious?
Yeah, I think this is pretty likely.[1]
I will say I do believe, in general, that we simply need a much better understanding of what “consciousness” means before we can reason more precisely about these topics. Certain ontologies can assign short encodings to concepts that are either ultimately confused or at the very least don’t carve reality at the joints.
We typically generalize from one example when it comes to consciousness and subjectivity: “we’re conscious, so there must be some natural concept that consciousness refers to” is how the argument goes. And we reject solipsism because we look at other human beings and notice that they act similarly to us and seem to possess the same internal structure as us, so we think we can safely say that they must also have an “internal life” of subjectivity, just like us. That’s all well and good. But when we move outside of that narrow, familiar domain and try to reason about stuff our intuition was not built for, that’s when the tails come apart and things can get very weird.
But overall, I don’t think we disagree about too much here. I wouldn’t talk about this topic in the terms you chose, and perhaps this does reflect some delta between us, but it’s probably not a major point of disagreement.
As to the additional question of identity, namely whether that replica is the same consciousness as that which it’s meant to replicate… I’d still say no. But that doesn’t seem to be what you’re focused on here.