But flip one bit and do you suddenly have two people? Can’t be right.
Why not? Imagine that bit is the memory/knowledge of which copy they are. After the copying, each copy is naturally curious about what happened, and recalls that bit. Now, if you had 1 person appearing in 2 places, every thought should be identical, right? Yet one copy will think ‘1!’; the other will think ‘0!’. As 1 != 0, this is a contradiction.
Not enough of a contradiction? Imagine further that the original had resolved to start thinking about hot sexy Playboy pinups if it was 1, but to think about all his childhood sins if 0. Or he decides quite arbitrarily to become a Sufi Muslim if 0, and a Mennonite if 1. Or… (insert arbitrarily complex mental processes contingent on that bit).
At some point you will surely admit that we now have 2 people and not just 1; but the only justifiable step at which to say they are 2 and not 1 is the first difference.
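Here’s a toy sketch of the argument in Python, if it helps; the ‘mind’, its flag bit, and the thinking rule are all invented purely for illustration:

    # Toy model: a "mind" is a tuple of bits plus a pure update rule.
    # Bit 0 is the memory of which copy this is; everything else is shared.
    def think(state):
        # The thought produced is contingent on the flag bit.
        return "1!" if state[0] == 1 else "0!"

    copy_0 = (0, 1, 1, 0, 1)        # the copy that recalls "0"
    copy_1 = (1,) + copy_0[1:]      # identical except for one bit

    print(think(copy_0))  # 0!
    print(think(copy_1))  # 1!  -- same rule, same history, divergent thoughts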
At some point you will surely admit that we now have 2 people and not just 1
Actually I won’t. While I grok your approach completely, I’d rather say my concept of ‘an individual’ breaks down once I have two minds with one bit’s difference, or two identical minds, or any of these borderline cases we’re so fond of.
Say I have two optimisers with one bit’s difference. If that bit means one copy converts to Sufism and the other to Mennonitism, then sure, two different people. If that one bit is swallowed up in later neural computations due to the coarse-grained-ness of the wetware, then we’re back to one person since the two are, once again, functionally identical. Faced with contradictions like that, I’m expecting our idea of personal identity to go out the window pretty fast once tech like this actually arrives. Greg Egan’s Diaspora pretty much nails this for me; have a look.
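A toy sketch of the ‘swallowed up’ case, with a lossy readout standing in for whatever coarse-graining the wetware actually does (all made up for illustration):

    # If the downstream computation is lossy (coarse-grained), a one-bit
    # difference can vanish: both states map to the same behaviour.
    def coarse_readout(state, threshold=3):
        # Only a rough count of active bits survives the readout;
        # any individual bit is swallowed up.
        return "calm" if sum(state) < threshold else "excited"

    a = (0, 1, 0, 0)
    b = (1, 1, 0, 0)              # one bit's difference
    print(coarse_readout(a))      # calm
    print(coarse_readout(b))      # calm -- functionally identical again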
All your ‘contradictions’ go out the window once you let go of the idea of a mind as an indivisible unit. If our concept of identity is to have any value (and it really has to), then we need to learn to think more like reality, which doesn’t care about things like ‘one bit’s difference’.
If that one bit is swallowed up in later neural computations due to the coarse-grained-ness of the wetware, then we’re back to one person since the two are, once again, functionally identical.
Ack. So if I understand you right, your alternative to bit-for-bit identity is to loosen it to some sort of future similarity, which can depend on future actions and outcomes; or in other words, there’s a radical indeterminacy about even the minds in our example: are they the same or are they different? Who knows; it depends on whether the Sufism comes out in the wash! Ask me later; but then again, even then I won’t be sure whether those 2 were the same when we started them running (always in motion the future is).
That seems like quite a bullet to bite, and I wonder whether you can hold to any meaningful ‘individual’, whether the difference be bit-wise or no. Even 2 distant non-borderline minds might grow into each other.
I wonder whether you can hold to any meaningful ‘individual’, whether the difference be bit-wise or no.
Indeed, that’s what I’m driving at.
Harking back to my earlier comment, changing a single bit and suddenly having a whole new person is where my problem arises. If you change that bit back, are you back to one person? I might not be thinking hard enough, but my intuition doesn’t accept that. With that in mind, I’d rather bite that bullet than talk about degrees of personhood.
If you change that bit back, are you back to one person? I might not be thinking hard enough, but my intuition doesn’t accept that.
Here’s an intuition for you: you take the number 5 and add 1 to it; then you subtract 1 from it; don’t you have what you started with?
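Or the same intuition with the mind-state treated as a bit pattern (deliberately trivial, just to make the point concrete):

    state = 0b0110
    state ^= 0b0001          # flip one bit: suddenly a new person?
    state ^= 0b0001          # flip it back
    assert state == 0b0110   # bit-for-bit, exactly what we started with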
With that in mind, I’d rather bite that bullet than talk about degrees of personhood.
Well, I can’t really argue with that. As long as you realize you’re biting that bullet, I think we’re still in a situation where it’s just dueling intuitions. (Your intuition says one thing, mine another.)
The downside is that it’s not really that reductionistic.
What if you flip a bit in part of an offline memory store that you’re not consciously thinking about at the time, or such?
What if I hack your bank account and remove $100? Are you just as wealthy as you were before, because you haven’t looked? If the 2 copies simply haven’t looked or are otherwise still unaware, that doesn’t mean they are the same. Their possible futures diverge.
And, sure, it’s possible they might never realize: we could merge them back before they notice, just as I could restore the money before the next time you checked. But I think we would agree that I still committed a crime (theft) with your money; why couldn’t we feel that there was a crime (murder) in the merging?
Huh? My point is about a bit flip in a non-conscious part, before it influences any of the conscious processing. If, prior to that bit flip, you would have said there was only one being, then I’d say that afterwards they still haven’t diverged. Or at least, not entirely.
As far as a merging goes, well, in that case who, precisely, is the one being killed?
So only what’s in immediate consciousness counts? Fine, let’s remove all of the long-term memories of one of the copies; after all, he’s not thinking about his childhood...
As far as a merging goes, well, in that case who, precisely, is the one being killed?
Obviously whichever one isn’t there afterwards; if the bit is 1, then 0 got killed off, and vice versa. If we erase both copies and replace them with the original, then both were killed.
I’d have to say that IF two (equivalent) instances of a mind count as “one mind”, then removing an unaccessed data store does not change that, for as long as the effect of the removal doesn’t propagate, directly or indirectly, to the conscious bits.
If one then restores that data store before its absence has been noticed in any way, then, conditional on the assumption that the two instances originally counted as only one being, so they remain.
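A toy sketch of what I mean, if we operationalize ‘nothing of significance’ as ‘the conscious trace is unchanged’; the agent and its trace are made up purely for illustration:

    # Toy agent: a conscious loop plus an offline memory store.
    class Agent:
        def __init__(self, memory):
            self.memory = memory   # unaccessed store (e.g. childhood memories)
            self.trace = []        # everything consciousness actually does

        def step(self, access_memory=False):
            if access_memory:
                self.trace.append(("recall", self.memory))
            else:
                self.trace.append(("daydream", None))

    a = Agent(memory="childhood")
    a.step()                              # no access yet
    stolen, a.memory = a.memory, None     # remove the data store...
    a.step()                              # still no access
    a.memory = stolen                     # ...restore it before any read
    a.step(access_memory=True)
    print(a.trace)  # identical to a run where nothing was ever removed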
EDIT: to clarify, though… my overall issue here is that I think we may effectively be treating conscious agents as irreducible entities. If we’re ever going to find an actual proper reduction of consciousness, we probably need to ask ourselves things like “what if two agents are bit-for-bit identical… except for these couple of bits here? What if they were completely identical? Is the couple-bit difference enough that they might as well be completely different?” etc.
And if we restore a different long-term memory instead?
I think I’d still have to say “Nothing of significance happened until memory access occurs.”
Until then, well, how’s it any different than stealing your books… and then replacing them before you notice?
Now, as I said, we probably ought to be asking questions like: “What if, in the actual ‘conscious processing’ part of the agent, a few bits were changed in one instance… but just that… so initially, before the change propagates enough to completely diverge, what should we say?” To say it completely changes everything instantly seems too much like saying “conscious agents are irreducible”, so...
(just to clarify: I’m more laying out a bit of my confusion here rather than anything else, plus noting that we seem to have been, in our quest to find reductions for aspects of consciousness, implicitly treating agents as irreducible in certain ways)
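And a toy sketch of that ‘few bits changed in the conscious processing’ question, measuring divergence step by step as a Hamming distance; the update rule is an arbitrary stand-in:

    # Two instances of the same deterministic update rule; one starts with
    # a single bit flipped. The difference propagates through the state
    # over steps, rather than changing everything instantly.
    def step(state):
        n = len(state)
        return tuple(state[i] ^ state[(i + 1) % n] for i in range(n))

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    a = tuple([0] * 16)
    b = tuple([1] + [0] * 15)     # one bit flipped in the "conscious" part
    for t in range(5):
        print(t, hamming(a, b))   # divergence spreads gradually, not at once
        a, b = step(a), step(b)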
(just to clarify: I’m more laying out a bit of my confusion here rather than anything else, plus noting that we seem to have been, in our quest to find reductions for aspects of consciousness, implicitly treating agents as irreducible in certain ways)
Indeed. It’s not obvious what we can reduce agents down further into without losing agents entirely; bit-for-bit identity is at least clear in a few situations.
(To continue the example—if we see the unaccessed memory as being part of the agent, then obviously we can’t mess with it without changing the agent; but if we intuitively see it as like the agent having Internet access and the memory being a webpage, then we wouldn’t regard it as part of its identity.)
What if I hack your bank account and remove $100? Are you just as wealthy as you were before, because you haven’t looked?
Standard Dispute. If ‘wealthy’ means the same amount of money in the account, no. If ‘wealthy’ means how rich you judge yourself to be, yes. The fact that ‘futures diverge’ is irrelevant up until the moment those two different pieces of information have causal contact with the brain. Until that point, yes, they are ‘the same’.