If you have many different (and conflicting, in that they demand undivided attention) interests: if it was possible, would copying yourself in order to pursue them more efficiently satisfy you?
One copy gets to learn drawing, another one immerses itself in mathematics & physics, etc. In time, they can grow very different.
(Is this scenario much different to you than simply having children?)
I wouldn’t have a problem copying myself as long as I could merge the copies afterwards. However, it might not be possible to have a merge operation for human-level systems that preserves both information and sanity. E.g. if one copy studied philosophy and radically changed its worldview relative to the original, how do you merge that copy back into the original without losing information?
David Brin’s novel Kiln People explores this “merging back” idea: cheap copies made largely of clay and running on a hydrogen-based metabolism, so they are very short-lived (hours to weeks, depending on $$) and have to merge back relatively soon in order to keep continuity of consciousness through their long-lived original. Lots of fascinating practical economic, ethical, social, military, and political details are explored while a noir detective story happens in the foreground.
I recommend it :-)
I agree; I don’t think a merge is possible in this scenario. I still see some gains, though (especially when communication is possible):
I (the copy that does X) am happy because I do what I wanted.
I (the other copies) am happy because I partly identify with the other copy (as I would be proud of my child/student?)
I (all copies) get results I wanted (research, creative, or even personal insights if the first copy is able to communicate them)
If you don’t have the ability to merge, would the copies get the same rights as the original? Or would the original control all the resources, with the copies treated as second-class citizens? If the copies were second-class citizens, I would probably not fork, because that would amount to slavery.
If the copies do get equal rights, how do you plan to allocate the resources you had before forking, such as wealth and friends? If I split the wealth down the middle, I would probably be OK with the lack of merging. However, I’m not sure how I would divide social relationships between the copy and the original. If both the original and the copy had to cut their financial and social capital in half, the fork might have net negative utility.
If the goal is just to learn a new skill such as drawing, a more efficient solution might involve uploading yourself without copying yourself, then running the upload faster than realtime. I.e. the upload thinks it has spent a year learning the new skill, but only a day has gone by in the real world. However, this trick won’t work if the goal involves interacting with others, unless they are also willing to run faster than realtime.
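To put a number on the speedup trick above (a hypothetical illustration, not anything stated in the thread; the function name is made up):

```python
# Hypothetical illustration: how fast must an upload run to fit a given
# amount of subjective experience into a given amount of wall-clock time?

def required_speedup(subjective_days: float, wallclock_days: float) -> float:
    """Speedup factor needed to experience `subjective_days` of
    subjective time within `wallclock_days` of real time."""
    return subjective_days / wallclock_days

# A subjective year of practice in one real day:
print(required_speedup(365.0, 1.0))  # 365.0
```

So “a year in a day” means running roughly 365× faster than realtime, which is exactly why realtime social interaction breaks down unless everyone involved speeds up together.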
Tentatively—there’d be a central uberperson which wouldn’t be that much like a single human being.
If I had reason to think it was safe, I’d really like to live that way.
Do what e.g. Mercurial does: report that the copies are too different for automatic merge, and punt the problem back to the user.
In other words, you are right that there is no solution in the general case, but that should not necessarily deter us from looking for a solution that works in 90% of cases.
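The Mercurial-style behavior described above can be sketched as a toy three-way merge (an illustrative sketch, not Mercurial’s actual algorithm; all names and data here are invented): auto-merge wherever only one copy diverged from the common ancestor, and punt back to the user wherever both copies changed the same item differently.

```python
# Toy three-way merge in the spirit of `hg merge`: compare two diverged
# copies against their common ancestor ("base"). Where only one side
# changed, take that change; where both sides changed the same slot
# differently, record a conflict and leave resolution to the user.
# Assumes equal-length lists for simplicity.

def three_way_merge(base, ours, theirs):
    merged, conflicts = [], []
    for i, (b, o, t) in enumerate(zip(base, ours, theirs)):
        if o == t:            # both sides agree (or neither changed)
            merged.append(o)
        elif o == b:          # only "theirs" changed
            merged.append(t)
        elif t == b:          # only "ours" changed
            merged.append(o)
        else:                 # both changed differently: punt to the user
            merged.append(None)
            conflicts.append(i)
    return merged, conflicts

base   = ["likes math", "agnostic", "plays piano"]
ours   = ["likes math", "atheist", "plays violin"]
theirs = ["likes math", "Buddhist", "plays piano"]

merged, conflicts = three_way_merge(base, ours, theirs)
print(merged)     # ['likes math', None, 'plays violin']
print(conflicts)  # [1]
```

The conflict at index 1 is the “too different for automatic merge” case: both copies revised the same belief in incompatible ways, so the tool reports it rather than guessing.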
That sounds (to me) better than having children, but not as good as living longer.
Sounds wonderful. Divide and conquer.
As this sounds like a computer-assisted scenario, I would like the ability to append memories while sleeping: wake up and have access to the memories of the copy. This would not necessarily include full proficiency, as I suspect muscle memory may not get copied.
Copying has at best zero utility (as regards interests): each copy only indulges in one interest, and I anticipate being only one copy, even if I don’t know in advance which one.
How is having children at all similar? 1) children would have different interests; 2) I cannot control (precommit) future children; 3) raising children would be for me a huge negative utility—both emotionally and resource-wise.
Copying has at best zero utility (as regards interests)

This is not true for me. I care about my ideas beyond my own desire to implement them. If I knew there was a passionate and capable person willing to take over some of my ideas (which I’d otherwise not have time for), I’d jump at the opportunity.
Doubly so if the other person was a copy of me, in which case I’d not only have a guarantee on competence, but assurance that the person would be able to relate the story and product to me afterwards (and possibly share the profit).
Interestingly, now that you bring this up, I’m not at all certain that I’d be able to communicate especially effectively with a copy of myself. Probably better than with a randomly selected person, but perhaps not as well as I might hope.
What makes you reach that conclusion?
I think communication would start out well and become amazing over time. I don’t communicate with myself entirely in English; a lot of thoughts go through unencoded. Having a copy of myself to talk to would force us to encode those raw thoughts as best we could. This isn’t necessarily easy, but I think the really difficult part would already be behind us: namely, having the same core thoughts.
I think people can feel a sense of accomplishment when their child achieves something they wanted but never got around to.
Waste of processing power. Having dozens of foci of attention, with a body/brain architecture to match, would be more efficient.
Because basic functions are being repeated?
I’d rather say that the higher-level functions are excessively redundant. Then there are coordination problems, competition for shared resources (e.g. money, sexual partners), possible divergence of near- and far-term goals, relatively slow in-group communication, and possibly fewer cross-domain insights.
Surely you jest.
TVTropes warning.
What’s the difference between a copy of yourself and an extra “body/brain construction”?
I think red75 meant rebuilding yourself into a more “multi-threaded” being. I’m not sure I would want to go in that direction, though—it’s hard to imagine what the result would feel like, it probably couldn’t even be called conscious in the human sense, but somehow multiply-conscious...
Yes, something like that. But I don’t think the consciousness of such a being would be dramatically different, because it should still contain a “central executive” that coordinates the being’s overall behavior and controls the direction and distribution of attention—albeit in a much more fine-grained way than a human’s.