Depending on setup you can probably invite other people into your home.
Only people who in turn trust you not to mess with them, at least unless you bring them in under the same cryptographic protections under which you yourself are running on somebody else’s substrate. That’s an incredible amount of trust.
If you do bring them in under cryptographic protections, the resource penalties multiply. Your “home” is slowed down by some factor, and their “home within your home” is slowed down by that factor again. Where are you going to get the compute power? I’m not sure how this applies in the quantum case.
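The compounding here is multiplicative, not additive. A minimal sketch, with made-up overhead numbers, of how nested cryptographic sandboxes stack:

```python
# Nested emulation overheads multiply: each encrypted layer slows down
# everything inside it, including any further encrypted layers.
def total_slowdown(layer_overheads):
    """Combined slowdown factor for a stack of protection layers."""
    total = 1
    for factor in layer_overheads:
        total *= factor
    return total

# Hypothetical figures: suppose each layer of homomorphic-style
# protection costs a factor of 1000.
print(total_slowdown([1000]))        # your home on their substrate: 1000
print(total_slowdown([1000, 1000]))  # their home inside yours: 1000000
```

So a guest running "encrypted inside your encrypted home" pays both factors at once, which is why the compute question bites so quickly.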
Also, once you’re trapped, what’s your source for a trustworthy copy of the person you’re inviting in (or of “them in their home”)? Are you sure you want the companions that your presumed tormentor chooses to provide to you?
Mentioned this in the other thread, but if you and I want to talk we probably (i) move near each other, (ii) communicate between our houses, (iii) negotiate on the shared environment (or e.g. how we should perceive each other).
Ideally if you’re dealing with a person you’d authenticate in the normal way (and part of the point of a house is to keep your private key secret).
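To make "authenticate in the normal way" concrete: a toy sketch of public-key authentication where the private key never leaves the house. This uses a textbook Schnorr signature over a deliberately tiny group; every parameter and message below is illustrative, and a real system would use standard curves and a vetted cryptography library.

```python
import hashlib
import secrets

# Toy group parameters (illustrative only): P = 2*Q + 1 with P, Q prime,
# and G an element of order Q mod P.
P, Q, G = 2039, 1019, 4

def keygen():
    x = secrets.randbelow(Q - 1) + 1   # private key: stays inside the house
    return x, pow(G, x, P)             # (private key, public key)

def sign(x, msg):
    # Schnorr signature made non-interactive via the Fiat-Shamir hash.
    r = secrets.randbelow(Q - 1) + 1
    t = pow(G, r, P)
    c = int.from_bytes(hashlib.sha256(f"{t}:{msg}".encode()).digest(), "big") % Q
    s = (r + c * x) % Q
    return t, s

def verify(y, msg, sig):
    t, s = sig
    c = int.from_bytes(hashlib.sha256(f"{t}:{msg}".encode()).digest(), "big") % Q
    # Valid iff g^s == t * y^c (mod p), which holds when s = r + c*x.
    return pow(G, s, P) == (t * pow(y, c, P)) % P

x, y = keygen()
sig = sign(x, "it's really me")
print(verify(y, "it's really me", sig))  # True
# A tampered message fails verification (with overwhelming probability
# at real key sizes; this toy group is far too small for actual use).
print(verify(y, "someone else", sig))
```

The point of the sketch is the division of labor: the house holds `x` and answers signing requests, while anyone outside can check `y` against a known identity, so impersonation requires stealing the key rather than just controlling a channel.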
I do think that in a world of digital people it could be more common to have attackers impersonating someone I know, but it’s kind of a different ballgame than an attacker controlling my inputs directly.
If “you” completely control your “home,” then it’s more natural to think of the home & occupant as a single agent, whose sensorium is its interface with an external world it doesn’t totally control—the “home” is just a sort of memory that can be traversed or altered by a homunculus modeled on natural humans.
I think this is a reasonable way to look at it. But the point is that you identify with (and care morally about the inputs to) the homunculus. From the homunculus’ perspective you are just in a room talking with a friend. From the (home+occupant)’s perspective you are communicating very rapidly with your friend’s (home+occupant).
You can probably create your own companions. Maybe a modified fork of yourself?
There may also be an open source project that compiles validated and trustworthy digital companions (e.g., aligned AIs or uploads with long, verified track records of good behavior).