The question about mind uploading feels a bit to me like asking, “Would you be against 2 + 2 being 5, if it were possible?” I don’t think it’s possible even in theory.
I think brain emulation could be possible, though, and you could have essentially human minds running on a machine. I wouldn’t necessarily be against that, or even against artificial intelligences that we are confident possess whatever is valuable about human minds (consciousness plus some other stuff, probably). But as a biological human, I also have a vested interest in making sure that if this replacement happens, it happens in a way that doesn’t screw over existing biological humans. In particular, if we give a bunch of rights to machines, we dilute our own rights in a way that could be very bad for us.
It’s interesting to me that you think mind uploading is impossible but brain emulation could be possible. I was using those words to refer to the same thing! I assume what you mean is that moving a mind from a biological to a digital substrate is impossible, but copying one is not? To be honest, I’m confused about how consciousness works and don’t have a very solid opinion on this.
Anyway, I agree that we need a system which protects existing biological life if we’re going to create lots of digital minds that we ought to grant rights. We also need those minds to respect that system, which requires solving technical alignment, at least in the case of nonhuman artificial intelligences. I don’t agree that all entities which can self-copy and have moral value should be destroyed, which is what I thought your initial claim was, but given your clarification I don’t think we have quite that much of a disagreement on this topic.
Yes, for me the problem is moving a mind from a biological substrate to a digital one. It’s hard for me to imagine you’re actually moving the original, not just making a copy. Maybe there’s some way to do it, so I’m not totally confident.
I also imagine it as making a copy, but I’d expect that people who want their minds uploaded would know this and would hold their identity such that they consider the copy (or copies) to be themselves as well. I’m not sure I’d endorse this view of identity,[1] but I don’t really have any issues with people taking it. Does your view on “the original” break with this, or would you then consider the copy similarly to how you would a whole brain emulation? (Or something else?)
I’m not sure I really endorse any view of identity, or even think it’s a coherent concept, but at the very least I think that making a copy of something doesn’t produce something that is that thing.
[1] Or at least, I think it would be very risky to get rid of my biological self based on such a view.