Most of us probably think that an AI simulating a mind and torturing it is bad. We are not going to draw a hard line between that and what humans can do. Instead, it seems to me that a human simulating a (very simple) mind and torturing it is also bad, just very, very much less bad. The question is: how much less bad? The answer is that we don’t know. It is probably far less bad than actually torturing an insect, but how do we know that? It seems we need some sort of metric that measures how much we care about minds. That metric probably measures not just how complex the minds are, but also how similar they are to humans.
I think the probability is small that, after sufficient reflection, I will consider simulating torture in my mind a huge evil, but maybe it is not negligible.
Again, I don’t actually believe that humans simulating torture is bad, but I can play devil’s advocate and answer your questions.
You don’t have to think of a specific person who exists outside your mind. You just have part of your brain (not necessarily a conscious part) put itself in the shoes of the tortured person. Some part of your brain has to feel the pain of the torture. Maybe this is not possible if you have never gone through the pain of torture yourself. I do not think there is harm in thinking about someone being tortured if you do not simulate the tortured person’s feelings. However, it might be that our brains automatically simulate our approximation of what the tortured person is feeling.
Thanks for the additional information. I’m going to trust that you and everyone else who considers this a remote possibility is just way smarter than me. The whole scenario is so abstract and odd that it makes me dizzy...