If that was a proposed safety measure, I’d probably want the statistics of types of brain damage before using it. As proposed, it’s not actually clear: I feel like I can redefine the risks of brain damage by changing my personal definition of “me.”
As an example, with a very specific definition of the parts of my identity, I could count a new dislike of fast food among the possible brain damages. I'd be different (my approval of cheap and easy food leads me to eat it several times a week, so that would be a large change), but that's just not that bad.
And with a very broad definition of the parts of my identity, the only possible brain damages might be horribly serious ones, such as "Either your relationship with your wife sours horribly, or you go on a killing rampage, or some other vast personality change alienates everyone around you."
And with bizarre irony, I might use the safety gear to HELP me: if I identify myself only with my worst traits, say as nothing but a horribly depressed mess, then the brain damage has a significant chance of making me stop identifying as that.
Now clearly, this doesn't feel like the intended use of the safety gear analogy. But I don't feel like I HAVE a solid current identity anyway: my identity feels far too malleable based on circumstances. So I suppose the best thing for me to do is to read statistics about what kinds of personality changes are expected.
But if for some reason Omega just comes up and says "You're going to be mining on Alpha Centauri now, whether you want to or not, but I've set up safety gear which protects your life at the cost of your identity as per pinyaka's example above, and you don't have time to gather statistics. Do you want to use the safety gear?"
In that case, I'd pick no. If my sense of identity is that flexible, anything that I didn't identify as me would probably have (from my perspective) a near monstrous sense of ethics, and I would consider its release about as bad as my dying.
Of course, that in itself is indicative of something about my personality. Biologically me with advanced ethics that are foreign to me now? Still me. Me with monstrous ethics? Definitely not me.