Enhancing yourself is great. I would gladly plug extra memory and better indexing algorithms into my brain. Throw in rocket boosters and indestructible bones and I'll empty my wallet. This I get and support. But what I have never understood is when people talk about uploading or transferring their consciousness. I wouldn't mind creating copies of myself, virtual or otherwise, but it wouldn't be *me* me. For some reason I have a very strong fear of continuity errors. Maybe you could fool me by replacing my fleshy brain part by part with mechanical hardware and then slowly outsourcing different functionalities to a cloud-based solution until I no longer have any physical presence. But I fear this will just lead to a day when I have a sudden realisation that I am not actually *me* me, and the ensuing existential crisis will lead to unexpected outcomes.
This fear of continuity breaks is also why I would probably steer clear of any teleporters and the like in the future.
In case you haven’t read it: https://existentialcomics.com/comic/1
But overall I agree, and this "feeling" is part of the reason why I'm a fan of the "insert slightly invasive mechanical components + outsource to external devices" strategy. I believe it's the most practical approach, since it seems roughly doable with non-singularity levels of technology, and it's also the one where continuity errors can't easily happen.