I’ve had existential crises thinking about such things. Stuff like living forever or having my brain upgraded beyond recognition scares me, for reasons I can’t quite put into words.
I’m comforted by the argument that it won’t happen overnight. We will probably transition into such a world gradually, so it won’t feel so weird and shocking. And if we get it right, the AI will ask us what we want and present arguments for and against our options, so we can decide what we actually want, rather than getting stuck in a shitty future we wouldn’t choose.