If this is correct, the question is how corrupted they became while acquiring power, and whether they’ll over time become more generous, as the legitimate reasons for selfishness disappear in reality, and perhaps in their emotional makeup as a result.
Sure, maybe. Or maybe they’ll choose to drift farther and farther from human-like motivations themselves, shaped by whatever post-human experiences they choose to indulge in, or by direct self-modification. And even if they do become more compassionate later on, it may already be too late by then.
I think you’re imagining whoever nabs the AGI controls as continuing to live among humanity for a while, with the planet still mostly arranged the way it is now, and gradually learning empathy as they interact with people.
I expect things to get much, much weirder, basically immediately.
It takes little benevolence to help people when doing so is trivially easy for those in charge.
It’s also easy to forget about the existence of people whom you don’t need and who can’t force you to remember them. And just tile over them.
Not maliciously. Not even in a way that chooses to be actively uncaring. Just by… not particularly thinking about them, as you live in your post-human simulation spaces or explore the universe or whatever. And when you do come back to check, you notice the AI has converted all of them to computronium, because at some point you’d value-drifted away from caring about them, and the AI noticed that. And true enough, you’re not sad to discover they’re all gone. Just an “oh well”.
It’s the ethical analogue to Oversight Misses 100% of Thoughts [You] Don’t Think.
Like, you know all the caricatures of out-of-touch rich people? It’ll be that, turned up to eleven.
Agreed, 100%.