Yeah, on further thought I think you’re right. This is pretty pessimistic, then: AI companies will find it easy to align AIs to moneyed interests, and the rest of us will be in a “natives vs. the East India Company” situation. More time to spend on alignment then matters only if some companies actually try to align AIs to something good instead, and I’m not sure any companies will.
This is also my view of the situation, and a big part of the reason why I think solving AI alignment, even though it reduces existential risk a lot, is non-trivially likely to lead to dystopian worlds (by my values) without further political reforms, which I don’t expect to happen.
Yeah, any small group of humans seizing unprecedented control over the entire world seems like a bad gamble to take, even if they start off seeming like decent people.
I’m currently hoping we can figure out some kind of new governance solution for managing decentralized power while achieving adequate safety inspections.
https://www.lesswrong.com/posts/FEcw6JQ8surwxvRfr/human-takeover-might-be-worse-than-ai-takeover?commentId=uSPR9svtuBaSCoJ5P