In short, I have a view of human nature that’s somewhat more optimistic than yours.
I don’t think leaving humans in charge of the world is obviously a win either. It does look to me like the arc of history is bending toward justice, but it’s happening slowly and in fits and starts. And we could all be dead before we get to a stable just society. This isn’t really an argument for building ASI; I think we probably shouldn’t, or at least not this fast.
But it looks like we’re going to.
The big advantage of building intent-aligned AGI (if we can manage to do that instead of building a misaligned ASI that kills us all) is that it makes being good to people vastly easier, essentially completely free. You just tell your ASI “okay fine, go make the world better for people. Tell me how you’d do it and I’ll choose some options”.
This lowers the bar for how good someone has to be to benefit humanity to just above zero. If they have more inclination to be helpful than harmful, that’s all it takes.
No human who’s ever lived has been in that position. Even the most powerful have had to worry about losing their power, and about themselves and their loved ones dying painfully and fairly soon.
So strangely, I wouldn’t trust Sam Altman with my lunch money, but I would guess he’d probably produce a very good future if he were to wind up god-emperor for eternity. The exposés I’ve seen don’t claim he’s a particularly vengeful person. We’ll just have to celebrate Samday every week :)
There are individuals with what I think of as a negative empathy-sadism balance, but they’re pretty rare. Sociopathic individuals do seem to be overrepresented in the halls of power, but even there I think we’ve got pretty good odds of minimally good people winding up in charge of ASI.
This is not a scenario I’m comfortable with. If a sadistic individual gets control of the future, it could be worse than death, a permanent state of suffering. I’d almost rather see an attempt at value-aligned AGI, but it would take a both very selfless and competent person to launch such a thing successfully.
I’m not sure how to up our odds; getting good people into power is an old challenge, and I don’t know of any new methods for it.
These are serious questions.