This isn’t terribly decision-relevant except for deciding what type of alignment work to do. But that does seem nontrivial. My bottom line is: push for a pause or slowdown, but don’t get overoptimistic. Simultaneously work toward alignment on the current path, as fast as possible, because that might well be our only chance.
To your point:
I take your point on the standard reasoning. I agree that most adults would turn down even a 95-5 bet, 19-to-1 odds in favor of improving their lives, against a small chance of destroying the world.
But I’m afraid those with decision-making power would take far worse odds in private, where it matters. That’s because for them, it’s not just a better life; it’s immortality vs. dying soon. And that tends to change decision-making.
I added and marked an edit to make this part of the logic explicit.
Most humans with decision-making power, e.g. in government, are 50+ years old, and mostly older, since power tends to accumulate at least until sharp cognitive decline sets in. There’s a pretty good chance they will die of natural causes if ASI isn’t created to do groundbreaking medical research within their lifetimes.
That’s on top of any actual nationalistic tendencies, or fears of being killed, enslaved, tortured, or, worse, mocked, after losing the race to one’s political enemies covertly pursuing ASI.
And that’s on top of worrying that sociopaths (or similarly cold/selfish decision-makers) are over-represented in the halls of power. Those arguments seem pretty strong to me, too.
How this would unfold is highly unclear to me. I think it’s important to develop gears-level models of how these processes might happen, as Raemon suggests in this post.
My guess is that covert programs are between likely and inevitable. Public pressure will be for caution; private and powerful opinions will be much harder to predict.
As for the public statements, it works just fine to say, or more likely to convince yourself, that alignment is solvable with little chance of failure, and that patriotism and horror over (insert enemy ideology here) controlling the future are your motivations.