First: I see that your overarching point is to denounce political violence, and thank you for that.
Second: A response to
And the few who feel really personally bothered by that law [against AGI development]?
They may be sad. They’ll definitely be angry. But they’ll survive. They wouldn’t actually survive otherwise.
What if the effect of AGI development were our reform instead of our extinction? What if current social injustice is a necessary consequence of human limitations that AGI could overcome (as found in https://dx.doi.org/10.2139/ssrn.6194078)? Then the people who should feel personally bothered by your proposed law are the victims of social injustice, and your post is claiming that the mere risk of an existential threat justifies perpetuating their victimhood!
You admit that your proposed law would be useless unless enforced across the entire planet, forbidding all 8 billion of us from even exploring a potential path to social justice. Can you empathize with people who do not call living without hope of justice “surviving”? How about making social justice a prerequisite to your proposed law? How about directing your safety efforts not at technical breakthroughs or the enforcement of laws, but at social justice? Don’t stop at denouncing violence; offer an alternative!
In The Day the Earth Stood Still, the challenge was not to outlaw our extinction; it was to show that our species is a successful experiment worth continuing. Do that, and then I would find it more rational to support your efforts to resist change...
What if the effect of AGI development were our reform instead of our extinction?
There is a burden to prove not only that ‘some’ AGI development will be good for humanity (reforming, to use your words), but that no AGI development can possibly lead to extinction. If someone creates a reforming AI today, and then the next day someone creates an evil AI, we will probably still all die.
Check Yudkowsky’s other writings (especially his fiction) for multiple detailed discussions of these topics.
The simplest way to get rid of social injustice is to get rid of society. Most people would consider that an unacceptable cost, no?