We must commit to improving morality and society along with science, technology, and industry.
How would you translate this into practice? For example, one way to commit to this would be to create persistent governance structures that can ensure it over time. To be more concrete, suppose it's a high-level department within a world government with the power to pause or roll back material progress from time to time, either to let moral progress catch up or to avoid imminent disaster.
A less drastic idea is AI regulation stating that nobody is allowed to deploy AIs that are better at accelerating material progress than moral/social progress.
Or see “the long reflection” for a more drastic idea.
Which of these would you support, or what do you have in mind yourself?