I think politics often involves bidding for the compromise you think is feasible, rather than for what you'd ideally want.
What's maybe different in the AI risk case, and others like it, is how you'll be regarded when things go wrong.
Hypothetical scenario:
An AI does something destructive, on the order of 9/11.
The government over-reacts as usual, and cracks down on the AI companies the way the US did on Osama bin Laden, or Israel did on Hamas.
You're like: yeah, we knew that was going to happen.
Government to you: what the fuck? Why didn't you tell us?