This seems like a strange reaction. If an alien read this post and believed the claims, wouldn’t they think it quite likely that fascism was very much on the rise? There are global trends, and there are a bunch of specific examples. Do you agree with that?
Maybe you have some reasons why this prima facie evidence isn’t actually strong evidence. What are those reasons?
But the “by induction”/“line go up” argument for AI risk is not the reason one should be worried; one should be worried because of specific causal reasons to expect unaligned ASI to cause extremely bad outcomes.
One should be worried because of a combination of specific causal reasons to expect ASI to be very bad for us, plus various lines (compute, capabilities, research investment, research insights, economic benefit) going way up. If the lines weren’t going up, there’d be no great reason to expect ASI in the next 50 years with significant probability. We know dictatorships are bad because we’ve seen them; and the fascism lines are going up.
I might agree with a more limited claim like “most people in our reference class underestimate the chances of western democracies turning into fascist dictatorships over the next decade”.
I don’t think someone reading this post should have >50% odds on >50% of western democracies turning into fascist dictatorships over the next decade or two, no. I don’t see an argument that “fascist dictatorship” is a stable attractor; as others have pointed out, even countries which started out much closer to that endpoint have mostly not ended up there after a couple of decades despite appearing to move in that direction.