The most important s-risk seems to be harm arising from bad acausal dynamics, e.g. an AI in another universe blackmailing our AI into torturing everyone (or our AI failing to trade with other AIs to prevent that blackmail).
The biggest worry we should have, if an AI takes over the world, is that an AI in another universe will blackmail our AI into torturing us? Do you understand how lunatic that sounds? :-)