A third option not considered here (though it may be fairly unlikely): superintelligence might not provide a large enough advantage to control much of humanity, due to implications of chaos theory or something similar. It might be able to steer politics fairly well, but some coordination problems could plausibly be beyond any finite intelligence, and hence beyond its control.