So, to be clear, it is permitted for you to invoke “all-powerful” capability in the AGI, if that particular all-powerful capability allows you to make an outrageous assertion that wins the argument....
Well, on some level, of course. We’re not trying to design something that will be weak and stupid, you know. There’s no point in an FAI if you only apply it to tasks a human and a brute computer could handle alone. We damn well intend that it become significantly more powerful than we can contain, because that is how powerful it has to be to fix the problems we intend it to fix and yield the benefits we intend it to yield!