I definitely think the “benevolent godlike singleton” is just as likely to fail in horrifying ways as any other scenario. Once you permanently give away all your power, how do you guarantee any bargain?
This is why you won’t build a benevolent godlike singleton until you have vastly more knowledge than we currently have (i.e., via augmenting human intelligence, etc.)[1]
I’m not sure I buy the current orientation Eliezer/Nate have toward augmented human intelligence in the context of a global shutdown, but it does seem like a thing you want before building anything likely to escalate to full overwhelming-in-the-limit superintelligence.