It seems safer, but I’m not sure about “much safer”. You now have an extremely powerful AI that takes human commands, lots of people and governments would want to get their hands on it, and geopolitics is highly destabilized due to your unilateral actions. What are your next steps to ensure continued safety?
Anything that “decisively settles a win or loss, or drastically changes the probability of win or loss, or changes the future conditions under which a win or loss is determined” qualifies as a pivotal event. If you’re arguing that this specific example doesn’t change the probability of winning enough (and you do bring up good points!), then this example might not qualify as a pivotal event.
I think the examples in that Arbital post are actually intended to be realistic examples (i.e., something that MIRI or at least Eliezer would consider doing if they managed to build a safe and powerful task AGI). If you have reason to think otherwise, please explain.
My initial objection: Considering the upload pivotal event, how likely is it that the first pivotal event is uploading alignment researchers? Multiply that by the probability that alignment researchers have access to the first task AGI capable of uploading. (I’m equating “realistic” with “likely”)
Though by this logic, the most realistic/likely pivotal event is whichever one requires the least absolute and relative advantage, and all other pivotal events are "unrealistic". For example, uploading alignment researchers and shutting down hostile AGI each require a certain level of capability and relative advantage (the uploading example assumes you're the first to gain uploading capabilities), but those two are probably not the pivotal events requiring the smallest capability advantage.
So my definition of "realistic pivotal event" might not be useful, since the only events that could qualify are something like the top 100 pivotal events (ranked by least capability advantage required), and coming up with even one of those may well require an AGI in the first place.