I think the “work on FAI theory” suggestion made in a comment to the previous post was a good one; not because it would yield an FAI design when the answer was passed back through the chronophone, but because the output would get Archimedes working on the most important problem visible to him.
Alternatively, if we think in hindsight that Archimedes simply doesn’t have the necessary resources to trigger a major catastrophe, and we want him to focus on doing good instead of not doing bad, that could be modified to “build a seed AI, any seed AI”.
Since I’m not currently working on either, I probably shouldn’t be the one giving that advice, though.