A motivation might simply be a threat: some truthful powerful being saying “Design an algorithm with goal G. If you succeed, I will give you great goods; if you fail, I will destroy you all. The algorithm will never be used in practice, so there are no moral objections to it being designed.”
Note that this isn’t simply a threat. It is a threat combined with an offer. Depending on the nature of the agents in question this can make rather a large difference.