Choosing TORTURE means deciding to condemn someone to fifty years of torture while knowing that 3^^^3 people would not want you to do so, would beg you not to, and would react with horror and revulsion if and when they learned you had done it.
And you must do it for the sake of some global principle or something.
I’d say it puts one at least into the Well-Intentioned Extremist / Knight Templar category, if not outright villain.
If an AI made a choice like that, against the known wishes of practically everyone, I’d say it was rather unfriendly.
ADDED: Detailed