A limit to punishment

In my last post I proposed this thought experiment:

Let’s assume that Omega has an evil twin: Psi.

Psi really likes to play with human minds, and today it will force some people to make difficult choices.

Alice is a very nice person: she volunteers, donates money to important causes, is vegetarian, and is always kind to the people she interacts with.

Bob on the other hand is a murderer and torturer: he has kidnapped lots of people and then tortured them for many days in horrible ways that are worse than your deepest fears.

Psi offers two options:

A) Alice will be tortured for 1 day and then killed, while Bob will be instantly killed in a painless way.

B) Bob will be tortured for 3 days and then killed, while Alice will be instantly killed in a painless way.

If you refuse to answer, everyone will be tortured for a year and then killed.

No matter what you choose, you too will be killed and no one will ever know what has happened.

1. What would you answer?

2. What would you answer if the 3 days in option B) were substituted by 3 years?

3. And if they were substituted by 33 years?

Even though the most compassionate answer is clearly A, most people I know (me included) prefer B, at least in the first case.

This is not surprising: anger can motivate humans to oppose perceived injustices committed against ourselves or other members of our community, and so it helps preserve collaborative behavior. In other words, many people usually adopt tit-for-tat strategies.
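For readers unfamiliar with the term, tit-for-tat is a concrete strategy from the iterated prisoner's dilemma: cooperate on the first round, then mirror whatever the opponent did last. A minimal sketch (my own illustration, not part of the original post):

```python
def tit_for_tat(opponent_history):
    """Return 'C' (cooperate) or 'D' (defect), given the opponent's past moves."""
    if not opponent_history:
        return "C"  # open by cooperating
    return opponent_history[-1]  # afterwards, copy the opponent's last move

# Example usage: the strategy punishes a defection exactly once,
# then returns to cooperation as soon as the opponent does.
print(tit_for_tat([]))          # first move: "C"
print(tit_for_tat(["C", "D"]))  # opponent just defected: "D"
print(tit_for_tat(["D", "C"]))  # opponent returned to cooperation: "C"
```

The analogy to the post's argument is that punishing Bob plays the role of the mirrored defection: it is retaliation calibrated to the other party's behavior, not unconditional cruelty.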

Moreover, I think that we aren’t able to truly empathize with psychopaths: even a person with hypothetical perfect empathy, able to feel all of Bob’s emotions, would ultimately condemn him, because she would have to feel not only his pain and fear but also his sadism and lack of empathy, which would create conflicting emotions.

On the other hand, this person would perceive injustices against others as her own and would feel empathic anger toward Bob. Adopting this kind of empathy-based ethic would probably result in a sort of tit-for-tat strategy based on intentions and attitudes rather than past actions.

Another reason to choose B, which could work even with a Golden-Rule-based morality, is signaling: by choosing B here, in this moment, you signal that you care about merit; or rather, you signal that you have certain emotions in response to certain behaviors and that these emotions will guide your actions, which can motivate others to behave in a certain way. When choosing B, people deliver a double message:

1) If you hurt other people, then you will be hurt by me.

2) If you protect others, then you will be protected by me.

The first part of the message is not very useful, since psychopaths have a lower fear of punishment due to increased boldness, but the second part could be effective: it signals to people of good will that we want to protect them, specifically because of their behavior, and this could further motivate them to continue doing good deeds.

Moreover, I think that many people dislike a lack of recognition of merit, and a society that doesn’t motivate people to behave well will be more unstable and will cause more pain in the long run.

However, there is something that should be taken into account before choosing B: a society based on raw reciprocity would inflict pain on people who have been brainwashed into committing crimes, or who have been traumatized until they developed antisocial behavior, or, more realistically, who have suffered certain kinds of head injury; and however small that possibility is, it could happen to any of us.

So I would like to ask: if you could program a society and live in it, would you set a maximum amount of torture in option B, beyond which people should choose A? If so, where, and why?