I previously proposed pulping brains before they could be scanned and uploaded, to avoid potential exposure to s-risk, here: https://www.lesswrong.com/posts/BSo7PLHQhLWbobvet/unethical-human-behavior-incentivised-by-existence-of-agi. My core thesis is that if s-risk is real, pulping a single brain could prevent more suffering than any other EA-style intervention.
It was not popular; people seem to strongly prefer the idea of staying alive and to discount entirely the possibility that this could happen to them. I doubt I will live to see an actual answer, but I’m fairly confident that some subset of humans would impose eternal agony on copies of themselves.
In a big enough universe, “you” are already being tortured somewhere, so the goal is to reduce the fraction of your copies being tortured. Counterintuitively, pulping a brain might increase that fraction.