Lately, when I’m confronted with extreme thought experiments that are repugnant on both sides, my answer has been “mu”. No, I can’t give a good answer, and I’m skeptical that anyone can.
Balboa Park to West Oakland is our established world. We have been carefully leaning into its edge, slowly crafting extensions of our established moral code, adding bits to it and refactoring old parts to make them consistent with the new stuff.
It’s been a mythic effort. People above our level have spent their 1000-year-long lifetimes mulling over their humble little additions to the gigantic established machine that is our morality.
And this machine has created Mediocristan: a predictable world, with some predictable features, within which there is always a moral choice available. Without these features our moral programming would be completely useless. We can behave morally precisely because the cases in which there is no moral answer don’t happen very often.
So please, stop asking me whether I’d kill myself to save 1000 babies from 1000 years of torture. Both outcomes are repugnant, and the only good answer I have is “get out of Extremistan”.
The real morality is to steer the world towards a place where we don’t need morality. Extend the borders of Mediocristan to cover a wider set of situations. Bolster it internally so that the intelligence required for a moral choice becomes lower—allowing more people to make it.
No morality is world-independent. If you think you have a good answer to morality, you have to provide it with a description of the worlds in which it works, and a way to make sure we stay within those bounds.
Very nicely written.
A good example of this might be the invention of genetic flaw correction, which could make morally controversial abortion a less desired option.