Actually trying to imagine any kind of situation like this, I imagine I am putting my full effort into saving my children. It is very hard to picture a real situation where it is a choice between one or two of my children vs 500 others. If it is some evil dictator of Alpha, or Omega, or a p-zombie, or some other fiction we like to screw with around here, offering me a choice in order to threaten and manipulate me, then fuck them, I’ll take my children and THEY are the murderers of the other children.
Frankly, it is a lot easier to imagine living with myself and Julia and Melissa after 500 nameless faceless strangers are gone than living every day without Julia and Melissa while 500 nameless faceless strangers wander around somewhere in the world doing whatever it is that nameless faceless strangers I don’t care about do. In the grand scheme of things, people are going to die, some sooner, some later. Any good or ill I do is apt to soon be lost in the noise as far as the world is concerned, whereas as far as I am concerned, it might have a rather large effect on me.
So here’s an interesting theory: if I work hard to please myself, and everybody did that, would total utility be increased? I remember studying something like that theory 35 years ago.
I can imagine trying to save my children and somewhat increasing my risk of failure in order to save other children along with my own. I can’t easily quantify this. I would delay leaving with my kids, at the risk of being “too late,” in order to get some other kids out of something dangerous — and a hell of a lot fewer than 500. It is hardly a matter of indifference. It’s more a sense of agency and responsibility: my job is my kids, my relatives, my friends.
I do see morality as an aesthetic decision. It is clearly not derivable. It is pretty clearly based on feelings we have evolved with the help of natural selection. Which is to say our moral sentiments carry no moral weight; they are just sentiments. We can use math to determine features of simple models of how our sentiments work, but when our math or our model flies in the face of our sentiments, there isn’t a reason in the world to put the map before the territory — and the territory is our sentiments.
What about 1 million others? Would I escape the earth with my kids rather than stay and work a plan to save the earth that had a 1⁄6000 chance of succeeding? A 1⁄6000 chance of saving 6 billion people has an expected value of 1 million lives. You’re damn right I would get me and my kids out of there. My moral sentiments have taught me that in matters of life and death, 1⁄6000 = 0. No amount of math based on a model of my sentiments can override my actual sentiments. Do you think it “should”?