Judge them as badly or as well as you like. I wonder how much they will care?
How often do you suppose that humans find themselves in a situation where they must choose between their own children and some larger number of other people’s children? At one level, hardly ever. At another level, I’m about to buy my 16-year-old daughter a car for a few thousand dollars; presumably that few thousand dollars could save the lives of a few tens of children somewhere else? Presumably every time I have taken my children to Disneyland or on a nice vacation, I have used resources that could have saved many other children?
Judge me if you like. I hope my children will do the same for their children as I have done for them.
What you describe would fit well within not-going-to-judge-you if you were spending that amount of money on yourself, so I’m hardly going to judge you for spending it on your children.
I wonder how much they will care?
This seems irrelevant? Of course if someone has different values to me, then they’re not going to particularly care if they violate my values.
But are you saying that you would save your own child over 500 others? Would you judge someone badly for saving their own child over 500 others? What about a million others?
The point I’m trying to get across is that I feel there’s a limit to how much you “should” value your own child over other children, and your original post ignored this.
Actually trying to imagine any kind of situation like this, I imagine I am putting my full effort into saving my children. It is very hard to picture a real situation where it is a choice between one or two of my children vs. 500 others. If it is some evil dictator, or Alpha or Omega or a p-zombie or some other fiction we like to screw around with here, offering me a choice in order to threaten and manipulate me, then fuck them, I’ll take my children and THEY are the murderers of the other children.
Frankly, it is a lot easier to imagine living with myself and Julia and Melissa after 500 nameless faceless strangers are gone than living every day without Julia and Melissa while 500 nameless faceless strangers wander around somewhere in the world doing whatever it is that nameless faceless strangers that I don’t care about do. In the grand scheme of things, people are going to die, some sooner, some later. Any good or ill I do is apt to soon be lost in the noise as far as the world is concerned, whereas as far as I am concerned, it might have a rather large effect on me.
So here’s an interesting theory: if I work hard to please myself, and everybody did that, would total utility be increased? I know I remember studying something like that theory 35 years ago.
I can imagine trying to save my children and increasing my risk of failure somewhat in order to save other children with my children. I can’t easily quantify this. I would delay leaving with my kids at the risk of being “too late” in order to get some other kids out of something dangerous, and a hell of a lot fewer than 500. It is hardly a matter of indifference. More a sense of agency and responsibility, my job is my kids, my relatives, my friends.
I do see morality as an aesthetic decision. It is clearly not derivable. It is pretty clearly based on feelings we have, which we have evolved with the help of natural selection. Which is to say our moral sentiments carry no moral weight; they are just sentiments. We can use math to determine features of simple models of how our sentiments work, but when our math or our model flies in the face of our sentiments, there isn’t a reason in the world to put the map before the territory, and here the territory is our sentiments.
What about a million others? Would I escape the earth with my kids rather than stay and work a plan to save the earth that had a 1⁄6000 chance of succeeding? With roughly six billion people on earth, a 1⁄6000 chance of succeeding has an expected value of a million lives. You’re damn right I would get me and my kids out of there. My moral sentiments have taught me that in matters of life and death, 1⁄6000 = 0. No amount of math based on a model of my sentiments can override my actual sentiments. Do you think it “should”?
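The arithmetic behind that expected value can be sketched in a few lines of Python. The world-population figure of six billion is an assumption on my part (the original only states the 1⁄6000 probability and the one-million result, and six billion is the population that makes those numbers agree):

```python
# Expected-value sketch for the "stay and try to save the earth" plan.
# Assumed figures: world population of 6 billion (not stated explicitly
# above; chosen so the result matches the quoted 1 million lives).
population = 6_000_000_000
p_success = 1 / 6000

# Expected lives saved = probability of success times lives at stake.
expected_lives_saved = p_success * population
print(f"{expected_lives_saved:,.0f}")  # 1,000,000
```

Of course, the whole point of the paragraph above is that a headline expected value like this need not move someone whose sentiments round 1⁄6000 down to zero.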
The point I’m trying to get across is that I feel there’s a limit to how much you “should” value your own child over other children, and your original post ignored this.
I would not kidnap a strange child and kill her for her kidney in order to save the life of my child. So in that sense, my own moral sentiments do respect significant limits on the lives of others versus my own children.
But given a choice between saving my child and saving a bus going over a cliff, I’d save my child.
At a sort of deep level, I don’t even care if it makes sense. It arises from feelings bred into my brain for tens or hundreds of millions of years before a human neocortex was even a twinkle in the Flying Spaghetti Monster’s eye.