I think it is definitely not a classical utilitarian view, but that doesn’t trouble me. If you are a classical utilitarian, you can always set the value of S super high.
To explain briefly why I don’t care about trying to create more people: I am motivated by empathy. I want people who exist to be doing well, but I don’t care much about maximizing the number of people doing well. Utilitarianism seems to imply we should tile the universe with flourishing humans, but that doesn’t seem valuable to me. I don’t see every wasted sperm cell as some kind of tragedy, a future person who could have existed. I don’t think the empty universe before humanity came along was good or bad. I don’t think in terms of good or bad. Things just are, and I like them or I don’t. I don’t like when people suffer or die. That’s it.
I was most confused about ‘S’, and likely understood it quite differently than intended.
I understood S as roughly “Humanity stays in control after AGI, but slowly (over decades/centuries) becomes fewer and less relevant”. In many of these cases I’d expect something morally valuable to replace humans, so I put S lower than R.
Could it make sense to introduce “population after AGI that you care about” as an explicit term? I think that would be clearer.