I agree Alicorn’s phrasing is better. My own position would literally be: “I want to act so as to maximize the degree to which the world is saved”. In practice this is more likely to mean “helping other people to save the world”, but that’s a strategy, not a goal.
I’m indifferent to personal glory and the like.
I want to maximize something rather like a utility function, so I want my degree of ambition to naturally scale with the opportunities available. If I only have the opportunity to do a very little good, I want to do a very little good. If I have the opportunity to do a lot (even very indirectly), I want to do a lot.
From my point of view, I’m always at the site of the action (or at least, at the site of my own decisions, which is all I can directly control).
Finally, I don’t think I’m a consequentialist. What I’m describing is my volition, not my ethical system. I haven’t quite settled on my metaethics; I need to do some more thinking on that, and maybe wait for more of lukeprog’s sequence.