I do say that. I care (in terms of how I actually act) about people I see, people I like, people in my extended networks, and all living people. For example, if someone had a heart attack, I would help them even if rationally, the time I spent could be converted into far more lives through optimal giving.
Sure, but my point is you probably wouldn’t use this example of “caring” as a justification in abstract philosophical debates about, e.g., the ethics of cryonics, because visual-field-dependent morality is absurd enough to make it intuitively reasonable that values you truly care about should hold up to some sort of reflection.
It’s important not to be too loose with the idea of “care in terms of how I actually act”, or you’ll end up saying you care about being near large masses or making hiccup noises. You can plausibly argue that falling and hiccups aren’t behavior in the way that helping someone with a heart attack is, but it’s not like there’s a bright dividing line.
You know the “extended mind” hypothesis that says things like calculators or search engines can in some circumstances be seen as parts of your mind? It seems like the flip side of that is an “abridged mind” hypothesis where some parts of your brain are like alien mind control lasers, except located in your skull.
Well, yes. I have a reflectively endorsed belief that being an altruist is good and proper. If I were to endorse selfishness, I would include exceptions for those categories, in increasing order of effect on my decisions.