I think you mean something different by “ethics” than I do. I’m more of a realist. I’m not much interested in an ethics whose purpose is to describe human behavior. I’d rather use a different term for that.
If you’re trying to take a God’s eye view, in order to design a friendly AI and chart the future course of the universe, then the approach you’re suggesting would be overly anthropocentric.
My realism might be weaker than yours, but I think I was just unclear in part of the OP. Normative ethics isn’t about explaining our intuitions (even though I said that; I misspoke). It is about what we should do. But we have no access to information about what we should do except through our ethical intuitions. There are cases where our intuitions don’t supply answers, and that is why it is a good idea to generalize and formalize our ethical intuitions, so that they can be applied to tough cases.
Let me ask you: since you’re a moral realist, do you believe you have moral knowledge? If so, how did you get it?