There are no alien oughts, though there are alien desires and alien would-wants. They don’t see morality differently from us; the criterion by which they choose is simply not that which we name morality.
There’s a human morality in about the same sense as there’s a human height.
This is a wonderful epigram, though it might be too optimistic. The far more pessimistic version would be “There’s a human morality in about the same sense as there’s a human language.” (This is what Greene seems to believe and it’s a dispute of fact.)
Eliezer, I think your proposed semantics of “ought” is confusing, and doesn’t match up very well with ordinary usage. May I suggest the following alternative?
Ought_X refers to X’s would-wants if X is an individual. If X is a group, then ought_X is the overlap between the oughts of its members.
In ordinary conversation, when people use “ought” without an explicit subscript or possessive, the implicit X is the speaker plus the intended audience (not humanity as a whole).
ETA: The reason we use “ought” is to convince the audience to do or not do something, right? Why would we want to refer to ought_humanity, when ought_{speaker+audience} would work just fine for that purpose, and ought_{speaker+audience} covers a lot more ground than ought_humanity?
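One consequence of this intersection semantics is that a smaller group’s ought is always a superset of the ought of any larger group containing it, which is presumably why ought_{speaker+audience} “covers a lot more ground” than ought_humanity. A minimal sketch, assuming we model each agent’s would-wants as a plain set of endorsed norms (all agent names and entries below are invented for illustration):

```python
# Toy model of the proposed subscripted-ought semantics: an individual's
# ought is their would-wants; a group's ought is the overlap of its
# members' oughts. All names and "would-want" entries are hypothetical.

def ought(would_wants: dict[str, set[str]], group: list[str]) -> set[str]:
    """The shared ought of `group`: the intersection of members' would-wants."""
    return set.intersection(*(would_wants[member] for member in group))

would_wants = {
    "speaker":  {"keep promises", "help kin", "don't eat babies"},
    "audience": {"keep promises", "help kin"},
    "humanity": {"keep promises"},  # stand-in for everyone else's overlap
}

# ought_{speaker+audience} covers more ground than ought_humanity:
print(ought(would_wants, ["speaker", "audience"]))
# e.g. {'keep promises', 'help kin'}  (set print order may vary)
print(ought(would_wants, ["speaker", "audience", "humanity"]))
# e.g. {'keep promises'}
```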
“There’s a human morality in about the same sense as there’s a human language.” (This is what Greene seems to believe and it’s a dispute of fact.)
That seems to hit close to the mark. Human language contains all sorts of features that are more or less universal to humans due to their hardware, while also being significantly determined by cultural influences. It also shares the feature that certain types of language (and ‘ought’ systems) are more useful in some cultures or subcultures than in others.
This is a wonderful epigram, though it might be too optimistic. The far more pessimistic version would be
I’m not sure I follow this. Neither seems particularly pessimistic to me, and I’m not sure how one could be worse than the other.
Jumping recklessly in at the middle: even granting your premises regarding the scope of ‘ought’, it is not wholly clear that an alien “ought” is impossible. As timtyler pointed out, the Babyeaters in “Three Worlds Collide” probably had a would-want structure within the “ought” cluster in thingspace, and systems of behavior resembling human morality have been observed in some nonhuman animals.
I’m not saying it’s likely, though, so this probably constitutes nitpicking.
“There are no alien oughts” and “They don’t see morality differently from us”—these seem like more bizarre-sounding views on the subject of morality—and it seems especially curious to hear them from the author of the “Baby-Eating Aliens” story.
Look, it’s not very complicated: When you see Eliezer write “morality” or “oughts”, read it as “human morality” and “human oughts”.
It isn’t that simple either. Human morality contains a significant component of trying to coerce other humans into doing things that benefit you. Even on a genetic level, humans come with significantly different ways of processing moral thoughts: what is often called ‘personality’, particularly in the context of ‘personality type’.
The translation I find useful is to read it as “Eliezer-would-want”. By the definitions Eliezer has given us, the two must be identical. (Except, perhaps, if Eliezer has for some reason decided to make himself immoral a priori.)
Um, that’s what I just said: “presumably you are talking about ought_human”.
We were then talking about the meaning of ought_alien.
There’s also the issue of whether to discuss ought_{past humans} or ought_{present humans}, which are evidently quite different, due to the shifting moral zeitgeist.
Well then, I don’t understand why you would find statements like “There are no alien [human oughts]” and “They don’t see [human morality] differently from us” bizarre-sounding.
Having established EY meant ought_human, I was asking about ought_alien.
Maybe you are right, and EY misinterpreted me, and genuinely thought I was asking about ought_human.
If so, that seems like a rather ridiculous question for me to be asking—and I’m surprised it made it through his sanity checker.
Even if “morality” means “criterion for choosing…”? Their criterion might have a different referent, but that does not imply a different sense: cf. “this planet”, which picks out a different referent from each planet it is uttered on without changing its sense. Of the two, sense has more to do with meaning, since it doesn’t change with changes of place and time.