I vaguely remember Yudkowsky’s view on sex: that there is a correct way to be sexual, and that the correct way is to be in some kind of traditional, romantic relationship, full of interpersonal emotional connection and intellectual engagement, rather than just porn and cute catgirls/catboys.
I am quite confident Eliezer’s stance on relationships is to do whatever you want, at least from an ethics standpoint. A “traditional, romantic relationship” is really not what I would expect Eliezer to advocate.
If I remember correctly, the argument was (in my own words) that dealing with problems in the style of “relationships between the genders are too difficult, let’s use sexbots instead and everyone will be happy” ultimately leads to wireheading, if you apply it consistently to all your desires, not just the sexual ones.
This is not a general argument against making things easier. If the difficulty is something that kills you, or hurts you irreversibly, that kind of difficulty should be removed. Or humans should be modified to be more resilient, so they can overcome any existing difficulty without permanent damage. In the Nietzschean “what doesn’t kill you makes you stronger,” the first kind of difficulty (the kind that kills you) is bad, but the second kind (the kind that makes you stronger) is good.
The context for the argument was the long game: if we succeed in conquering the universe by inventing a friendly superhuman intelligence, what then? If you could do literally anything, what would be the meaningful thing to do? Eliezer’s proposal was that the superintelligence should make us immortal (and otherwise not irreversibly damageable), but then let us overcome the remaining problems by ourselves, giving us unlimited time and opportunities to become stronger. As opposed to a nanny-bot that would make us weaker by satisfying all our base desires without us having to do anything, which would lead to our skills and intelligence gradually atrophying to sub-human, and even sub-animal, levels. (By the way, the theme of “does this make you weaker or stronger?” is also present in other parts of the Sequences.)