I think the ‘alien parasite’ metaphor is a very interesting and potentially productive way to think about (rationalist) human consciousness.
Some related points are made by this blog post on ethics. Peter Singer fails to live up to his ethical theories because the alien parasite that writes Singer's books doesn't have full control over Singer's actions. If it did, Singer would seem inhuman to us.
The blogger makes the point that those who have managed to suppress their 'natural' moral sense in favor of the dictates of an ideological system don't have a good track record. This ties into a lot of LessWrong themes. An alien parasite that is in full control of its host is potentially a very powerful thing—and something we might reasonably be afraid of. With a human you can always count on certain things; with an alien you never can tell. Maybe it just wants to manufacture paperclips...
In violent agreement with your last paragraph. I've long thought that human beings whose utility functions differ significantly from the "natural" norm are often very dangerous to everyone else, especially if they're smart. Sociopathy, subscribing to ideologies, and inventing new ideologies are all examples of this.