It sounds like I didn’t consider the possibility that Eliezer isn’t trying to be moral—that his concern about AI replacing humans is just self-interested racism, with no need for moral justification beyond the will to power.