Hollerith, you are now officially as weird as a Yudkowskian alien. If I ever write this species I’ll name it after you.
Eliezer, to which of the following possibilities would you accord significant probability mass? (1) Richard Hollerith would change his stated preferences if he knew more and thought faster, for all reasonable meanings of “knew more and thought faster”; (2) there is a reasonable notion of extrapolation under which all normal humans would agree with a goal in the vicinity of Richard Hollerith’s stated goal; (3) there exist relatively normal (non-terribly-mutated) current humans A and B, and reasonable notions of extrapolation X and Y, such that “A’s preferences under extrapolation-notion X” and “B’s preferences under extrapolation-notion Y” differ as radically as your and Richard Hollerith’s preferences appear to diverge.
Phil, your analysis depends heavily on what the probabilities would be without Eliezer.
If Eliezer vanished, what probabilities would you assign to: (A) someone creating a singularity that removes most/all value from this part of the universe; (B) someone creating a positive singularity; (C) something else (e.g., humanity staying around indefinitely without a technological singularity)? Why?