I agree with Eliezer that an imprecisely chosen value function, if relentlessly optimized, is likely to yield a dull universe.
So: you think a “paperclip maximiser” would be “dull”?
How is that remotely defensible? Do you think a “paperclip maximiser” will master molecular nanotechnology, artificial intelligence, space travel, fusion, planetary disassembly and stellar farming?
If so, how could that possibly be “dull”? If not, what reason do you have for thinking that those technologies would not help with the making of paper clips?
Apparently simple processes can easily produce great complexity. That is one of the lessons of Conway’s Game of Life.
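The point about Conway’s game can be made concrete. A minimal sketch (assuming Python and a set-based representation of live cells on an unbounded grid): two local rules, applied to a five-cell “glider”, produce a structure that propagates across the grid forever.

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life on an unbounded grid."""
    # Count how many live neighbours every nearby cell has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The classic five-cell glider.
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}

after = glider
for _ in range(4):
    after = step(after)

# After 4 generations the same shape reappears, shifted one cell
# diagonally -- it will keep travelling indefinitely.
assert after == {(x + 1, y + 1) for (x, y) in glider}
```

The rules fit in two lines, yet the system supports gliders, glider guns, and (famously) universal computation, which is the sense in which simple optimization processes need not be simple in their products.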