Like Alicorn, I expect to have wanted to care, so I maximize what I think my (human equivalent of a) utility function is ultimately going to be when I’m smarter and wiser and awesomer.