Will Pearson: When you were figuring out how powerful AIs made from silicon were likely to be, did you have a goal that you wanted? Do you want AI to be powerful so it can stop death?
Eliezer: …”Yes” on both counts....
I think you sidestepped the point as it related to your post. Are you rationally taking into account the biasing effect your heartfelt hopes exert on the set of hypotheses raised to your conscious attention as you conspire to save the world?