You could very easily build a much happier life for them just by allocating some resources (land, computronium, whatever) and going by their current values.
Well… ok, let's assume a happy life is their single terminal value. Then, by definition of their extrapolated values, you couldn't build a happier life for them by doing anything other than following their extrapolated values!
Too bad. Let’s just agree to disagree then, until the brain scanning technology is sufficiently advanced.
So far, I haven't seen a convincing example of a person who truly wished for everyone to die, even in extrapolation.
To them, yes, but not to their CEV.