Because beings like us (in some relevant respects) that outlive us might carry the torch of our values into that future! Don’t read too much into the word “interesting” here: I just meant “valuable by our lights in some respect, even if only slightly.”
Sure, it sucks if humanity doesn’t live to be happy in the distant future, but if some other AGI civilization is happy in that future, I prefer that to an empty paperclip-maximized universe without any happiness.
That all sounds fair. I’ve seen rationalists claim that it’s better for “interesting” things (in the literal sense) to exist than not, even if nothing sentient is interested in them, which is why I assumed you meant the same.