I might change my mind in the future, but right now my answers are “to a large extent” and “pretty important”.
Why do you care what my values are, though, or why they are what they are? I find it fascinating that “value-seeking” is a common behavior among rationalist-wannabes (and I’m as guilty of it as anyone). It’s almost as if the most precious resource in this universe isn’t negentropy, but values.
ETA: I see you just answered this in your reply to Adelene Dawner:
I wanted to further qualify how Wei Dai values knowledge, because I don’t see how nudging the far future one way or another is going to increase Wei Dai’s total knowledge.
I expect that I can survive indefinitely in some futures. Does that answer your question?
It’s almost as if the most precious resource in this universe isn’t negentropy, but values.
That’s an amusing observation, with some amount of truth to it.
The reason why I was asking is because I was seeking to understand why you care and I don’t.
Given your reply, I think our difference in caring can be explained by the fact that when I imagine the far future, I don’t imagine myself there. I’m also less attached to my identity; I wouldn’t mind experiencing the optimization of the universe from the point of view of an alien mind, with different values. (This last bit is relevant if you want the future to be good just for the sake of it being good, even if you’re not there.)