The article seems quite incomplete without even mentioning value of information (VoI). Instrumental and epistemic rationality have the same goals when the VoI of learning something is positive, and opposing goals when it is negative. Now, it may be hard to capture the VoI of, say, movie spoilers and truths that are bad for you, but the typical piece of information has positive VoI. In other words, most information merely lets you make better choices, as opposed to worsening your experience in a way you could have predicted in advance.
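To put numbers on this, the standard formalization is expected value of perfect information (EVPI): information has positive value exactly when knowing the state of the world would change which action you pick. A minimal sketch (the umbrella example and all payoffs are my own invented illustration, not from the post):

```python
# Toy expected-value-of-perfect-information (EVPI) calculation.
# All probabilities and payoffs below are made up for illustration.

def expected_value_of_information(p_states, utilities):
    """EVPI = E[value acting with knowledge] - E[value acting blind].

    p_states: probability of each world state
    utilities[a][s]: payoff of action a in state s
    """
    # Deciding without the information: commit to the one action
    # with the best expected payoff across states.
    blind = max(
        sum(p * u for p, u in zip(p_states, row))
        for row in utilities
    )
    # Deciding with perfect information: in each state, take the
    # action that is best for that state.
    informed = sum(
        p * max(row[s] for row in utilities)
        for s, p in enumerate(p_states)
    )
    return informed - blind

# Example: the umbrella decision. States: rain (30%) vs sun (70%).
p = [0.3, 0.7]
# Actions: carry the umbrella, or leave it home.
u = [
    [1, -1],   # carry: fine in rain, mild nuisance in sun
    [-10, 2],  # leave: soaked in rain, unencumbered in sun
]
print(expected_value_of_information(p, u))  # positive: the forecast is worth something
```

A spoiler or a "truth that is bad for you" is the degenerate case where no action changes, so the information's instrumental value is zero or negative once its direct effect on your experience is counted.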
This is basically the entire rationale for going on an information diet. Not all truths are of equal value to you, so if you can deliberately seek out only the high-value ones, you're consistently better off.
And when applying or calculating VoI, allow for the opportunity cost of harvesting information. A truth might have positive VoI in itself, yet negative net VoI if reaching it eats up your time, money, attention, effort, or other resources.
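The decision rule this implies is simple enough to state in a few lines. A sketch (the numbers are invented for illustration):

```python
# Net VoI sketch: a truth can be worth having in itself yet not worth
# pursuing once acquisition costs are counted. All numbers invented.

def net_voi(gross_voi, acquisition_cost):
    """Value of a piece of information after paying for it."""
    return gross_voi - acquisition_cost

def worth_pursuing(gross_voi, acquisition_cost):
    """Pursue a truth only if its net VoI is positive."""
    return net_voi(gross_voi, acquisition_cost) > 0

# A forecast worth 2.0 utility units gross:
print(worth_pursuing(2.0, 0.5))   # cheap to check: True
print(worth_pursuing(2.0, 3.0))   # eats too much time/attention: False
```

The point is that "is this true and useful?" and "is this worth the cost of finding out?" are separate questions, and only the second settles whether to pursue it.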
I agree that VoI and the calculations that allow you to use it effectively are very important. However, this post serves as a basic overview and I think taking the time to explain VoI and how to calculate it wouldn’t fit here.
If you think a post on VoI is necessary as a “sequel” to this one, feel free to write it—I don’t have time with my current queue of things to write—but please link me if and when you do!
I wrote one a while back.
Thanks for the link. I’m not sure that post says exactly what I would try to say about the topic, but it is certainly interesting and useful in its own right.
I think taking the time to explain VoI and how to calculate it wouldn’t fit here.
I disagree. VoI is essentially a formalized way to describe the instrumental value of figuring out how the world is (or is going to be). As such, it's a very good way to relate instrumental rationality to epistemic rationality.