The goal of instrumental rationality mostly speaks for itself. Some commenters have wondered, on the other hand, why rationalists care about truth. Which invites a few different answers, depending on who you ask; and these different answers have differing characters, which can shape the search for truth in different ways.
You might hold the view that pursuing truth is inherently noble, important, and worthwhile. In which case your priorities will be determined by your ideals about which truths are most important, or about when truthseeking is most virtuous.
This motivation tends to have a moral character to it. If you think it your duty to look behind the curtain, you are a lot more likely to believe that someone else should look behind the curtain too, or castigate them if they deliberately close their eyes.
I tend to be suspicious of morality as a motivation for rationality, not because I reject the moral ideal, but because it invites certain kinds of trouble. It is too easy to acquire, as learned moral duties, modes of thinking that are dreadful missteps in the dance.
Consider Spock, the naive archetype of rationality. Spock’s affect is always set to “calm,” even when wildly inappropriate. He often gives many significant digits for probabilities that are grossly uncalibrated.1 Yet this popular image is how many people conceive of the duty to be “rational”—small wonder that they do not embrace it wholeheartedly.
To make rationality into a moral duty is to give it all the dreadful degrees of freedom of an arbitrary tribal custom. People arrive at the wrong answer, and then indignantly protest that they acted with propriety, rather than learning from their mistake.
What other motives are there?
Well, you might want to accomplish some specific real-world goal, like building an airplane, and therefore you need to know some specific truth about aerodynamics. Or more mundanely, you want chocolate milk, and therefore you want to know whether the local grocery has chocolate milk, so you can choose whether to walk there or somewhere else.
If this is the reason you want truth, then the priority you assign to your questions will reflect the expected utility of their information—how much the possible answers influence your choices, how much your choices matter, and how much you expect to find an answer that changes your choice from its default.
To seek truth merely for its instrumental value may seem impure—should we not desire the truth for its own sake?—but such investigations are extremely important because they create an outside criterion of verification: if your airplane drops out of the sky, or if you get to the store and find no chocolate milk, it’s a hint that you did something wrong. You get feedback on which modes of thinking work, and which don’t.
Another possibility: you might care about what’s true because, damn it, you’re curious.
As a reason to seek truth, curiosity has a special and admirable purity. If your motive is curiosity, you will assign priority to questions according to how the questions, themselves, tickle your aesthetic sense. A trickier challenge, with a greater probability of failure, may be worth more effort than a simpler one, just because it’s more fun.
Curiosity and morality can both attach an intrinsic value to truth. Yet being curious about what’s behind the curtain is a very different state of mind from believing that you have a moral duty to look there. If you’re curious, your priorities will be determined by which truths you find most intriguing, not most important or most useful.
Although pure curiosity is a wonderful thing, it may not linger too long on verifying its answers, once the attractive mystery is gone. Curiosity, as a human emotion, has been around since long before the ancient Greeks. But what set humanity firmly on the path of Science was noticing that certain modes of thinking uncovered beliefs that let us manipulate the world—truth as an instrument. As far as sheer curiosity goes, spinning campfire tales of gods and heroes satisfied that desire just as well, and no one realized that anything was wrong with that.
At the same time, if we’re going to improve our skills of rationality and go beyond the standards of performance set by hunter-gatherers, we’ll need deliberate beliefs about how to think—things that look like norms of rationalist “propriety.” When we write new mental programs for ourselves, they start out as explicit injunctions, and are only slowly (if ever) trained into the neural circuitry that underlies our core motivations and habits.
Curiosity, pragmatism, and quasi-moral injunctions are all key to the rationalist project. Yet if you were to ask me which of these is most foundational, I would say: “curiosity.” I have my principles, and I have my plans, which may well tell me to look behind the curtain. But then, I also just really want to know. What will I see? The world has handed me a puzzle, and a solution feels tantalizingly close.
1 E.g., “Captain, if you steer the Enterprise directly into that black hole, our probability of surviving is only 2.234%.” Yet nine times out of ten the Enterprise is not destroyed. What kind of tragic fool gives four significant digits for a figure that is off by two orders of magnitude?