Let us distinguish “truth-seekers”, people who respect and want truth, from “rationalists”, people who personally know how to believe truth. We can build better institutions that produce truth if only we have enough support from truth-seekers; we don’t actually need many rationalists. And having rationalists without good institutions may not produce much more shared, accessible truth.
I’m not sure I can let you make that distinction without some more justification.
Most people think they’re truth-seekers and honestly claim to be truth-seekers. But the very existence of biases shows that thinking you’re a truth-seeker doesn’t make it so. Ask a hundred doctors, and they’ll all (without consciously lying!) say they’re looking for the truth about what really will help or hurt their patients. But give them your spiel about the flaws in the health system, and in the course of what they consider seeking the truth, they’ll dismiss your objections in a way you consider unfair. Build an institution that confirms your results, and they’ll dismiss the institution as biased or flawed or “silly”. These doctors are not liars or enemies of truth or anything. They’re normal people whose search for the truth is being hijacked in ways they can’t control.
The solution: turn them into rationalists. They don’t have to be black belt rationalists who can derive Bayes’ Theorem in their sleep, but they have to be rationalist enough that their natural good intentions towards truth-seeking correspond to actual truth-seeking and allow you to build your institutions without interference.
“The solution: turn them into rationalists.”
You don’t say how to accomplish this. Would it require (or at least benefit greatly from) institutional change?
I had in mind that you might convince someone abstractly to support, e.g., prediction markets because they promote truth, and that they would then accept the results of such markets even when those results disagreed with their intuitions. They don’t have to know how to bet well in such markets to accept that they are a better truth-seeking institution. But yes, being a truth-seeker can be very different from believing that you are one.
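To make the “accept the results without knowing how to bet” point concrete, here is a minimal sketch of how one prediction-market mechanism turns individual bets into a single shared probability. It assumes a logarithmic market scoring rule, which is one standard design but is not specified anywhere in this thread; the class name and parameters are illustrative only.

```python
import math

# Minimal logarithmic market scoring rule (LMSR) market -- a sketch only.
# "b" is the liquidity parameter: larger b means prices move less per trade.
class LMSRMarket:
    def __init__(self, outcomes, b=100.0):
        self.b = b
        self.shares = {o: 0.0 for o in outcomes}  # net shares sold per outcome

    def _cost(self, shares):
        # Cost function C(q) = b * ln(sum_i exp(q_i / b))
        return self.b * math.log(sum(math.exp(q / self.b) for q in shares.values()))

    def price(self, outcome):
        # Instantaneous price of an outcome = the market's current probability.
        denom = sum(math.exp(q / self.b) for q in self.shares.values())
        return math.exp(self.shares[outcome] / self.b) / denom

    def buy(self, outcome, amount):
        # A trader pays C(q_after) - C(q_before) to buy `amount` shares.
        before = self._cost(self.shares)
        self.shares[outcome] += amount
        return self._cost(self.shares) - before

market = LMSRMarket(["yes", "no"])
paid = market.buy("yes", 50)  # a trader who believes "yes" backs that belief
print(f"paid {paid:.2f}; P(yes) is now {market.price('yes'):.3f}")  # ~0.622
```

The point for the argument above is only that the market price is a shared, checkable number: a truth-seeker can commit in advance to deferring to it without needing any personal skill at trading.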
Btw, I only just discovered the “inbox” that lets me find responses to my comments.
This sounds like you’re postulating people who have good taste in rationalist institutions without having good taste in rationality. Or you’re postulating that it’s easy to push on the former quantity without pushing on the latter. How likely is this really? Why wouldn’t any such effort be easily hijacked by institutions that look good to non-rationalists?
Eliezer, to the extent that any epistemic progress has been made at all, was it not ever thus?
To give one example: the scientific method is an incredibly powerful tool for generating knowledge, and has been very widely accepted as such for the past two centuries.
But even a cursory reading of the history of science reveals that scientists themselves, despite having great taste in rationalist institutions, often had terrible taste in personal rationality. They were frequently petty, biased, determined to believe their own theories regardless of evidence, defamatory and aggressive towards rival theorists, etc.
Ultimately, their taste in rational institutions coexisted with a frequent lack of taste in personal rationality (certainly, a lack of Eliezer-level taste in personal rationality). It would have been better, no doubt, if they had had both tastes. But they didn’t, and in the end, it wasn’t necessary that they did.
I would also make some other points:
1. People tend to have stronger emotive attachments—and hence stronger biases—in relation to concrete issues (e.g. “is the theory I believe correct?”) than to epistemic institutions (e.g. “should we do an experiment to confirm the theory?”). One reason is that such object-level issues tend to be more politicised. Another is that they appear to have a more direct, concrete impact on individual lives (N.B. the actual impact of epistemic institutions is probably much greater, but for triggering our biases it is the appearance of direct impact that matters; cf. thought experiments about sacrificing a single identifiable child to save faceless millions).
2. Even very object-level biased people can be convinced to follow the same institutional epistemic framework. After all, if they are convinced that the framework is a truth-productive one, they will believe it will ultimately vindicate their theory. I think this is a key reason why competing ideologies agree to free speech, why competing scientists agree to the scientific method, why (by analogy) competing companies agree to free trade, etc.
[The question of what happens when one person’s theory begins to lose out under the framework is a different one, but by that stage, if enough people are following the epistemic framework, opting out may be socially impossible (e.g. if a famous scientist said “my theory has been falsified by experiment, so I am abandoning the scientific method!”, they would be a laughing stock)]
3. I really worry that “everyone on Earth is irrational, apart from me and my mates” is an incredibly gratifying and tempting position to hold. The romance of the lone point of light in an ocean of darkness! The drama of leading the fight to begin civilisation itself! The thrill of the hordes of Dark Side Epistemologists, surrounding the besieged outpost of reason! Who would not be tempted? I certainly am. But that is why I am suspicious.