For most people, most of the things they want do in fact prefer some ways of thinking, so your definition requires us to consider a counterfactual pretty far from ordinary experience. In contrast, defining in terms of accuracy-seeking is simple and accessible. If this site is going to use the word “rational” a lot, we’d better have a simple clear definition or we’ll be arguing this definitional stuff endlessly.
I usually define “rationality” as accuracy-seeking whenever decisional considerations do not enter. These days I sometimes also use the phrase “epistemic rationality”.
It would indeed be more complicated if we began conducting the meta-argument that (a) an ideal Bayesian not faced with various vengeful gods inspecting its algorithm should not decide to rewrite its memories to something calibrated away from what it originally believed to be accurate, or that (b) human beings ought to seek accuracy in a life well-lived according to goals that include both explicit truth-seeking and other goals not about truth.
But unless I’m specifically focused on this argument, I usually go so far as to talk as if it resolves in favor of epistemic accuracy, that is, that pragmatic rationality is unified with epistemic rationality rather than implying two different disciplines. If truth is a bad idea, it’s not clear what the reader is doing on Less Wrong, and indeed, the “pragmatic” reader who somehow knows that it’s a good idea to be ignorant will at once flee as far as possible...
You started off using the word “rationality” on this blog/forum, and though I had misgivings, I tried to continue with your language. But most of the discussion of this post seems to be distracted by my having tried to clarify that in the introductory sentence. I predict we won’t be able to get past this, and so from now on I will revert to my usual policy of avoiding overloaded words like “rationality.”
If truth is a bad idea, it’s not clear what the reader is doing on Less Wrong [...]
Believing the truth is usually a good idea—for real organisms.
However, I don’t think rationality should be defined in terms of truth seeking. For one thing, that is not particularly conventional usage. For another, it seems like a rather arbitrary goal. What if a Buddhist claims that rational behaviour typically involves meditating until you reach nirvana? On what grounds would that claim be dismissed? That seems to me to be an equally biologically realistic goal.
I think that convention has it right here—the details of the goal are irrelevant to rationality and should be factored right out of the equation. You can rationally pursue any goal—without any exceptions.
I’m confused by the phrase “most of the things they want do in fact prefer some ways of thinking”.
I thought that EY was saying that he requires goals like “some hot chocolate” or “an interesting book”, rather than goals like: “the answer to this division problem computed by the Newton-Raphson algorithm”