This highlights all the difficulties in even making sense of a notion of rationality. (I believe ‘believing the truth’ is well defined, but that there is no relation one can define between OBSERVATIONS and BELIEFS that corresponds to our intuitive notion of rationality.)
In particular, your definition of rationality seems to be about satisfying the most goals, or some other act-based notion (not merely the attempt to believe the most truths). However, this raises several natural questions. First, if you would change your goals given sufficient time and sufficiently clear thinking, does it still count as rational, in your sense, to achieve your current goals? If so, you end up with the strange result that you probably SHOULDN’T spend much time thinking about your goals or otherwise trying to improve your rationality. After all, it is reasonably likely that your goals would change under sufficient consideration, and if you do end up changing them, you almost certainly won’t achieve the original goals (which are the ones it is rational to achieve), while contemplation and attempts to think more clearly probably don’t offer enough practical benefit to make you, on net, more likely to achieve your original goals. This result seems ridiculous and in deep conflict with the idea of rationality.
Alternatively, suppose the mere fact that you would change your goals under enough clear-eyed reflection means that the rational action is the one most likely to achieve the goals you would adopt with enough reflection, rather than the goals you hold without it. This too leads to absurdities.
Suppose (as I was until recently) I’m a mathematician, and I’m committed to solving some rather minor but interesting problem in my field. I don’t consciously realise that I adopted that goal because it is the most impressive thing in my field that I haven’t rejected as infeasible, but that correctly describes my actual dispositions; i.e., if I discover that some far more impressive result is something I can prove, then I will switch over to wanting to do that. Now, almost certainly there is at least one open problem in my field that is considered quite hard but actually has some short, clever proof. But since I currently don’t know which one it is, every problem considered quite hard is one I am inclined to dismiss as impractical.
However, rationality on this view is defined as those acts which increase the likelihood that I will achieve the goal I would have had IF I had spent arbitrarily long clear-headedly contemplating my goals, and given enough time I could consider every short proof. It follows that I ACT RATIONALLY WHENEVER MY ACTIONS MAKE ME MORE LIKELY TO SOLVE THE HARD MATH PROBLEM IN MY FIELD THAT HAPPENS TO HAVE AN OVERLOOKED SHORT PROOF, EVEN THOUGH I CURRENTLY HAVE NO REASON TO PURSUE THAT PROBLEM. In other words, I end up being rational just when I do something that is intuitively deeply irrational: for no discernible reason, I happen to ignore all the evidence that the problem is hard and happen to switch to working on it.
This isn’t merely an issue for act rationality, as discussed here, but also for belief rationality. Intuitively, belief rationality is something that should help me believe true things. Now ask which is more rational: to believe, as all the current evidence suggests, that the apparently hard but actually fairly easy math problem is hard, or to believe that it is easy. If rationality is really about coming to more true beliefs, then it is ALWAYS MORE RATIONAL TO RANDOMLY BELIEVE THAT AN EASY MATH PROBLEM IS EASY (OR THAT A PROVABLE MATHEMATICAL STATEMENT IS TRUE) THAN TO BELIEVE WHATEVER THE EVIDENCE SAYS ABOUT THE PROBLEM. Yet this is in deep conflict with our intuition that rationality should be about behaving in some principled way with respect to the evidence, not making blind leaps of faith.
Ultimately, the problem comes down to the lack of any principled notion of what counts as a rule for decision making. There is no principled way to distinguish the rule ‘Believe what the experts in the field and the other evidence tell you about the truth of unresolved mathematical statements’ from the rule ‘Believe what the experts in the field and the other evidence tell you about the truth of unresolved mathematical statements, except for statement p, which you should believe with complete confidence.’ Since the second rule, for a true statement p, always yields more true beliefs than the first, it should be more rational to accept it.
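To make the counting argument concrete, here is a toy sketch (my own illustration, not from the original comment; the statement set, the evidence verdicts, and the scoring are all invented):

```python
# Toy comparison of the two belief rules. "p" stands for the overlooked
# but provable statement: the evidence says it is false/hard, but it is true.

# statement -> (what the experts/evidence say, what is actually true)
statements = {
    "q1": (True, True),    # evidence is right
    "q2": (False, False),  # evidence is right
    "p":  (False, True),   # evidence is wrong about p
}

def rule_one(stmt):
    """Believe whatever the experts and the other evidence say."""
    return statements[stmt][0]

def rule_two(stmt):
    """Same as rule one, except believe p with complete confidence."""
    return True if stmt == "p" else rule_one(stmt)

def score(rule):
    """Count how many statements the rule gets right."""
    return sum(rule(s) == truth for s, (_, truth) in statements.items())

print(score(rule_one))  # 2
print(score(rule_two))  # 3
```

Running it prints 2 and then 3: the gerrymandered second rule scores strictly better, which is exactly the counterintuitive result described above.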
This result is clearly incompatible with our intuitive notion of rationality, so we are forced to admit that the notion itself is flawed.
Note that you can’t avoid this problem by insisting that we have reasons for adopting one belief over another, or anything like that. After all, consider someone whose basic belief-formation mechanism didn’t cause them to accept A whenever A & B was asserted. They are worse off than us in the same way that we were worse off than the person who randomly accepted ZFC → FLT (Fermat’s Last Theorem is true if set theory is true) before Wiles provided any proof. In each case there is a brute mathematical fact that one reasoner is simply disposed to infer without further evidence, and that disposition always serves to help them reach true beliefs.
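As an aside, the ‘accept A from A & B’ disposition is exactly the kind of brute logical fact a proof assistant treats as primitive. A minimal Lean rendering (my own illustration, not part of the original comment):

```lean
-- Conjunction elimination as a brute logical fact: a reasoner with the
-- "normal" disposition extracts A from a proof of A ∧ B, no further evidence needed.
example (A B : Prop) (h : A ∧ B) : A :=
  h.left
```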
Are you aware of the distinction between epistemic rationality and instrumental rationality? Although “seeking truth” and “achieving goals” can be put at odds, that’s no excuse to throw them both out the window.
I don’t think the article is saying you should abandon terminal goals and truth-seeking altogether. It sounds to me like it’s saying that while in the vast majority of situations it is better to seek truth and not change terminal goals, there are particular circumstances where changing them is the right thing to do. For instance: suppose you accidentally saw more than 50% of the exam answers of a friend who got 100% on a short but important final exam, and you did not have the option of taking a different exam, delaying it, or ever retaking it. Would you intentionally fail the exam? Or would you compartmentalize your knowledge of the answers so that you cannot access it during the exam, and take the exam the way you would have if you hadn’t seen your friend’s answers? In this scenario you would probably have to either change the terminal goal the exam is instrumental to, or intentionally hide your knowledge of the answers from yourself and avoid seeking the truth about them in your own mind, at least until the exam is over.
Also, I’m really scared of using these techniques because I have been conditioned not to trust myself at all if I lie to myself. Does it count as compartmentalization to ignore everything I just read here and pretend to myself that I should definitely never lie to myself intentionally, at least until I feel ready to do so without losing a large portion of my sanity and intellectual autonomy? I’m pretty sure already that the answer is yes.
However, I’m kind of new at actively thinking about, and asking my own questions about, my own thoughts and beliefs. I don’t feel I have observed enough examples of the quality of my own reasoning to fully counteract the (most likely false) belief that I should not be intellectually autonomous because relying on my own reasoning is more likely to hurt others and myself than to help.
For most of my life I have been conditioned to believe that, and it has only been very recently that I have started making progress towards eliminating that belief from my mind, rather than compartmentalizing it. I’m worried that using compartmentalization intentionally could significantly interfere with my progress in that regard.
I’m only just managing to hold this problem off right now, and that task is taking more energy and concentration than I think it is realistic to allocate to it on a regular basis.
If I tell myself that I don’t need to be honest with myself about my own thoughts 100% of the time, and that what matters is being honest with myself most of the time and dishonest only when it’s necessary, then it’s probably going to be disproportionately difficult to trust myself when I test my own honesty and find that I’m being honest.
Any advice please? I’m rather inexperienced with this level of honest cognitive self-analysis (if that’s what it’s called), and I think I might be somewhat out of my league with this problem. Thanks!
From what you’ve told me, I strongly recommend not using any of the techniques I mentioned until you’re much more confident in your mental control.
It seems that we come from very different mental backgrounds (I was encouraged to be intellectually autonomous from a young age), so you should definitely take my suggestions with caution, as it’s likely they won’t work for people with your background.
It sounds to me like you’re in the early stages of taking control over your beliefs, and while it seems you’re on the right track, it doesn’t sound to me like my techniques would be helpful at this juncture.
So I should continue giving my very best effort to be completely honest with myself, and just hope I don’t ever find myself in a catch-22 scenario like the one I just described before I’m ready. Admitting that lying to myself COULD be my best option in particular kinds of situations is not the same as actually being in such a situation and having to take that option. Whew! I was freaking out a bit, worrying that I would have to compartmentalize the information in your article in order to avoid using the techniques in it. Now I realize that was kind of silly of me.
Thanks for your help!