I feel that the framing of “System 2 fixing System 1” is what leads to the valley of bad rationality. System 1 gives important feedback, has unique skills, and is most of what we are.
Agreed that System 1 gives important feedback, has unique skills, and is most of what we are, and there have been some good ideas expressed on LW about that. I would in addition suggest that System 2 is best set up as a means to evaluate System 1, see where it is doing well and where it needs improvement, and then provide improvement as needed. Thoughts?
A strict hierarchy where System 2 always wins in a direct confrontation because it is better at logical arguments is bad. It is bad because it is going to make the parts of you that have desires but don’t know how to express them with rigorous logical arguments feel bad.
If we’re using the elephant and rider model, the aim is a harmonious relationship rather than one where the rider has to poke the elephant with a sharp stick and has general disdain for it.
I believe my article above did not convey the idea that we should always go by System 2, as that is not a wise move for the reasons I outlined above. I do strongly believe that we should use System 2 thinking to examine the parts of ourselves that have strong desires but don’t know how to express them, evaluate whether these desires are beneficial to oneself, and change those parts that are not beneficial, for instance using Dark Arts of Rationality on oneself.
The Elephant and Rider model I used above corresponds to previous discussions on LW about this topic, do you disagree with those discussions?
Sometimes a nurse has an intuition that a particular patient might develop a problem in the coming day, but has no evidence to back that up with a logical argument.
Research suggests that in those situations it’s better to put the patient under extra monitoring than to do nothing.
Firefighters who feel unexplainable fear in a building should get out as soon as possible.
In general, when there is a high cost to ignoring a justified fear but a low cost to following it, it’s good to act on the fear even if you don’t have a logical reason for it.
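The asymmetry above can be made concrete with a minimal expected-cost sketch. All the numbers here are hypothetical, chosen only to illustrate why even a small probability that the fear is justified favors acting on it when the downside of ignoring it is large:

```python
def expected_cost(p_justified, cost_if_justified, cost_if_unfounded):
    """Expected cost of a choice, given the probability the fear is justified."""
    return p_justified * cost_if_justified + (1 - p_justified) * cost_if_unfounded

p = 0.05  # even a small chance the fear is justified...

# Following the fear: a small fixed cost (e.g. leaving the building) either way.
follow = expected_cost(p, cost_if_justified=1, cost_if_unfounded=1)

# Ignoring the fear: catastrophic if the fear was justified, free if it wasn't.
ignore = expected_cost(p, cost_if_justified=1000, cost_if_unfounded=0)

print(follow, ignore)  # 1.0 vs 50.0: following the fear has lower expected cost
```

The point is not the specific numbers but the structure: the fear only needs to be right a small fraction of the time for following it to win.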
There are also a bunch of cases in the literature suggesting that people who make decisions via unconscious thought do as well as, or better than, people who undergo conscious deliberation.
http://www.ncbi.nlm.nih.gov/pubmed/20228284 is, for example, a study suggesting that diagnosing psychiatric cases works better via unconscious thinking than conscious thinking.
Yup, that makes a lot of sense, agreed on the usefulness of taking in all signals. Intuition can be very useful indeed. I’d also say that intuition would benefit by being occasionally evaluated by System 2 to see if it can be improved so that we can have more effective roles in our daily activities. Your thoughts?
What do you mean when you say “evaluate intuition” in practical terms?
The way I tend to do it is to sit down once a week, and evaluate my current cached habits, thoughts, beliefs, roles, etc—the whole complex of factors that makes up what I perceive as “intuition” for me. I see if they are serving me well, or not. If I find they are not serving me well, I strive to change them.
Thoughts and beliefs are System 2 stuff. If you do X because you believe Y, that’s System 2.
Intuition is often a matter of doing things for reasons that can’t be expressed in language, which makes it a lot harder to investigate. The kind of intuition that lets a firefighter feel fear when he can’t see any logical reason, and then leave the building to save his life, can’t be broken down analytically.
I don’t see a good reason to speak in terms of System 1 and System 2 when talking about sitting down to retrospect.
I don’t think you understand what I’m getting at. You seem to still be positing that System 2 is the “adult” that will discard parts of System 1 when they are not to the advantage of goals System 2 has.
System 2 can be used to improve System 1. Noticing confusion, on the other hand, is an example where you use System 1 to intervene in a System 2 process.
Good point, agreed on noticing confusion. In my experience, I had to train my System 1 to develop a habit to notice confusion, so first I used System 2 to improve System 1, and then let System 1 intervene in System 2. What was your experience here?
I agree
When I have disagreements with myself, or when I’m trying to seize agency for myself, it’s as though this intuition bumps into that one while they both rest on the surface of this one, which is actually a mix of two earlier intuitions that aren’t quite settled together yet; and they are surrounded by more fluid, less well-defined intuitions which determine the specifics of their collision and their movement post-interaction, that sort of thing. It’s a very bottom-up process, not hierarchical or commanded at all, so it isn’t adequately described by appealing to System 2.
This is one reason why there have been discussions about the need to re-evaluate dual process theory and have a more complex understanding of rationality and intelligence. I think the concept of “domains of agency” provides one way of enriching the current conversation, but there are many others as well, such as what you describe about the disagreement with oneself. That might be a good topic to post about in the Less Wrong Discussion thread.
I generally use System 1 in a System 2-like way when I have disagreements with myself or when I’m trying to seize agency. It’s easier to think about it as though it’s a physical system. This intuition bumps into that one as they both rest on the surface of this one, which is actually a mix of two earlier intuitions that aren’t quite resolved with one another, and they are surrounded by more fluid, less well-defined intuitions which determine their interactions and their movement after they interact, that sort of thing. It’s a very bottom-up process, so I don’t feel comfortable attributing it to System 2, but at the same time traditional characterizations of System 1 fail to adequately describe it.