Perhaps it would be best to learn from psychology. Psychology has shown that there’s very little you can do to make yourself ‘more rational.’ Knowing about biases does little to prevent them from happening, and you can’t force yourself to enjoy something you don’t enjoy. Further, it takes a lot of conscious, slow effort to be rational. In the face of real-life problems, true rationality is often pretty much impossible as it would take more computing power than available in the universe. It’s pretty clear that our irrationality is a mechanism to cope with the information overload of the real world by making approximate guesses.
It’s because of things like this that I think maybe LW has gone severely overboard with the instrumental rationality thing. Note that knowing about biases is a noble goal that we should strive towards, but trying to fix them often backfires. The best we can usually hope for is to try to identify biases in our thinking and other people’s.
But anyway, a lot of the issues of this site could simply be a matter of technical fixes. It was never really a good idea to base a rationality forum on a Reddit template. Instead of the ‘everyone gets to vote’ system, I prefer a system with a handful of moderators. Moderators could be selected by the community, and they would not be allowed to moderate discussions they themselves are participating in. This is the system that Slashdot follows, and it seems to work extremely well.
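The moderation rule described above (community-selected moderators who may not moderate discussions they themselves participate in) can be captured as a simple check. This is only a toy sketch with hypothetical names, not Slashdot’s actual implementation:

```python
# Toy sketch of a Slashdot-style moderation constraint (hypothetical names,
# not Slashdot's real code): a moderator may not moderate a discussion
# they have posted in.

def can_moderate(moderator: str,
                 discussion_participants: set[str],
                 elected_moderators: set[str]) -> bool:
    """Return True if `moderator` is allowed to moderate this discussion."""
    return (moderator in elected_moderators            # selected by the community
            and moderator not in discussion_participants)  # no self-dealing

# Usage: Alice posted in the thread, so she cannot moderate it; Bob can.
mods = {"alice", "bob"}
participants = {"alice", "carol"}
print(can_moderate("alice", participants, mods))  # False
print(can_moderate("bob", participants, mods))    # True
```

The point of the second check is to rule out self-dealing: someone with a stake in an argument never gets to score it.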
you can’t force yourself to enjoy something you don’t enjoy
This particular point is demonstrably false, at least as a general one: people acquire a taste for foods and activities they previously disliked all the time.
Knowing about biases does little to prevent them from happening
There are plenty of (anecdotal) examples to the contrary. I find myself thinking something like “am I being biased in assuming...” all the time, now that I have been on this forum for years. I heard similar sentiments from others, as well.
it takes a lot of conscious, slow effort to be rational
That’s true enough. But it is also true in general for almost every System 2-type activity (like learning to drive), until it gets internalized in System 1.
In the face of real-life problems, true rationality is often pretty much impossible as it would take more computing power than available in the universe.
Indeed it is impossible to get a perfectly optimal solution, and one of the biases is the proverbial “analysis paralysis”, where an excuse for doing nothing is that anything you do is suboptimal. However, an essential part of being instrumentally rational is figuring out the right amount of computing power to dedicate to a particular problem before acting.
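One way to picture “the right amount of computing power before acting” is an anytime procedure: keep the best answer found so far and act when a preset budget runs out. The following is only an illustrative sketch of the idea, with invented names, not a prescribed method:

```python
import random
import time

# Illustrative "anytime" decision procedure: spend a bounded time budget
# improving a candidate answer, then act with the best one found so far,
# rather than stalling in search of an unreachable optimum.

def decide(candidates, score, budget_seconds=0.05):
    """Return the best-scoring candidate found within the time budget."""
    deadline = time.monotonic() + budget_seconds
    best = random.choice(candidates)
    while time.monotonic() < deadline:
        challenger = random.choice(candidates)
        if score(challenger) > score(best):
            best = challenger
    return best  # suboptimal is fine; doing nothing is the real failure mode

# Usage: pick a near-largest number under a tiny budget.
nums = list(range(1000))
print(decide(nums, score=lambda x: x))
```

Acting on the best-so-far answer at the deadline is exactly the bounded-rationality move: suboptimal, but it avoids analysis paralysis.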
a lot of the issues of this site could simply be a matter of technical fixes
Indeed a different template could have worked better. Who knows. However, a decision had to be made within the time and budget constraints, and, while suboptimal, it was good enough to let the site thrive. See above about bounded rationality.
This is the system that Slashdot follows, and it seems to work extremely well.
Except Reddit is clearly winning, in the “rationalists must win” sense, and Slashdot has all but disappeared, or at least has been severely marginalized compared to its late-90s heyday.
This particular point is demonstrably false, at least as a general one: people acquire a taste for foods and activities they previously disliked all the time.
I’ve done this a lot. Each time, it wasn’t because I forced myself; it was because I saw something attractive in those foods or activities that I hadn’t seen before. Perception and enjoyment aren’t constant. People are more likely to try new activities when they are in a good mood, for instance. Mood alters perception. In that sense I actually agree with Villiam_Bur. You can get more people to become ‘rationalists’ through engaging and fun activities. But you have to ask yourself what the ultimate goal is and whether it can succeed in making people more rational.
However, an essential part of being instrumentally rational is figuring out the right amount of computing power to dedicate to a particular problem before acting.
The most powerful ‘subsystem’ in the brain is the subconscious System 1 part. This is the part that can bring the most computational power to bear on a problem. Making an effort to focus your System 2 cognition on solving a problem (rather than simply doing what comes instinctively) can backfire. But it gets worse. There’s no ‘system monitor’ for the brain. And even if there were, going one level more meta, optimizing the resource allocation for solving problem X may itself be a much harder problem than solving X with the first method that comes to mind.
Except Reddit is clearly winning, in the “rationalists must win” sense, and Slashdot has all but disappeared, or at least has been severely marginalized compared to its late 90s heydays .
I know it’s an extremely subjective opinion, but it seems to me that the Slashdot system reduces the spread of misinformation and reduces downvote fights (and flamewars overall). As for why Slashdot has shrunk as a community, I suppose it’s partly because Reddit has grown, and Reddit seems to have grown because of the ‘Digg exodus’ (largely self-inflicted by Digg) and the subreddit idea. Remember that there used to be many news aggregators (like Digg) that have all but disappeared.
The idea here shouldn’t be “let’s adopt the most popular forum system”, it should be “let’s adopt the forum system that is most conducive to the goals of the community.” And we have at least one important data point (Eliezer) indicating the contrary.
The idea here shouldn’t be “let’s adopt the most popular forum system”, it should be “let’s adopt the forum system that is most conducive to the goals of the community.”
Disregarding your use of the word “community” for what’s best described as an online social club, who’s to say that we’re not doing this already? The “forum system that is most conducive” to our goals might well be a combination of one very open central site (LessWrong itself) supplemented by a variety of more private sites that discuss rationality in different ways, catering to a variety of niches. Not just Eliezer’s Facebook page, but also things like MoreRight, Yvain’s blog, Overcoming Bias, GiveWell, etc.
The “forum system that is most conducive” to our goals might well be a combination of one very open central site (LessWrong itself) supplemented by a variety of more private sites that discuss rationality in different ways, catering to a variety of niches. Not just Eliezer’s Facebook page, but also things like MoreRight, Yvain’s blog, Overcoming Bias, GiveWell, etc.
This makes me a little suspicious as a solution, only because there doesn’t seem to be anything particularly special about it besides being precisely the system that is already in place.
What do you see as being the distinction between a “community” and a mere “online social club”? Genuinely confused.

Because, y’know, communities actually exist, like, in the real world. More relevantly, they have a fairly important role in protecting real, actual people from bodily harm and providing a nurturing environment for them to thrive in. Since this does not apply to virtual Internet sites, calling them “communities” is quite misleading and can have bad side effects if the metaphor is taken seriously, either by accident or through sneaking connotations. So I think it’s better if folks are sometimes encouraged to taboo this particular term.
you can’t force yourself to enjoy something you don’t enjoy
Perhaps “force” isn’t the right approach (and the whole “willpower” framing is just a red herring). But don’t we have many examples where people changed their emotions because of an external influence? Charismatic people can motivate others. People sometimes like something because their friends like it. Conditioning.
I believe with a strategic approach people can make themselves enjoy something more. It may not be fast or 100% reliable or sufficiently cheap, but there is a way. A rational person should try finding the best way to enjoy something, if enjoying that thing is desirable. (For example, people from the Vienna meetup are going to the gym together after the next meetup, so they can convert enjoying a rationalist community into enjoying exercise.)
Charismatic people can motivate others. People sometimes like something because their friends like it. Conditioning.
Now that’s slightly better, and I agree. But again, you have to ask yourself what the ultimate purpose is and if it’s going to backfire or not.
For example, people from the Vienna meetup are going to the gym together after the next meetup, so they can convert enjoying a rationalist community into enjoying exercise.
That sounds like an interesting idea, if perhaps slightly naive. I get what the goal is: channel the enjoyment of a rationality meeting into starting to exercise, then hope that after a while the enjoyment of exercise will itself act as a positive feedback loop. But then you have to ask the question: Why weren’t they already exercising in the first place? And if they hope to achieve something positive by exercising, wasn’t that enough to get them to start exercising? It’s possible that after the initial good feelings wear off (“Yay, the rationality community is exercising together!”) the root causes of exercise avoidance will kick in again and dissolve the entire idea. Or worse: get them to do extremely unenjoyable exercises just for the sake of the community, which will ultimately get them to resent exercise even more than before.
Why weren’t they already exercising in the first place? And if they hope to achieve something positive by exercising, wasn’t that enough to get them to start exercising?
I think that humans usually are not strategic goal seekers. That’s how an ideal rational being should be, but ordinary humans are not like that. We do have goals, and sometimes even strategies, but most things are decided emotionally or by habit.
So the answer to “why weren’t they already exercising” could well be:

a) Because they didn’t have a habit of exercising. When you are doing something for the first time, there is a lot of logistical overhead; you must decide when and where to exercise, which specific exercises you are going to do, et cetera; while the next time you can simply decide to do the same thing you did yesterday.

b) Because they didn’t have positive memories connected with exercising in the past, so while their heads think it would be good to exercise and become more fit and healthy, their hearts try to avoid the whole thing.
If this model is correct (well, that’s questionable, but suppose it is), then the next time there is an advantage: you can follow the strategy of doing the same thing as last time, and you already have some positive memories. That could be enough for some people to change the balance, and maybe not enough for others. In this specific case, we will later have experimental data.
Speaking for myself: many people I know who exercise or do sport regularly do it with their friends. If those were my friends, I would also be tempted to join. But I am rather picky about choosing my friends, and the people who pass my filter are usually just as lazy as I am, or too individualistic to agree on doing something together. The few times I went to the gym, it was incredibly boring. (I imagine having someone there to talk with would change that. Or if I just remembered to always bring a music player, perhaps with an audiobook.) I do some small exercises at home. I imagine that if I had an exercise machine at home, I would use it, because the largest inconvenience for me is going somewhere outside.
get them to do extremely unenjoyable exercises just for the sake of the community, which will ultimately get them to resent exercise even more than before
That would be obviously wrong, I agree. I just don’t expect this to happen. But it is better to mention it explicitly.
Psychology has shown that there’s very little you can do to make yourself ‘more rational.’
Citation needed.
Not to mention that what an average person can or cannot do isn’t particularly illuminating for non-representative subsets like LW.
maybe LW has gone severely overboard with the instrumental rationality thing
I am not sure that is possible. Instrumental rationality is just making sure that what you are doing is useful in getting to wherever you want to go. What does “severely overboard” mean in this context?
Read Daniel Kahneman’s work. He has spent his entire lifetime studying this and won a Nobel Prize for it too. A good summary is given in http://www.newyorker.com/tech/frontal-cortex/why-smart-people-are-stupid. Here’s an excerpt:

‘As the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.’
Not to mention that what an average person can or cannot do isn’t particularly illuminating for non-representative subsets like LW.
In fact it is; there is no substantial difference between highly educated and uneducated people when it comes to trying to control biases.
I am not sure that is possible. Instrumental rationality is just making sure that what you are doing is useful in getting to wherever you want to go. What does “severely overboard” mean in this context?
There is nothing wrong with ‘making sure that what you are doing is useful in getting to wherever you want to go’. The problem is the idea of trying to ‘fix’ your behavior through self-imposed procedures, trial & error, and self-reporting. Experience shows that this often backfires, as I said. It’s pretty amazing that “I tried method X, and it seemed to work well, I suggest you try it!” (look at JohnMaxwellIV’s comment below for just one example) is taken as constructive information on a site dedicated to rationality.
First, rationality is considerably more than just adjusting for biases.
Second, in your quote Kahneman says (emphasis mine): “My *intuitive* thinking is just as prone...”. The point isn’t that your System 1 changes much; the point is that your System 2 knows what to look for and compensates as best it can.
In fact it is; there is no substantial difference between highly educated and uneducated people when it comes to trying to control biases.
Sigh. Citation needed.
The problem is the idea of trying to ‘fix’ your behavior through self-imposed procedures, trial & error, and self-reporting.
And what is the problem, exactly? I am also not sure what the alternative is. Do you want to just assume your own behaviour is immutable? Magically determined, without you being able to do anything about it? Do you think you need someone else to change your behaviour for you? What?
‘As the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.’

Disagree. See the comments in http://lesswrong.com/lw/d1u/the_new_yorker_article_on_cognitive_biases/
I’m not talking about the bias blind spot. I agree that more educated people are better able to discern biases in their own thoughts and others’. In fact, that’s exactly what I said, not once but twice.

I’m talking about the ability to control one’s own biases.

Are you distinguishing between “control one’s own biases” and “adjusting and compensating for one’s own biases”?
I agree that more educated people are better able to discern biases in their own thoughts and others’... I’m talking about the ability to control one’s own biases.
Huh? So what are more intelligent—and more educated—people doing, exactly, if not controlling their biases?