This advice bothers me a lot. Labeling possibly true knowledge as dangerous knowledge (as in the example about statements concerning the average behavior of groups) is deeply worrisome; it is the sort of thing that, if one isn’t careful, can be used to justify ignoring relevant data about reality. I’m also concerned that this piece conflates actual knowledge (as in empirical data) with things like group identity, which seems to be not so much knowledge as a value association.
I am grouping together “everything that goes into your brain,” which includes lots and lots of stuff, most of it unconscious. See the research on priming, for example.
This argument is explicitly about encouraging people to justify ignoring relevant data about reality. It is, I recognize, an extremely dangerous proposition, of exactly the sort I am warning against!
At risk of making a fully general counterargument, I think it’s telling that a number of commenters, yourself included, have all but said that this post is too dangerous.
You called it “deeply worrisome.”
RichardKennaway called it “defeatist scaremongering.”
Emile thinks it’s Dark Side Epistemology. (And see my response.)
These are not just people dismissing this as a bad idea (which would have encouraged me to do the same); these are people worrying about a dangerous idea. I’m more convinced I’m right than I was when I wrote the post.
Heh. So most of the critics base their disapproval of your post’s argument on essentially the same considerations the post itself discusses.
It doesn’t make you right. It just makes them as wrong (or lazy) as you.
If you feel afraid that incorporating a belief would change your values, that’s fine. It’s understandable that you won’t then dispassionately weigh the evidence for it; perhaps you’ll bring a motivated skepticism to bear on the scary belief. If it’s important enough that you care, then the effort is justified.
However, fighting to protect your cherished belief is going to lead to a biased evaluation of evidence, so refusing to engage with the scary arguments is just a more extreme (and more honest) version of trying to refute them.
I’d justify both practices situationally: given the chance that you weigh the evidence dispassionately and still get the answer quite wrong (with even your confidence estimate off), you can err on the side of caution in protecting your most cherished values. That is, your objective function isn’t just to have the best Bayesian-rational track record.
Your post is not dangerous knowledge. It’s dangerous advice about dangerous knowledge.
Becoming more convinced of your own position when presented with counterarguments is a well-known cognitive bias.
Knowing about biases may have hurt you. The counterarguments are not what convinced me; it’s that the counterarguments describe my post as bad because it belongs to the class of things that it is warning against.
There are other counterarguments in the comments here that have made me less convinced of my position; this is not a belief of which I am substantially certain.
“Deeply worrisome” may have been bad wording on my part. It might be more accurate to say that this is an attitude so much more often wrong than right that it is better to acknowledge the low probability of such knowledge existing without actually, deliberately keeping knowledge out.