Your question rests on an assumption that obscurantism must decrease information, but I see that assumption as incorrect. In fact, under this assumption I should never regard anything said to me as obscurantist, as it should never decrease the amount of information available to me.
Wikipedia defines “obscurantism” as “the practice of deliberately preventing the facts or the full details of some matter from becoming known”, and what you describe seems to fit the bill. Of course, it may be a useful or beneficial species of obscurantism, though I agree with Prismatic that it is not.
The situation as you describe it seems pre-biased by postulating that the mainstream view is dubious. This may be obvious to you, but to me, the person who’s faced with the “hints” as described, it is not—if it were, I shouldn’t need the hints to begin with. I think it’s incorrect to condition on the dubiousness of the mainstream view. If I am to decide how best to take into account hints of that nature, the possibility that the mainstream view is correct after all, and the hint entirely specious, should not be disregarded. In fact, in real-life situations where such hints are offered, this may be the more frequent scenario.
The hint that says “this view is incorrect, but I will not explain why, for doing that will violate a social norm” is annoying and distracting; it engages my attention while bringing no real evidence for its claims. Because it posits a mystery, I’m likely to err on the side of giving it more attention than it deserves. The benefit is that it may cause me to investigate the view more thoroughly than I otherwise would have, and realize it is incorrect. If I precommit to ignoring such signals, I will miss some chances of that, but I will also avoid giving my attention to, and more closely investigating, all those views that are correct after all, and where the signal was specious. The bargain may well be worth it.
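The bargain in that last sentence can be sketched as a toy expected-value comparison. Every number below is an illustrative assumption of mine, not anything from the discussion; the point is only that which policy wins depends on how often such hints turn out to be valid relative to the attention they consume:

```python
# Toy model of the "heed hints vs. precommit to ignoring them" trade-off.
# All numbers are illustrative assumptions, not estimates from the thread.
p_hint_valid = 0.3          # assumed chance the hint really marks a flawed view
value_of_correction = 10.0  # assumed benefit of catching one wrong belief
attention_cost = 1.0        # assumed cost of investigating any hinted-at view

# Expected value of investigating every such hint, versus ignoring them all.
ev_investigate = p_hint_valid * value_of_correction - attention_cost
ev_ignore = 0.0

# With these particular numbers investigating wins (2.0 vs. 0.0); shrink
# p_hint_valid or raise attention_cost and the precommitment wins instead.
```

The crossover point is just `p_hint_valid == attention_cost / value_of_correction`, which is why the question of how trustworthy the source of the hints is matters so much.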
Your question rests on an assumption that obscurantism must decrease information, but I see that assumption as incorrect. In fact, under this assumption I should never regard anything said to me as obscurantist, as it should never decrease the amount of information available to me.
What makes obscurantism a relevant category is that certain ways of withholding information and intentional abstruseness can be very effective for misleading people and producing convictions without evidence. In LW parlance, it is a particular kind of Dark Arts. Now, of course, it makes no sense to debate definitions when there is a true disagreement about them, but I think it shouldn’t be controversial to insist that the normal meaning of “obscurantism” involves this Dark Arts element. In other words, it involves withholding information with the intent to mislead and produce mistaken or unsubstantiated beliefs, and it cannot be applied to every act of withholding information intentionally.
I do think the Wikipedia definition you quoted is unreasonably overbroad, considering the standard usage of the word. It would cover all sorts of completely honest, reasonable, and non-misleading acts of communication where one chooses to limit the amount of information given—for example, saying that you got a new job but not disclosing the salary, or writing blog comments under a pseudonym.
If I am to decide how best to take into account hints of that nature, the possibility that the mainstream view is correct after all, and the hint entirely specious, should not be disregarded. [...] The hint that says “this view is incorrect, but I will not explain why, for doing that will violate a social norm” is annoying and distracting; it engages my attention while bringing no real evidence for its claims.
It is not true that it brings no significant evidence, if the source of the hint is someone about whom you have other information—and information about the intellectual abilities, knowledge, and likely biases of frequent commenters is easy to get in a forum like this one (if you don’t in fact have it already). And you can always simply ignore such hints if you believe you have insufficient information, or you don’t feel like looking for it, the way you presumably ignore any other comments that are not of interest to you.
Also, I note that your complaint here doesn’t state that these hints are misleading and apt to trigger biases leading to incorrect beliefs, so you must indeed be working with the broadest possible (and I would say overbroad) definition of “obscurantism.”
If I precommit to ignoring such signals, I will miss some chances of that, but I will also avoid giving my attention to, and more closely investigating, all those views that are correct after all, and where the signal was specious. The bargain may well be worth it.
It may indeed—but why precommit unconditionally, without considering the source of these signals?
Nothing would ever be obscurantist for a perfectly rational mind that correctly evaluates every sensory input according to whatever evidence it provides for any logically possible hypothesis.
Not technically true. It is possible to make a perfectly rational mind produce worse predictions about the world by providing it with selected information. This relies on it having insufficient information about your obscuring tendencies or motives. The new probabilities that the rational agent has will necessarily be a subjectively objective improvement but can still produce worse predictions of the relevant aspects of the world in an objective sense.
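A minimal sketch of that failure mode, under a toy coin-flip setup of my own devising (the hypotheses, prior, and selection rule are all illustrative assumptions): a Bayesian agent receives only true reports about a fair coin, but the reporter passes along only the heads. Each individual update is correct given the agent's model, which lacks any representation of the filtering, yet its predictions drift away from the truth:

```python
from fractions import Fraction

# Three hypotheses about a coin's heads-probability, with a uniform prior.
hypotheses = {Fraction(1, 4): Fraction(1, 3),
              Fraction(1, 2): Fraction(1, 3),
              Fraction(3, 4): Fraction(1, 3)}

def update(prior, outcome_heads):
    """Bayes update that treats each report as a randomly sampled flip."""
    posterior = {}
    for p, weight in prior.items():
        likelihood = p if outcome_heads else 1 - p
        posterior[p] = weight * likelihood
    total = sum(posterior.values())
    return {p: w / total for p, w in posterior.items()}

# The true coin is fair, but the reporter selectively forwards only heads.
posterior = hypotheses
for _ in range(6):              # six selectively reported flips, all heads
    posterior = update(posterior, True)

# The agent's predicted heads-probability, which drifts above the true 1/2
# even though every individual report it received was accurate.
predicted_heads = sum(p * w for p, w in posterior.items())
```

Here `predicted_heads` comes out to 579/794, roughly 0.73, against a true frequency of 1/2—worse than the agent's prior prediction of 1/2, exactly because it has no information about the reporter's obscuring tendencies.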
You’re right, of course. I edited away that part, which is not relevant for the main point anyway.