It’s of course up for debate in each situation how to weigh the short- and long-term results when they’re different.
So I’m not a Wikipedia editor, or otherwise familiar enough with Wikipedia politics to know how influential or canonical these essays are, but Wikipedia has two essays that look like they’ve been around for quite a while and seem to basically be saying “sometimes it’s better for trusted Wikipedia admins to cover up information from other people”. I’d guess that the fact that these pages have been around for a while is some evidence for such policies at least sometimes working in the long term.
Wikipedia:Don’t stuff beans up your nose

The little boy’s mother was going off to the market. She worried about her son, who was always up to some mischief. She sternly admonished him, “Be good. Don’t get into trouble. Don’t eat all the chocolate. Don’t spill all the milk. Don’t throw stones at the cow. Don’t fall down the well.” The boy had done all of these things on previous market days. Hoping to head off new trouble, she added, “And don’t stuff beans up your nose!” This was a new idea for the boy, who promptly tried it out.
In our zeal to head off others’ unwise actions, we may put forth ideas they have not entertained before. As the popular saying goes, “don’t give ’em any ideas”.
For example, don’t give potential vandals examples of how to cause disruption. This may well tempt them to do it.
In a similar vein, there are many areas of the encyclopedia that rely on, or benefit from, some level of security through obscurity, such as WP:SPI. For this reason, specific cases and abuse mitigation are often left undiscussed on-wiki, and this essay is sometimes cited in such situations (often using the shortcut WP:BEANS) to drop the hint that further public explanation of a matter could be unwise. An essay explaining this in more detail is Wikipedia:There’s a reason you don’t know.
Wikipedia:There’s a reason you don’t know

Certain things on Wikipedia happen with little or no explanation. These include suppressions, checkuser blocks, many revision deletions, and many actions of the Arbitration Committee. Because most actions on Wikipedia are logged publicly and their rationales well documented, some users may get upset that they don’t know why these things happen.
But things are only obscured for a small number of reasons. Sometimes they concern private information that, for legal reasons, only certain users can access. Sometimes excessive detail would help bad actors do more bad things or reward malfeasance with attention. And sometimes it’s in the best interests of the editors affected, such as minors or those experiencing mental health issues. If you don’t know why something happened, there’s probably a reason. And it’s probably a good reason. And by butting in without knowing the full context, you could cause serious harm.
If you have concerns about the reasoning for something, there are procedures for questioning a decision without publicizing private information. Assume Good Faith applies to opaque actions too. If, say, an admin blocked someone without stating a reason, and this concerns you, usually the best course of action will be to email them and politely ask why. If you are unable to resolve your concerns this way, the Arbitration Committee is empowered to resolve any disputes involving private evidence. On a global level, the Ombuds commission can also hear complaints involving alleged misuse of checkuser and oversight privileges.
Don’t be the jerk who prolongs someone’s mental anguish, or gives attention to a long-term abuser, or helps a vandal evade their block, just because you were curious or leapt to the conclusion of admin abuse. Follow the proper channels to inquire about something or appeal a decision, and assume good faith of the user who made it.
Also, if we consider the category of anticipatory cover-ups to include things like “company trade secrets or information that is classified for national security reasons” then those kinds of policies have also been around for a long time.
It’s worth considering, in each of the examples in the post, how different it might have been if the policy had been “withhold the object-level data, but be explicit and forthright about the fact that you’re hiding it and the reasons for doing so”.
First time I’d seen There’s A Reason You Don’t Know. I’d be fascinated to hear how well that works for them, because oh boy does it not really work for some groups. (It’s rationalists, we’re “some groups”.)