I think all of these indicate insufficient strategic thinking on the part of the information-controlling party.
If you think your opponent will take statistics out of context, then it makes sense to try to keep those statistics hidden
No. Only if you ALSO think the statistics will stay hidden, and also somehow think that the hiding can be done well enough that it won’t be worse than the data. These are patently false in most cases. It is both a moral and a strategic failure to attempt this.
It ends up going well. … She would like to avoid this kind of problem in the future
This would benefit from a clear problem statement. And there are plenty of avenues for Alice to be more deeply involved in the prep which are NOT simply feedback that this was uncomfortable: ask for the outline and the prelims, or offer to help early. Alternatively, if this is something that can be done well in a day, don’t give him weeks to do it.
A small group (subculture, religious sect, police force, etc.) within broader society sees one of their own doing something bad, such as engaging in abuse.
This is trickier, especially for groups already facing persecution. Probably the correct reaction is expulsion from the group, letting broader society handle the individual as someone the group no longer supports.
This led to a dynamic where I became reluctant to acknowledge her having any correct insights into me that I had initially disagreed with. … we’re no longer friends.
You could have saved some time by skipping to the end, perhaps by confronting/discussing with her that you hate it when she seems to be manipulating you, even if she thinks it’s for your benefit.
“if you are withholding information because of how you expect the other party to react, be aware that this might just make everything worse”.
I’d be stronger in my recommendation.
In all of these cases, cover-up is a short-term attempt to avoid a conflict, which cannot work for very long, and impedes actual shared knowledge and discussion about any underlying disagreement. It’s almost always a mistake which reduces overall trust and erodes future ability to resolve disagreements.
Note: I admit that I do this myself sometimes. I hate direct conflict, and for low-stakes things it often does work, by letting the topic pass by without having to deal with it. I’m not saying “this is always wrong”; I’m saying “it’s usually wrong for high-stakes topics that won’t go away until addressed”. I endorse lying and evasion in many situations, but only as a last resort when it’s clear that other communication mechanisms are unworkable.
Only if you ALSO think the statistics will stay hidden, and also somehow think that the hiding can be done well enough that it won’t be worse than the data. These are patently false in most cases. It is both a moral and a strategic failure to attempt this.
There’s an obvious selection bias in that we hear about the cases where people fail to cover up something, and don’t hear about the cases where they succeed. (The bit where the anti-vaccine group filed a FOIA lawsuit to compel release of internal documents is a good example. I’d expect that in the vast majority of cases, an organization’s internal files merely remain internal, without any court compelling their release.)
I would like to believe that the truth always comes out, so one might as well always be transparent and tell the truth. But that belief feels like it would be a little too convenient, and I don’t know of a strong reason to believe that it’s true.
You could have saved some time by skipping to the end, perhaps by confronting/discussing with her that you hate it when she seems to be manipulating you, even if she thinks it’s for your benefit.
Sometimes confronting the person or leaving the relationship works. Sometimes confronting just makes them more hostile (as was true in this case), and/or there are reasons why it’s difficult or impractical to leave.
Acknowledged that it’s often a bit less obvious what will work or not work than I wish. I do think it’s often a matter of timeframes, though—tactics that may work in the short term are usually different from what’s durable. It’s of course up for debate in each situation how to weigh the short- and long-term results when they’re different.
One thing I should mention: I often frame this in terms of “is it in my interest to make this private knowledge into common knowledge (as in, I get to know that you know, and you know I know you know, etc.)?” In many examples, this reduces the conflict to actual points of disagreement over values, rather than over private estimates of a fact (whether that fact is a government statistic, a belief about a friend’s agency, or a preference in work style).
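To spell out that parenthetical, here is a toy sketch (my own illustration, not anything from the original post) of the knowledge “tower”: depth 1 is “I know the fact”, depth 2 is “I know that you know it”, and so on; common knowledge is the claim that every finite depth holds for both parties. The fact string and the two-party setup are assumptions for illustration.

```python
def knowledge_statement(fact: str, chain: list[str]) -> str:
    """Nest "X knows that ..." around a fact; earlier agents wrap later ones."""
    out = fact
    for agent in reversed(chain):
        verb = "know" if agent in ("I", "you") else "knows"
        out = f"{agent} {verb} that {out}"
    return out

# Hypothetical fact; any private estimate from the post's examples would do.
fact = "the statistic is embarrassing"

# Depth k alternates the two parties: "I know", "I know you know", ...
for depth in range(1, 4):
    chain = ["I" if i % 2 == 0 else "you" for i in range(depth)]
    print(f"depth {depth}: {knowledge_statement(fact, chain)}")

# depth 1: I know that the statistic is embarrassing
# depth 2: I know that you know that the statistic is embarrassing
# depth 3: I know that you know that I know that the statistic is embarrassing
# Common knowledge is the limit: every finite depth holds, for both parties.
```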
It’s of course up for debate in each situation how to weigh the short- and long-term results when they’re different.
So I’m not a Wikipedia editor, nor familiar enough with its politics to know how influential or canonical these essays are, but Wikipedia has two essays that look like they’ve been around for quite a while and basically say “sometimes it’s better for trusted Wikipedia admins to cover up information from other people”. I’d guess that the fact that these pages have persisted is some evidence for such policies at least sometimes working in the long term.
Wikipedia:Don’t stuff beans up your nose

The little boy’s mother was going off to the market. She worried about her son, who was always up to some mischief. She sternly admonished him, “Be good. Don’t get into trouble. Don’t eat all the chocolate. Don’t spill all the milk. Don’t throw stones at the cow. Don’t fall down the well.” The boy had done all of these things on previous market days. Hoping to head off new trouble, she added, “And don’t stuff beans up your nose!” This was a new idea for the boy, who promptly tried it out.
In our zeal to head off others’ unwise actions, we may put forth ideas they have not entertained before. As the popular saying goes, “don’t give ’em any ideas”.
For example, don’t give potential vandals examples of how to cause disruption. This may well tempt them to do it.
In a similar vein, there are many areas of the encyclopedia that rely on, or benefit from, some level of security through obscurity, such as WP:SPI. For this reason, specific cases and abuse mitigation are often left undiscussed on-wiki, and this essay is sometimes cited in such situations (often using the shortcut WP:BEANS) to drop the hint that further public explanation of a matter could be unwise. An essay explaining this in more detail is Wikipedia:There’s a reason you don’t know.
Wikipedia:There’s a reason you don’t know

Certain things on Wikipedia happen with little or no explanation. These include suppressions, checkuser blocks, many revision deletions, and many actions of the Arbitration Committee. Because most actions are logged publicly on Wikipedia, and their rationales well-documented, some users may get upset that they don’t know why these things happen.
But things are only obscured for a small number of reasons. Sometimes they concern private information that, for legal reasons, only certain users can access. Sometimes excessive detail would help bad actors do more bad things or reward malfeasance with attention. And sometimes it’s in the best interests of the editors affected, such as minors or those experiencing mental health issues. If you don’t know why something happened, there’s probably a reason. And it’s probably a good reason. And by butting in without knowing full context, you could cause serious harm.
If you have concerns about the reasoning for something, there are procedures for questioning a decision without publicizing private information. Assume Good Faith applies to opaque actions too. If, say, an admin blocked someone without stating a reason, and this concerns you, usually the best course of action will be to email them and politely ask why. If you are unable to resolve your concerns this way, the Arbitration Committee is empowered to resolve any disputes involving private evidence. On a global level, the Ombuds commission also can hear complaints involving alleged misuse of checkuser and oversight privileges.
Don’t be the jerk who prolongs someone’s mental anguish, or gives attention to a long-term abuser, or helps a vandal evade their block, just because you were curious or leapt to the conclusion of admin abuse. Follow the proper channels to inquire about something or appeal a decision, and assume good faith of the user who made it.
Also, if we consider the category of anticipatory cover-ups to include things like “company trade secrets or information that is classified for national security reasons” then those kinds of policies have also been around for a long time.
It’s worth considering, in each of the examples in the post, how different it might have been if the policy had been “withhold the object-level data, but be explicit and forthright about the fact that you’re hiding it and the reasons for doing so”.
First time I’d seen There’s A Reason You Don’t Know. I’d be fascinated to hear how well that works for them, because oh boy does it not really work for some groups. (It’s rationalists, we’re “some groups”.)