The Perils of the Security Mindset Taken Too Far

Epistemic status: A few initial thoughts.

For your project to be truly secure, no one should know of its existence. Better still, your project should show all the outward signs of being one project while actually being another.

To be secure, your inner workings should be opaque, which makes you less predictable; but it also makes people trust you less. In Newcomb's problem, one common strategy people propose for tricking Omega is to use a quantum source of randomness to become unpredictable. The common counter is that Omega only fills the million-dollar box when it can predict you, so becoming unpredictable simply forfeits the million.
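A minimal sketch of that counter, assuming the usual stipulation that Omega leaves the opaque box empty whenever it cannot predict the agent (the function name, the `predictable` test, and the payoff amounts are illustrative, not from the original post):

```python
import random

def newcomb_payoff(one_box_prob: float, trials: int = 100_000) -> float:
    """Average payoff for an agent that one-boxes with probability one_box_prob.

    Assumed stipulation: Omega fills the opaque box ($1,000,000) only when it
    can predict the agent, i.e. only when the agent is deterministic. A
    randomising agent counts as unpredictable and the opaque box stays empty.
    """
    # Omega fills the opaque box only for a predictable, committed one-boxer.
    opaque = 1_000_000 if one_box_prob == 1.0 else 0
    total = 0
    for _ in range(trials):
        one_boxes = random.random() < one_box_prob
        # Two-boxing always adds the visible $1,000.
        total += opaque if one_boxes else opaque + 1_000
    return total / trials

print(newcomb_payoff(1.0))  # deterministic one-boxer: ~1,000,000
print(newcomb_payoff(0.0))  # deterministic two-boxer: ~1,000
print(newcomb_payoff(0.5))  # quantum randomiser: ~500, worse than either
```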

If you are opaque, no one knows whom you associate with. Even people who trust you may not trust that you avoid associating with people they distrust.

If you are opaque, your security model is not known. Even if people trust you not to leak things on purpose, they cannot tell whether you might leak them by accident.

You trust people less than is optimal: your decisions about whom to trust produce false negatives, trustworthy people you wrongly exclude.

You degrade the rationality of other people. At least two things keep people from being as rational as they could be: a lack of brain power and a lack of information. Hiding information makes everyone around you less rational and less effective. This cost is borne by everyone but you, so you are probably not accounting for it.

Hiding your strategic views hides your flaws: no one can tell whether you are being too paranoid, because your threat model is hidden too.

Brain power is hoovered up modelling other people modelling you, to make sure you don't tip your hand.

If you can possibly avoid taking the security mindset this far, do so. Do all you can in the freedom of openness. Secrecy can also be a mindkiller.