Daniel Ellsberg on the corrosive effect of knowing secret information
First, [you feel] a great exhilaration from getting all this amazing information that you didn't even know existed. And in the next phase you'll feel like a fool for not having known any of this. But that won't last long. Very soon, you'll come to think that everyone else is foolish: what would this expert be telling me if he knew what I knew? So in the end, you stop listening too.
I’ve never had a security clearance. But I have had private or secret rationalist information, and it’s shocking how fast it corrupts my epistemics. Claiming to want comparative advantage, I overweight the importance of the information. I flinch from imperiling my access to more secret information. Talking about a related area without divulging private information feels too hard, so I don’t bother.
I can fight all these effects, but that’s a cost, and one I’ve learned to bill to the goal or person behind the secret information.
Hmm. One of the most emotionally (?) difficult parts of deciding to leave the govt for me was knowing I’d give up my security clearance even though I’d already inferred or known ahead of time every supposedly ‘secret’ thing I’d been shown! (My clearance was not especially high level.)
This was a new kind of flinch to experience, but I didn’t notice or reflect on the psychological generalisations and neighbours it might have at the time. Thanks for the nudge.
Possible corollary: if you want to communicate with someone who might have private information, make clear your guesses about what might not be publicly known. Their private info is a test set for your ability to forecast. This only applies when such guessing is actually relevant and plausibly pro-social, and maybe this whole suggestion is a universally bad idea. But I can think of scenarios where you know someone has private info they won’t share, you’re trying to make predictive claims, and showing you have any ability at all to guess roughly right from public info would sure help.
You might also ruin your reputation by accidentally guessing only the secret info that they don’t have access to.
Imagine that there are alien spaceships in both Area 51 and Area 52. Your friend only has security clearance to know about Area 52. You only figure out the information about Area 51. When you tell your friend, they will update towards you being wrong.
In many cases it may not be a very good test. Perhaps modifying it to gauge confidence would be better?
One can imagine a near-infinite set of things that might be true about whatever the secret knowledge concerns, of which only a subset is actually true, and the most probable prior may be wildly divergent from what is actually true. If you judge purely on how their guesses conform to your secret set, that doesn’t tell you how good they are at forecasting in general, just that they happen to be wrong on that particular set.
Gauging confidence might be better. If they are very confident about something you know to be wrong, it is unlikely that their prior lined up with reality. If they are only moderately confident, or believe their guess best explains the evidence they have while being fully aware that their evidence may be incomplete or may not cover evidence they lack, then it seems unreasonable to hold a strong view of their forecasting ability based on those misses.
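A minimal sketch of that confidence-weighted idea, assuming a standard logarithmic scoring rule (every claim and probability below is a made-up illustration, not anything from the thread):

```python
import math

def log_score(p_assigned: float, outcome: bool) -> float:
    """Logarithmic score for a binary forecast.

    p_assigned: probability the forecaster gave to the claim being true.
    outcome: whether the claim is true according to the secret info.
    Scores are <= 0; closer to 0 is better, and confident misses
    are punished far more than hedged ones.
    """
    p = p_assigned if outcome else 1.0 - p_assigned
    return math.log(p)

# Hypothetical forecasts, graded against hypothetical secret info.
# (claim, forecaster's probability, true per the secret info)
forecasts = [
    ("ships at Area 51", 0.70, True),   # right, fairly confident
    ("ships at Area 52", 0.40, True),   # a miss, but well hedged
    ("ships at Area 53", 0.95, False),  # confidently wrong: big penalty
]

total = 0.0
for claim, p, outcome in forecasts:
    s = log_score(p, outcome)
    total += s
    print(f"{claim}: p={p:.2f}, true={outcome}, score={s:+.2f}")
print(f"average log score: {total / len(forecasts):+.2f}")
```

Under a log score, a 95%-confident miss costs about ln(0.05) ≈ −3.0, while a hedged 40% miss costs only ln(0.4) ≈ −0.9, so a few honest hedges against an unrepresentative secret test set don’t sink a forecaster the way grading answers as simply right or wrong would.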
previous discussion:
https://www.lesswrong.com/posts/Jf3ECowLsygYYhEC2/shortform-3?commentId=YqRhTQTHndkr2hGnh
https://www.lesswrong.com/posts/cxuzALcmucCndYv4a/daniel-kokotajlo-s-shortform?commentId=8sSvtQZYDNnC9bhvn