Really interesting post. To me, approaching information with mathematics seems like a black box—and in this post, it feels like magic.
I’m a little confused by the concept of cost: I understand that it takes more data to represent more complex systems, which grows exponentially faster than the amount of bits. But doesn’t the more complex model still strictly fit the data better? Is it just aiming for a different goal than accuracy? I feel like I’m missing the entire point of the ending.
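To make my confusion concrete, here’s a toy sketch (entirely my own illustration, not from the post): a degree-9 polynomial fits ten noisy training points strictly better than a straight line does, yet it pays for that with worse predictions off the training set.

```python
import numpy as np

# Toy data: a noisy line. (My own example, not from the post.)
rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.1, size=10)
x_test = np.linspace(0, 1, 100)
y_test = 2 * x_test  # the true underlying relationship

def fit_errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

simple_train, simple_test = fit_errors(1)   # cheap model: 2 parameters
complex_train, complex_test = fit_errors(9)  # expensive model: 10 parameters

# The degree-9 model interpolates the training points (near-zero train error),
# so it "fits the data better" -- but its test error is far worse, because it
# spent its extra bits memorizing the noise.
print(simple_train, complex_train)
print(simple_test, complex_test)
```

So if I understand the post at all, the “cost” is meant to penalize exactly this kind of win-on-paper fit.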
I really agree with this. I have been thinking that we should “default to privacy”: if we think we have to share a thought, social anxieties and pressures will change it. (It’s similar to that experiment which showed people make better decisions when they don’t have to announce a conclusion first (I just remember this from reading HP:MOR).) Only after we reach the answer, (socially) unbiased, can we decide to share it.
I don’t think privacy means dishonesty. I personally really dislike lying, and I think it’s because making someone act on false information sort of takes away their free will, and, more practically, it creates a lot of uncertainty. But I think you can be honest about withholding information, to an extent: instead of lying, you can just say, “I won’t tell you,” or something like that. (I’m not sure how much of that is based on practicality and how much is more like a terminal value.)
I’m sort of confused by radical honesty. Is it really, truly “radical”? Literally everyone has intrusive thoughts, and I personally sometimes have intrusive thoughts about raping or killing or saying racial slurs. I guess that’s just a nitpick, though, because I can easily see how it is “maximally” honest compared to normal communication.