It sounds like you’re talking about informational entropy, and the model you’re describing sounds -very- similar to a neural net, which uses something like entropy to arrive at conclusions. Have you investigated either of these topics, and/or am I misreading you?
I am talking about informational entropy, but using the analogy to pressure as an intuition pump. This applies whether we are using neural nets or other PGMs. In fact, I was thinking of causal BNs as ideal approximations of human rationality, and noting the additional fact that there seems to be some cost to maintaining uncertainty that seems to fit the earlier analogy. (Sorry if I’m unclear—I will reread and try to clarify when I’m not on my phone.)
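To make the "cost of maintaining uncertainty" point concrete, here is a minimal sketch (my own illustration, not from the exchange above) of Shannon entropy over a discrete belief distribution: a maximally spread-out belief carries the most entropy, while committing to one hypothesis drives the entropy toward zero.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(p) = -sum(p_i * log2(p_i)).

    Terms with p_i == 0 are skipped, using the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fully uncertain (uniform) belief over four hypotheses:
uniform = [0.25, 0.25, 0.25, 0.25]

# A belief mostly committed to one hypothesis:
peaked = [0.85, 0.05, 0.05, 0.05]

print(shannon_entropy(uniform))  # 2.0 bits -- the maximum for four outcomes
print(shannon_entropy(peaked))   # strictly less than 2.0 bits
```

On this reading, "maintaining uncertainty" means holding a high-entropy distribution, and the pressure analogy casts that entropy as something the reasoner pays to keep from collapsing prematurely.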