My guess is that they would not share his assessment that the explanation provided by T3t above is completely inadequate.
This worries me because of double illusion of transparency concerns. That is, one frame we could have here is that Said is virtuously refusing to pretend to understand anything he doesn’t understand. Suppose the version of “authentic” that is necessary to make this post work is actually quite detailed and nuanced, in ways that T3t’s guess doesn’t quite get at; then it seems like T3t and I might mistakenly believe that communication has taken place when it actually hasn’t, whereas Said and I will have no such illusions.
If there are problems with this situation, I think they come from different people having different expectations of how bad it is to not have communicated something to Said, and I think we fix that by aligning those expectations.
The usual pattern of Said’s comments, as I experience them, has been (and I think this would be reasonably straightforward to verify):
This lines up with a model where Said is being especially rigorous when it comes to dependencies, and the audience isn’t; the audience has some random scattering of dependencies, where each further reply is only useful to a smaller fraction of the population. It is also explained by people becoming more and more pessimistic that communication will happen, and so not tuning in to the thread to follow things.