Presumably you believe that point 2 holds, not just because of the GDP example, but because you’ve seen many, many examples (like health care, which you mention above). Or maybe because you have an analytical argument that the sort of thing that happens with GDP has to generalize to other credit allocation systems?
Both—it would be worrying to have an analytic argument but not notice lots of examples, and it would require much more investigation (and skepticism) if it were happening all the time for no apparent reason.
I tried to gesture at the gestalt of the argument in The Humility Argument for Honesty. Basically, all conflict between intelligent agents contains a large information component, so if we’re fractally at war with each other, we should expect most info channels that aren’t immediately life-support-critical to turn into disinformation, and we should expect this process to accelerate over time.
For examples, important search terms are “preference falsification” and “Gell-Mann amnesia”.
I don’t think I disagree with you on GiveDirectly, except that I suspect you aren’t tracking some important ways your trust chain is likely to make correlated errors along the lines of assuming official statistics are correct. Quick check: what’s your 90% confidence interval for global population, after Googling the official number, which is around 7.7 billion?
> except that I suspect you aren’t tracking some important ways your trust chain is likely to make correlated errors along the lines of assuming official statistics are correct.
Interesting.
> Quick check: what’s your 90% confidence interval for global population, after Googling the official number, which is around 7.7 billion?
I don’t know; certainly not off by more than half a billion in either direction? I don’t know how hard it is to estimate the number of people on earth. It doesn’t seem like there’s much incentive to mess with the numbers here.
> It doesn’t seem like there’s much incentive to mess with the numbers here.
Guessing at potential confounders: there may be incentives for individual countries (or cities) to inflate their numbers (to seem more important), or to deflate their numbers (to avoid taxes).
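The distinction matters quantitatively. A toy simulation (all numbers hypothetical, chosen only for illustration) shows why: if each country's reported population carries *independent* noise, the errors mostly cancel when you sum to a global total, but a *shared* incentive to inflate (a correlated bias) survives aggregation completely intact. That's the sense in which a trust chain resting on official statistics makes correlated errors.

```python
import random

random.seed(0)

# Hypothetical world: 200 countries with made-up true populations.
N_COUNTRIES = 200
TRUE_POPS = [random.uniform(1e6, 1.4e9) for _ in range(N_COUNTRIES)]
TRUE_TOTAL = sum(TRUE_POPS)

def mean_total_error(independent_sd, shared_bias, trials=2000):
    """Average relative error of the summed global total when each
    country's report has independent Gaussian noise plus a bias
    shared by every country (e.g. a common incentive to inflate)."""
    errors = []
    for _ in range(trials):
        reported = sum(p * (1 + shared_bias + random.gauss(0, independent_sd))
                       for p in TRUE_POPS)
        errors.append(reported / TRUE_TOTAL - 1)
    return sum(errors) / len(errors)

# Independent 5% noise per country largely cancels in the sum...
print(f"independent noise only: {mean_total_error(0.05, 0.00):+.4f}")
# ...but a shared 2% inflation incentive passes straight through.
print(f"with shared 2% bias:    {mean_total_error(0.05, 0.02):+.4f}")
```

The first figure comes out near zero while the second sits near the full 2% bias, so a 90% confidence interval built on the assumption of independent errors would be misleadingly narrow if the errors are in fact correlated.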