Well, what I’ve seen personally bears on how frequently this happens.
I think FTX and Leverage are regarded as particularly bad and outlier-y cases, along several dimensions, including deceptiveness and willingness to cause harm.
If our examples are limited to those two groups, I don’t think that alone justifies saying that it is “routine” in the EA community to “regularly sit down and make extensive plans about how to optimize other people’s beliefs”.
I think you’re making a broader claim that this is common even beyond those particularly extreme examples.
Totally agree I haven’t established representativeness! I was just talking about what I think the natural implication of your comment was.