(Some) Leverage people used to talk as if they were doing this kind of thing, though it’s not like they let me in on their “optimize other people” planning meetings. I’m not counting chat transcripts that I read of meetings that I wasn’t present for.
Ah, OK, if you meant “see” in the literal sense, then yeah, seems more plausible, but also kind of unclear what its evidential value is. Like, I think you know that it happened a bunch. I agree we don’t want to double count evidence, but I think your message implied that you thought it wasn’t happening, not that it was happening and you just hadn’t seen it.
Well, what I’ve seen personally bears on the frequency with which this happens.
I think FTX and Leverage are regarded as particularly bad and outlier-y cases, along several dimensions, including deceptiveness and willingness to cause harm.
If our examples are limited to those two groups, I don’t think that alone justifies saying that it is “routine” in the EA community to “regularly sit down and make extensive plans about how to optimize other people’s beliefs”.
I think you’re making a broader claim that this is common even beyond those particularly extreme examples.
Totally agree I haven’t established representativeness! I was just talking about what I think the natural implication of your comment was.