Another example I would cite was the response to If Anyone Builds It, Everyone Dies by the core EA people, including among others Will MacAskill himself and also the head of CEA. This was a very clear example of PR mindset, where quite frankly a decision was made that this was a bad EA look, the moves it proposes were unstrategic, and thus the book should be thrown overboard.
FWIW, while I got a vibe like this from the head of CEA’s review, I didn’t get this vibe from Will’s review. The vibe I got from Will’s review was an interest in the arguments and whether they really supported the strong claims of the book.
And the complaints I saw about Will’s review (at least the ones that I was sympathetic to, rather than ones that seemed just very off-base) weren’t “this is insufficiently truth-seeking”. Rather, they were “this is too nitpicky, you should be putting more emphasis on stuff you agree with, because it’s important that readers understand that AI takeover risk is high and the situation isn’t currently being handled well”.
Yeah, that particular part sounded a lot like “I can’t understand why people disagree with IABIED without suffering from PR mindset”.
Like, this would not be out of place if someone couldn’t actually understand or tolerate disagreement with the Grand Ideas, and I really wish this quote were stricken from the post entirely:
Another example I would cite was the response to If Anyone Builds It, Everyone Dies by the core EA people, including among others Will MacAskill himself and also the head of CEA. This was a very clear example of PR mindset, where quite frankly a decision was made that this was a bad EA look, the moves it proposes were unstrategic, and thus the book should be thrown overboard. If Will is sincere about this reckoning, he should be able to recognize that this is what happened.