Yes, this is me riffing on a popular tweet about coyotes and cats. But there is a pattern of organizations getting/extracting funding from the EA ecosystem (a big part of whose goal is to prevent AI takeover), or getting talent from EA, and then going on to accelerate AI development (e.g. OpenAI, Anthropic, now Mechanize Work).
Of course, I agree, it's such a pattern that it doesn't look like a joke. It looks like a very compelling true anecdote. And if someone repeats this "very compelling true anecdote" (edit: and other people recognize that, no, it's actually a meme), they'll make AI alignment worriers look like fools who believe Onion headlines.