On this note, arguing in favor of LLM-generated writing, and using AI-generated images in blog posts, similarly works to degrade the “seriousness” of the x-risk movement. The future is weird. Young people are weird. There are plenty of normal people making the extinction argument, and anyone can get a statistical model to make claims about human extinction; humans have been independently predicting extinction events for as long as they’ve been writing. Many people, especially those in the exact target audience for x-risk arguments, have a strong aversion to generative-AI images and writing. Saving the movement starts with you.
There are historical ideological movements that serve as fairly good examples of why Yudkowsky’s actions are fine. The free software movement had its share of serious-as-in-societally-acceptable intellectuals, and it had its share of people like Richard Stallman. Critically, both of those groups were making public appearances simultaneously. There is a case to be made that Yudkowsky is a worse representative of his movement than Stallman was of his, and I doubt many people would deny that, but the core value proposition of public-facing figures in this mold is not to swallow up the people who would and will be taken by men in business suits and elbow pads; it is to give outliers something to sink their teeth into. Sometimes that comes by way of putting on a clown costume and playing the jester; sometimes it comes by way of playing yourself in a very authentic, if not socially acceptable, fashion.
Seriousness is, generally, not worth preserving at all costs. Casting one wide net is less effective and more fragile than casting many nets. In a sea where ideological capture has succeeded so thoroughly that even billionaires are talking about x-risk, it makes sense to allow a few people on the sillier side to scope out the frontier of arguments still to be made.
I say this as a person who has no love for Yudkowsky:[1] He is extremely good at memetics targeting outliers, even if (partially) unintentionally. The outliers he has gathered have, historically, been unusually effective at spreading his word. In this sense he is being an effective activist. Wearing a silly outfit and banging loudly on his drum is likely the best strategy available to him for making an impact.
Submitted to the reader without further comment as evidence to why: https://flarelang.sourceforge.net/