This post is timed perfectly for my own issue with writing using AI. Maybe some of you smart people can offer advice.
Back in March I wrote a 7,000 word blog post about The Strategy of Conflict by Thomas Schelling. It did decently well considering the few subscribers I have, but the problem is that it was (somewhat obviously) written in huge part with AI. Here’s the conversation I had with ChatGPT. It took me about 3 hours to write.
This alone wouldn’t be an issue, but it is since I want to consistently write my ideas down for a public audience. I frequently read on very niche topics, and comment frequently on the r/slatestarcodex subreddit, sometimes in comment chains totaling thousands of words. The ideas discussed are usually quite half-baked, but I think can be refined into something that other people would want to read, while also allowing me to clarify my own opinions in a more formal manner than how they exist in my head.
The guy who wrote the Why I’m not a Rationalist article that some of you might be aware of wrote a follow-up article yesterday, largely centered around a comment I made. He had this to say about my Schelling article: “Ironically, this commenter has some of the most well written and in-depth content I’ve seen on this website. Go figure.”
This has left me conflicted. On one hand, I haven’t really written anything in the past few months because I’m trying to contend with how I can actually write something “good” without relying so heavily on AI. On the other, if people are seeing this lazily edited article as some of the most well written and in-depth content on Substack, maybe it’s fine? If I just put a little more effort into post-editing, cleaning up the em dashes and the standard AI contrasts (it’s not just this, it’s this), I think I’d be able to write a lot more frequently, and at higher quality than I could manage on my own. I was a solid ~B+ English student, so I’m well aware that my writing skill isn’t anything exemplary.
I even agree with the conclusion of this article: that when someone notices they’re reading something written or edited by AI, it’s a serious negative signal, and the piece is probably not worth spending more time on. I even got into a discussion earlier this week with someone who used AI to edit their book, expressing that exact same sentiment.
So what do I do here? I want to write things, but I don’t seem to be able to do so well on my own. What I “wrote” with AI seems to have been good enough to attract readers (and at the very least I think I can say all the ideas communicated were my own, not GPT’s), so why not write more with it? For someone to call it some of the most well written and in-depth content is somewhat depressing, since it means that the AI writing, and not my own, is what attracted people. But if that’s what people like, who am I to disagree?
As far as improving my writing style goes, I read frequently, and I try to record an intelligent thought on everything I read (either in the margins of a book or in the comment section underneath an essay), but what more can I do? If this is a process that won’t leave me a good writer within the next ~5 years, won’t AI just get better at writing by then anyway? So wouldn’t it make more sense to get used to utilizing AI for my writing now?
Apologies if this is unrelated, but I’ve been thinking about this since the blog post I mentioned yesterday, and the advice on the bottom of this post seems relevant to my situation.
I think if you demonstrate unusual skill at recognizing and curating excellent writing, it matters much less where that writing came from.
As a compromise, have you considered making your best effort at a post before submitting it to AI, and then soliciting writing style/quality critique? If you combine the request for critique with a clear description of the specific areas you’re working on, it’ll probably do especially well at connecting your goals to your opportunities. This seems like the approach most likely to enhance the quality of the writing that you independently produce.
Seconding this. In my experience, LLMs are better at generating critique than main text.
I think you make good points. Less Wrong is particularly concerned with the intrusion of AI slop because the whole point of this community, or at least most of it, is epistemic quality; it’s here so that people can become less wrong. Allowing AI writing in is a strong signal that we’re allowing AI thinking in, and AI isn’t yet good enough to produce high-quality new ideas and hypotheses.
For other audiences, I think using AI to help you write is much less of a red flag because they don’t share those same reasons. And I think that using AI for writing is a separate matter from using it to produce the ideas in the writing. But it’s very easy for those two to bleed together, which is why Less Wrong is going to remain extremely suspicious of AI writing. But if you are being careful that the ideas are yours and using AI only to help you write, I think for many purposes it may really be good writing, and I for one endorse you taking that route. Don’t do it on Less Wrong, because we’ve been asked not to, but in other places less concerned with epistemic purity, I think using AI to help you write is going to become the de facto standard.
As Zvi says, you can use AI to help you learn or you can use AI to avoid learning. Keep an eye on what you’re doing.
Yeah, this is hard. Outside the (narrowly construed) LW bubble, I see LLM-generated text ~everywhere. For example, a friend sent me an ad he saw on Facebook for the picture/product, and the text was super obviously created by AI. I think mostly people don’t notice it, and even prefer it to uninspired non-AI-generated text.
(I am sure there are other bubbles than LW out there that react badly to AI-generated text, and perhaps there’s a notable correlation between those bubbles and ones I’d consider good to be in.)
But if you’re just sort of looking for higher engagement/more attention/to get your ideas out there to the public, yeah, it’s tough to prove that AI usage (for writing copy) is an error. For whatever reason, lots of people like writing that hammers its thesis over and over in emotive ways, uses superficial contrasts to create artificial tension, and ironically uses “and that’s important” as unimportant padding. In my mind I think of this as “the twitter style” and it annoys me even when it’s clearly human-generated, but RLHF and the free market of Twitter both think it’s maximally fit, so, well, here we are.
In terms of “why bother to learn to write” more generally, I guess I would take that a level up. Why bother to blog? If it’s in service of the ideas themselves, I think writing on one’s own is valuable for similar reasons as “helping spread cool ideas”—it’s virtuous and helps you learn to think more clearly. I wouldn’t want to use AI to generate my writing, in part because I’d like to look back at my own writing and smile at a job well done, whereas when I see AI-generated writing I do a little frown and want to skim. But if you don’t value writing for its own sake, and it’s solely a means to an end, and that end is best served by a generic audience of modal humans, then, oof. Maybe o3 is superhuman for this. Or maybe not; perhaps your post would have done even better (on the metrics) if it were 60% shorter and written entirely by you. I suppose we’ll never know.
(I liked the personal parts of the post, by the way. Like your alarm clock anecdote, say. But I liked it specifically because it’s true, and thus an interesting insight into how humans quite different from me behave. I’d be significantly annoyed if it were fabricated, and extra double annoyed if it were fabricated by an LLM.)
With regard to using AI to write while also becoming a better writer, you may consider some recent evidence based on EEG brain scans of people completing an essay-writing task both with and without AI ( https://arxiv.org/abs/2506.08872 ). These results suggest it is best for our cognitive development if we make an effort at writing without AI first. Participants with the most robust neural activity, e.g. engaging the brain’s deep semantic networks, first wrote with only their brains and then returned to that same essay topic with an AI assistant, which they used mainly for information seeking and inquiry.
As to why you might invest in writing as a skill to develop for yourself, you may consider what exactly the purpose and metric of writing is. If you are looking to strengthen your own inner resources and capabilities, to deepen your critical thinking and cognitive potency, then the evidence cited above suggests you practice focused and effortful writing using your own brain. That same study suggests you may not only fail to develop as a writer and thinker if you use AI for writing, but that you may become a worse writer and critical thinker as a result of offloading your cognitive load to the AI. If, however, your goal is to gain attention and approval, then a tool such as AI may be a faster and more reliable path to that. It depends on what your goals are as a human being and as a writer.
Thank you for the article. I’ll give it a read.
It’s not an easy answer. I’m a self-interested person, and I realized a while ago that many of my most productive and interesting relationships, both personal and in business, are the direct result of my activity on the internet. I already spend a lot of time commenting my thoughts, sometimes in long form, so I figure if I’m going to be reacting to things publicly, I might as well do so in the form of a blog where others might pick up on it. If that results in something good for me—influence, relationships, demonstration of niche intellectual ability that the right sort of people find interesting—then that’s not a small part of my motivation.
At the same time, I have more naive views about the virtue of just doing things for their own sake. Writing is definitely an excellent tool for fixing your own thoughts, as it forces you to communicate in a way that makes sense to other people, thus causing your own ideas to make sense to you. The problem with this line of thinking is that I’ve never been an exemplary writer in any sense, although hopefully I am better and more self-motivated than I used to be. I’m not satisfied with what I can currently write in long form unassisted, which causes a sort of writer’s block that I really hate.
I’m integrating the advice of other people into what I’m planning to do, and hopefully with enough effort I’ll be able to produce (with critique, but not rewriting, by AI) something that both satisfies my desire to write for its own sake and that other people might actually want to read. I also have the annoying consideration of being time-efficient. I by no means spend my time maximally efficiently, but struggling through writing burns a lot of my willpower, which ends up costing me a lot of time elsewhere.