Promoted to curated: This is a bit of a weird curation, given that in some sense this post is the result of a commission from the Lightcone team, but we had a good reason for making that commission.
I think building both cultural understanding and personal models of how to interface with AI systems is pretty important, and this post feels like one important step in building that understanding. There really does seem to be a common trap people fall into when interfacing with AI systems. Though I expect only a small minority of people on LW to need this exact advice, I do think the majority of readers of this essay will soon come to know people (whether family, friends, or colleagues) who have fallen into this attractor, and it will hopefully help them deal with that situation better.
Thank you for writing this!