To clarify: I think what Habryka says about LW routinely promoting lots of content is overwhelmingly true (e.g. spotlights and curation), and this book is completely typical of what we’d promote to attention, i.e. high-quality writing and reasoning. I might say promotion is equivalent to an upvote, not an agree-vote.
I still think there are details in the promotion here that make inferring LW agreement and endorsement reasonable:
lack of disclaimers around disagreement (absence is evidence), together with a good prior that the LW team agrees a lot with the Eliezer/Nate view on AI risk
promoting during pre-order (which I do find surprising)
that we promoted this in a new way (I don’t think this is as strong evidence as I did before; mostly it’s that we’ve only recently started doing this for events and this is the first book to come along, so we might have done, and will do, the same for others). But maybe we wouldn’t have promoted it at all, or not as high-effort, absent agreement.
But responding to the OP, rather than motivation coming from narrow endorsement of thesis, I think a bunch of the motivation flows more from a willingness/desire to promote Eliezer[1] content, as (i) such content is reliably very good, and (ii) Eliezer founded LW and his writings make up the core writings that define so much of site culture and norms. We’d likely do the same for another major contributor, e.g. Scott Alexander.
I updated from when I first commented, thinking about what we’d do if Eliezer wrote something we felt less agreement over, and I think we’d do much the same. My current assessment is that the book placement is something like 80-95% neutral promotion of high-quality content the way we generally do, not because of endorsement, but maybe there’s a 5-20% chance it got extra effort/prioritization because we in fact endorse the message. Hard to say for sure.
I wonder if we could’ve simply added to the sidebar some text saying “By promoting Soares & Yudkowsky’s new book, we mean to say that it’s a great piece of writing on an important+interesting question by some great LessWrong writers, but are not endorsing the content of the book as ‘true’.”
Or shorter: “This promotion does not imply endorsement of object-level claims, simply that we think it’s a good intellectual contribution.”
Or perhaps a longer thing in a hover-over / footnote.
[1] and Nate