I have something like mixed feelings about the LW homepage being themed around “If Anyone Builds it, Everyone Dies”:
On the object level, it seems good for people to pre-order and read the book.
On the meta level, it seems like an endorsement of the book’s message. I like LessWrong’s niche as a neutral common space to rigorously discuss ideas (it’s the best open space for doing so that I’m aware of). Endorsing a particular thesis (rather than e.g. a set of norms for discussion of ideas) feels like it goes against this neutrality.
Huh, I personally am kind of hesitant about it, but not because it might cause people to think LessWrong endorses the message. We’ve promoted lots of stuff at the top of the frontpage before, and in-general promote lots of stuff with highly specific object-level takes. Like, whenever we curate something, or we create a spotlight for a post or sequence, we show it to lots of people, and most of the time what we promote is some opinionated object-level perspective.
I agree that if this were the only promotion of this kind we have done or will ever do, it would feel more like we were tipping the scales in some object-level discourse, but it feels very continuous with other kinds of content promotions we have done (e.g. I am hoping that we will do a similar promotion for some AI 2027 work we are collaborating on with the AI Futures Project, and also for other books that seem high-quality and are written by good authors; if any of the other top authors on LW were releasing a book, I would be pretty happy to do similar things).
The thing that makes me saddest is that ultimately the thing we are linking and promoting is something that current readers do not have the ability to actually evaluate on their own. It’s a pre-order for a book, not a specific already-written piece of content that the reader can evaluate for themselves, right now, and instead the only real thing you have to go off of is the social evidence around it, and that makes me sad. I really wish it was possible to share a bunch of excerpts and chapters of the book, which I think would both help with promoting it, and would allow for healthier discourse around it.
Like, in terms of the process that determines what content to highlight, I don’t think promoting the book is an outlier of any kind. I do think the book is just very high-quality (I read a preview copy) and I would obviously curate it if it were a post, independently of its object-level conclusions. I also expect it would score very highly in the annual review, and we would create a spotlight for it, and also do a thing where we promote it as a banner on the right as soon as we got that working for posts (we actually have a draft where we show image banners for a curated list of posts on the right instead of as spotlight items above the post list, but we haven’t gotten it to work reliably with the art we have for the posts. It’s a thing I’ve spent over 20 hours working on, which is hopefully some evidence that us promoting the book isn’t some break with our usual content promotion rules).
I really don’t like that the right time for the promotion is in the pre-order stage in this case, and possibly we should just not promote things that people can’t read at least immediately (and maybe never something that isn’t on LessWrong itself), but I feel pretty sad about that line (and e.g. think that something like AI 2027 seems like another good thing to promote similarly).
Maybe the crux is whether the dark color significantly degrades user experience. For me it clearly does, and my guess is that’s what Sam is referring to when he says “What is the LW team thinking? This promo goes far beyond anything they’ve done or that I expected they would do.”
For me, that’s why this promotion feels like a different reference class than seeing the curated posts on the top or seeing ads on the SSC sidebar.
Yes, the dark mode is definitely a more visually intense experience, though the reference class here is not curated posts at the top, but like, previous “giant banner on the right advertising a specific post, or meetup series or the LW books, etc.”.
I do think it’s still more intense than that, and I am going to ship some easier ways to opt out of that today, just haven’t gotten around to it (like, within 24 hours there should be a button that just gives you back whatever normal color scheme you previously had on the frontpage).
It’s pretty plausible the shift to dark mode is too intense, though that’s really not particularly correlated with this specific promotion, and would just be the result of me having a cool UI design idea that I couldn’t figure out a way to make work on light mode. If I had a similar idea for e.g. promoting the LW books, or LessOnline or some specific review winner, I probably would have done something similar.
If I open LW on my phone, clicking the X on the top right only makes the top banner disappear, but the dark theme remains. Relatedly, if it’s possible to disentangle how the frontpage looks on computer and phone, I would recommend removing the dark theme on phone altogether, you don’t see the cool space visuals on the phone anyway, so the dark theme is just annoying for no reason.
The thing that makes me saddest is that ultimately the thing we are linking and promoting is something that current readers do not have the ability to actually evaluate on their own
This has been nagging at me throughout the promotion of the book. I’ve preordered for myself and two other people, but only with caveats about how I haven’t read the book. I don’t feel comfortable doing more promotion without reading it[1] and it feels kind of bad that I’m being asked to.
I talked to Rob Bensinger about this, and I might be able to get a preview copy if it were a crux for a grand promotional plan, but not for more mild promotion.
What are examples of things that have previously been promoted on the front page? When I saw the IABIED-promo front page, I had an immediate reaction of “What is the LW team thinking? This promo goes far beyond anything they’ve done or that I expected they would do.” Maybe I’m forgetting something, or maybe there are past examples that feel like “the same basic thing” to you, but feel very different to me.
LessOnline (also, see the spotlights at the top for random curated posts):
LessOnline again:
LessWrong review vote:
Best of LessWrong results:
Best of LessWrong results (again):
The LessWrong books:
The HPMOR wrap parties:
Our fundraiser:
ACX Meetups everywhere:
We also either deployed for a bit, or almost deployed, a PR where individual posts that we have spotlights for (which is just a different kind of long-term curation) get shown as big banners on the right. I can’t currently find a screenshot of it, but it looked pretty similar to all the banners you see above for all the other stuff, just promoting individual posts.
To be clear, the current frontpage promotion is a bunch more intense than this!
Mostly this is because Ray/I had a cool UI design idea that we could only make work in dark mode, and so we by default inverted the color scheme for the frontpage, and also just because I got better as a designer and I don’t think I could have pulled off the current design a year ago. If I could do something as intricate/high-effort as this all year round for great content I want to promote, I would do it (and we might still find a way to do that; I still want to permanently publish the spotlight replacement where posts get highlighted on the right with cool art).
It’s plausible things ended up in too intense of a place for this specific promotion, but if so, that was centrally driven by wanting to do something cool that explores some UI design space, and I don’t think it was much correlated with this specific book launch.
Yeah, all of these feel pretty different to me than promoting IABIED.
A bunch of them are about events or content that many LW users will be interested in just by virtue of being LW users (e.g. the review, fundraiser, BoLW results, and LessOnline). I feel similarly about the highlighting of content posted to LW, especially given that that’s a central thing that a forum should do. I think the HPMOR wrap parties and ACX meetups feel slightly worse to me, but not too bad given that they’re just advertising meet-ups.
Why promoting IABIED feels pretty bad to me:
It’s a commercial product—this feels to me like typical advertising that cheapens LW’s brand. (Even though I think it’s very unlikely that Eliezer and Nate paid you to run the frontpage promo or that your motivation was to make them money.)
The book has a very clear thesis that it seems like you’re endorsing as “the official LW position.” Advertising e.g. HPMOR would also feel weird to me, but substantially less so, since HPMOR is more about rationality more generally and overlaps strongly with the sequences, which is centrally LW content. In other words, it feels like you’re implicitly declaring “P(doom) is high” to be a core tenet of LW discourse in the same way that e.g. truth-seeking is.
A bunch of them are about events or content that many LW users will be interested in just by virtue of being LW users (e.g. the review, fundraiser, BoLW results, and LessOnline). I feel similarly about the highlighting of content posted to LW, especially given that that’s a central thing that a forum should do. I think the HPMOR wrap parties and ACX meetups feel slightly worse to me, but not too bad given that they’re just advertising meet-ups.
I would feel quite sad if we culturally weren’t able to promote off-site content. Like, not all the best content in the world is on LW, indeed most of it is somewhere else, and the right sidebar is the place I intentionally carved out to link and promote content that doesn’t fit into existing LW content ontologies, and doesn’t exist e.g. as LW posts.
It seems clear that if any similar author was publishing something I would want to promote it as well. If someone was similarly respected by relevant people, and they published something off-site, whether it’s a fancy beige standalone website, or a book, or a movie, or an audiobook, or a video game, if it seems like the kind of thing that LW readers are obviously interested in reading, and I can stand behind quality-wise, then it would seem IMO worse for me culturally to have a prohibition against promoting it just because it isn’t on-site (not obviously so; there are benefits to everything promoted going through the same mechanisms of evaluation and voting and annual review, but overall, all things considered, it seems worse to me).
It’s a commercial product—this feels to me like typical advertising that cheapens LW’s brand. (Even though I think it’s very unlikely that Eliezer and Nate paid you to run the frontpage promo or that your motivation was to make them money.)
Yeah, I feel quite unhappy about this too, but I also felt like we broke that Schelling fence with both the LessOnline tickets and the LW fundraiser (both of which I was quite sad about). I really would like LW to not feel like a place that is selling you something, or is Out To Get You, and additional marginal things in that space are costly (and this is where a lot of my sadness about this is concentrated). I really wish the book was just a goddamn freely available website like AI 2027, though I also am in favor of people publishing ideas in a large variety of mediums.
(We did also sell our own books using a really very big frontpage banner, though somehow that feels different because it’s a collection of freely available LW essays, and you can just read them on the website, though we did put a big “buy” button at the top of the site)
The book has a very clear thesis that it seems like you’re endorsing as “the official LW position.” Advertising e.g. HPMOR would also feel weird to me, but substantially less so, since HPMOR is more about rationality more generally and overlaps strongly with the sequences, which is centrally LW content. In other words, it feels like you’re implicitly declaring “P(doom) is high” to be a core tenet of LW discourse in the same way that e.g. truth-seeking is.
I don’t really buy this part. We frequently spotlight and curate posts and content with similarly strong theses that I disagree with in lots of different ways, and I don’t think anyone thinks we endorse that as the “official LW position”.
I agree the promotions for that have been less intense, but I mostly hope to change that going forward. Most of the spotlights we have on the frontpage every day have some kind of strong thesis.
FWIW I also feel a bit bad about it being both commercial and also not literally a LW thing. (Both or neither seems less bad.) However, in this particular case, I don’t actually feel that bad about it—because this is a site founded by Yudkowsky! So it kind of is a LW thing.
We frequently spotlight and curate posts and content with similarly strong theses that I disagree with in lots of different ways, and I don’t think anyone thinks we endorse that as the “official LW position”.
Curating and promoting well-executed LW content—including content that argues for specific theses—feels totally fine to me. (Though I think it would be bad if it were the case that content that argues for favored theses was held to a lower standard.) I guess I view promoting “best of [forum]” content to be a central thing that a forum should do.
It seems like you don’t like this way of drawing boundaries and just want to promote the best content without prejudice for whether it was posted to LW. Maybe if LW had a track record of doing this, such that I understood promoting IABIED as part of a general ethos for content promotion, then I wouldn’t have reacted as strongly. But from my perspective this is one of the first times that you’ve promoted non-LW content, so my guess was that the book was being promoted as an exception to typical norms because you felt it was urgent to promote the book’s message, which felt soldier-mindsetty to me.
(I’d probably feel similarly about an AI 2027 promo, as much as I think they did great work.)
I think you could mitigate this by establishing a stronger track record of promoting excellent off-LW content that is less controversial (e.g. not a commercial product or doesn’t have as strong or divisive a thesis). E.g. you could highlight the void (and not just the LW x-post of it).
I also felt like we broke that Schelling fence with both the LessOnline tickets and the LW fundraiser (both of which I was quite sad about).
Even with the norm having already been broken, I think promoting commercial content still carries an additional cost. (Seems like you might agree, but worth stating explicitly.)
I think you could mitigate this by establishing a stronger track record of promoting excellent off-LW content that is less controversial (e.g. not a commercial product or doesn’t have as strong or divisive a thesis). E.g. you could highlight the void (and not just the LW x-post of it).
I think this is kind of fair, but also, I don’t super feel like I want LW to draw such harsh lines here. Ideally we would do more curation of off-site content, and pull off-site content more into the conversation, instead of putting up higher barriers we need to pass to do things with external content.
I do also really think we’ve been planning to do a bunch of this for a while, and mostly been bottlenecked on design capacity, and my guess is within a year we’ll have established more of a track record here that will make you feel more comfortable with our judgement. I think it’s reasonable to have at least some distrust here.
Even with the norm having already been broken, I think promoting commercial content still carries an additional cost. (Seems like you might agree, but worth stating explicitly.)
Fwiw, it feels to me like we’re endorsing the message of the book with this placement. Changing the theme is much stronger than just a spotlight or curation, not to mention that it’s pre-order promotion.
To clarify here, I think what Habryka says about LW generally promoting lots of content being normal is overwhelmingly true (e.g. spotlights and curation), and this book is completely typical of what we’d promote to attention, i.e. high-quality writing and reasoning. I might say promotion is equivalent to upvote, not to agree-vote.
I still think there are details in the promotion here that make inferring LW agreement and endorsement reasonable:
lack of disclaimers around disagreement (absence is evidence) together with a good prior that the LW team agrees a lot with Eliezer/Nate’s view on AI risk
promoting during pre-order (which I do find surprising)
that we promoted this in a new way (I don’t think this is as strong evidence as it might seem; mostly it’s that we’ve only recently started doing this for events and this is the first book to come along, and we might well do it for others). But maybe we wouldn’t have done it, or done it at this level of effort, absent agreement.
But responding to the OP, rather than motivation coming from narrow endorsement of thesis, I think a bunch of the motivation flows more from a willingness/desire to promote Eliezer[1] content, as (i) such content is reliably very good, and (ii) Eliezer founded LW and his writings make up the core writings that define so much of site culture and norms. We’d likely do the same for another major contributor, e.g. Scott Alexander.
I updated from when I first commented by thinking about what we’d do if Eliezer wrote something we felt less agreement over, and I think we’d do much the same. My current assessment is that the book placement is something like ~80-95% neutral promotion of high-quality content the way we generally do it, not because of endorsement; but maybe there’s a 5-20% chance it got extra effort/prioritization because we in fact endorse the message. Hard to say for sure.
I wonder if we could’ve simply added to the sidebar some text saying “By promoting Soares & Yudkowsky’s new book, we mean to say that it’s a great piece of writing on an important+interesting question by some great LessWrong writers, but are not endorsing the content of the book as ‘true’.”
Or shorter: “This promotion does not imply endorsement of object level claims, simply that we think it’s a good intellectual contribution.”
Or perhaps a longer thing in a hover-over / footnote.
I do think the book is just very high-quality (I read a preview copy) and I would obviously curate it if it was a post, independently of its object-level conclusions.
Would you similarly promote a very high-quality book arguing against AI xrisk by a valued LessWrong member (let’s say titotal)?
I’m fine with the LessWrong team not being neutral about AI xrisk. But I do suspect that this promotion could discourage AI risk sceptics from joining the platform.
Yeah, same as Ben. If Hanson or Scott Alexander wrote something on the topic I disagreed with, but it was similarly well-written, I would be excited to do something similar. Eliezer is of course more core to the site than approximately anyone else, so his authorship weight is heavier, which is part of my thinking on this. I think Bostrom’s Deep Utopia was maybe a bit too niche, but I am not sure; I think it’s pretty plausible I would have done something for that if he had asked.
I’d do it for Hanson, for instance, if it indeed were very high-quality. I expect I’d learn a lot from such a book about economics and futurism and so forth.
I was also concerned about this when the idea first came up, and think it good & natural that you brought it up.
My concerns were assuaged after I noticed I would be similarly happy to promote a broad class of things by excellent bloggers around these parts that would include:
A new book by Bostrom
A new book by Hanson
HPMOR (if it were ever released in physical form, which to be clear I don’t expect to exist)
A Gwern book (which is v unlikely to exist, to be clear)
UNSONG as a book
Like, one of the reasons I’m really excited about this book is the quality of the writing, because Nate & Eliezer are some of the best historical blogging contributors around these parts. I’ve read a chunk of the book and I think it’s really well-written and explains a lot of things very well, and that’s something that would excite me and many readers of LessWrong regardless of topic (e.g. if Eliezer were releasing Inadequate Equilibria or Highly Advanced Epistemology 101 as a book, I would be excited to get the word out about it in this way).
Another relevant factor to consider here is that a key goal with the book is mass-market success in a way that none of the other books I listed are, and so I think it’s more likely that they make this ask. I think it would be somewhat unfortunate if this was the only content that got this sort of promotion, but I hope that this helps promote to others’ attention that we’re actually up for this for good bloggers/writers, and means we do more of it in the future.
(Added: I view this as similar to the ads that Scott put on the sidebar of SlateStarCodex, which always felt pretty fun & culturally aligned to me.)
As one of the people who worked on the IABIED banner: I do feel like it’s spending down a fairly scarce resource of “LW being a place without ads” (and some adjacent things). I also agree, somewhat contra habryka, that overly endorsing object-level ideas is somewhat wonky. We do it with curation, but we also put some effort into using that to promote a variety of ideas of different types, and we sometimes curate things we don’t fully agree with if we think they’re well argued, and I think it comes across that we are trying to promote “idea quality” there more than a particular agenda.
Counterbalancing that: I dunno man I think this is just really fucking important, and worth spending down some points on.
(I tend to be more hesitant than the rest of the LW team about doing advertisingy things, if I were in charge we would have done somewhat less heavy promotion of LessOnline)
Interesting. To me LessWrong totally does not feel like a neutral space, though not in a way I personally find particularly objectionable. As a social observation, most of the loud people here think that x-risk from AI is a very big deal and buy into various clusters of beliefs, and if I did not buy into those, I would probably be much less interested in spending time here.
More specifically, from the perspective of the Lightcone team, some of them are pretty outspoken and have specific views on safety in the broader ecosystem, which I sometimes agree with and often disagree with. I’m comfortable disagreeing with them on this site, but it feels odd to consider LessWrong neutral when the people running it have strong public takes.
Though maybe you mean neutral in the specific sense of “not using any hard power as a result of running the site to favour viewpoints they like”? I largely haven’t observed that (though I’m sure there’s some of it in terms of which posts get curated, even if they make an effort to be unbiased), and I agree this promotion could be considered an example of it.
A major factor for me is the extent that they expect the book to bring new life into the conversation about AI Safety. One problem with running a perfectly neutral forum is that people explore 1000 different directions at the cost of moving the conversation forward. There’s a lot of value in terms of focusing people’s attention in the same direction such that progress can be made.
(I downvoted this because it seems like the kind of thing that will spark lots of unproductive discussion. Like in some senses LessWrong is of course a neutral common space. In many ways it isn’t.
I feel like people will just take this statement as some kind of tribal flag. I think there are many good critiques about both what LW should aspire to in terms of neutrality, and what it currently is, but this doesn’t feel like the start of a good conversation about that. If people do want to discuss it I would be very happy to talk about it though.)
This is not straightforward to me: I can’t see how LessWrong is any less of a neutral or common space than a taxpayer-funded, bureaucratically governed library, or an algorithmically served news feed on an advertiser-supported platform like Facebook, or “community center” event spaces that are biased towards a community and common only to that community. I’m not sure what your idea of neutrality, or commonality, is.
Different people will understand it differently! LW is of course aspiring to a bunch of really crucial dimensions of neutrality, and discussions of neutrality make up a solid two-digit percentage of the LessWrong team’s internal discussions. We might fail at them, but we definitely aspire to them.
Some ways I really care about neutrality and think LessWrong is neutral:
If the LW team disagrees with someone we don’t ban them or try to censor them, if they follow good norms of discourse
If the LW team thinks a conclusion is really good for people to arrive at, we don’t promote it beyond the weight of the arguments for that conclusion
We keep voting anonymous to allow people to express opinions about site content without fear of retribution
We try really hard culturally to avoid party lines on object-level issues, and try to keep the site culture focused on shared principles of discussion and inquiry
I could go into the details, but this is indeed the conversation that I felt like wouldn’t go well in this context.
I agree that the banner is in conflict with some aspects of neutrality! Some of which I am sad about, some of which I endorse, some of which I regret (and might still change today or tomorrow).
Of course LessWrong is not just “a website” to me. You can read my now almost full decade of writing and arguing with people about the principles behind LessWrong, and the extremely long history of things like the frontpage/personal distinction which has made many many people who would like to do things like promote their job ads or events or fellowships on our frontpage angry at me.
The website may or may not be neutral, but it’s obvious that the project is not neutral.
Look, the whole reason why this conversation seemed like it would go badly is because you keep using big words without defining them and then asserting absolutes with them. I don’t know what you mean by “the project is not neutral”, and I think the same is true for almost all other readers.
Do you mean that the project is used for local political ends? Do you mean that the project has epistemic standards? Do you mean that the project is corrupt? Do you mean that the project is too responsive to external political forces? Do you mean that the project is arbitrary and unfair in ways that aren’t necessarily the result of what any individual wants, but still have too much noise to be called “neutral”? I don’t know; all of these are reasonable things someone might mean by “neutrality” in one context or another, and I don’t really want to have a conversation where people just throw around big words like this without at least some awareness of the ambiguity.
LessWrong is not neutral because it is built on the principle that a walled garden ought to be defended from pests and uncharitable discourse. That politics can kill minds. Out of all possible distributions of human interaction we could have on the internet, we pick this narrow band because that’s what makes for high-quality interaction. It makes us well calibrated (relative to baseline). It makes us more willing to ignore status plays and disagree with our idols.
All these things I love are not neutrality. They are deliberate policies for a less wrong discourse. LessWrong is all the better because it is not neutral. And just because neutrality is a high-status word, the kind an impartial judge might seem to embody, doesn’t mean we should lay claim to it.
FWIW I do aspire to things discussed in Sarah Constantin’s Neutrality essay. For instance, I want it to be true that regardless of whether your position is popular or unpopular, your arguments will be evaluated on their merits on LessWrong. (This can never be perfectly true but I do think it is the case that in comments people primarily respond to arguments with counterarguments rather than with comments about popularity or status and so on, which is not the case in almost any other part of the public internet.)
Fair. In Sarah Constantin’s terminology, it seems you aspire to “potentially take a stand on the controversy, but only when a conclusion emerges from an impartial process that a priori could have come out either way”. I… really don’t know if I’d call that neutrality in the sense of the normal daily usage of neutrality. But I think it is a worthy and good goal.
I have something like mixed feelings about the LW homepage being themed around “If Anyone Builds it, Everyone Dies”:
On the object level, it seems good for people to pre-order and read the book.
On the meta level, it seems like an endorsement of the book’s message. I like LessWrong’s niche as a neutral common space to rigorously discuss ideas (it’s the best open space for doing so that I’m aware of). Endorsing a particular thesis (rather than e.g. a set of norms for discussion of ideas) feels like it goes against this neutrality.
Huh, I personally am kind of hesitant about it, but not because it might cause people to think LessWrong endorses the message. We’ve promoted lots of stuff at the top of the frontpage before, and in-general promote lots of stuff with highly specific object-level takes. Like, whenever we curate something, or we create a spotlight for a post or sequence, we show it to lots of people, and most of the time what we promote is some opinionated object-level perspective.
I agree if this was the only promotion of this kind we have done or will ever do, that it would feel more like we are tipping the scales in some object-level discourse, but it feels very continuous with other kinds of content promotions we have done (and e.g. I am hoping that we will do a kind of similar promotion for some AI 2027 work we are collaborating on with the AI Futures Project, and also for other books that seem high-quality and are written by good authors, like if any of the other top authors on LW were releasing a book, I would be pretty happy to do similar things).
The thing that makes me saddest is that ultimately the thing we are linking and promoting is something that current readers do not have the ability to actually evaluate on their own. It’s a pre-order for a book, not a specific already written piece of content that the reader can evaluate for themselves, right now, and instead the only real things you have to go off of is the social evidence around it, and that makes me sad. I really wish it was possible to share a bunch of excerpts and chapters of the book, which I think would both help with promoting it, and would allow for healthier discourse around it.
Like, in terms of the process that determines what content to highlight, I don’t think promoting the book is an outlier of any kind. I do think the book is just very high-quality (I read a preview copy) and I would obviously curate it if it was a post, independently of its object-level conclusions. I also expect it would score very highly in the annual review, and we would create a spotlight for it, and also do a thing where we promote is as a banner on the right as soon as we got that working for posts (we actually have a draft where we show image banners for a curated list of posts on the right instead of as spotlight items above the post list, but we haven’t gotten it to work reliably with the art we have for the post. It’s a thing I’ve spent over 20 hours working on, which is hopefully some evidence that us promoting the book isn’t some break with our usual content promotion rules).
I really don’t like that the right time for the promotion is in the pre-order stage in this case, and possibly we should just not promote things that people can’t read at least immediately (and maybe never something that isn’t on LessWrong itself), but I feel pretty sad about that line (and e.g. think that something like AI 2027 seems like another good thing to promote similarly).
Maybe the crux is whether the dark color significantly degrades user experience. For me it clearly does, and my guess is that’s what Sam is referring to when he says “What is the LW team thinking? This promo goes far beyond anything they’ve done or that I expected they would do.”
For me, that’s why this promotion feels like a different reference class than seeing the curated posts on the top or seeing ads on the SSC sidebar.
Yes, the dark mode is definitely a more visually intense experience, though the reference class here is not curated posts at the top, but like, previous “giant banner on the right advertising a specific post, or meetup series or the LW books, etc.”.
I do think it’s still more intense than that, and I am going to ship some easier ways to opt out of that today, just haven’t gotten around to it (like, within 24 hours there should be a button that just gives you back whatever normal color scheme you previously had on the frontpage).
It’s pretty plausible the shift to dark mode is too intense, though that’s really not particularly correlated with this specific promotion, and would just be the result of me having a cool UI design idea that I couldn’t figure out a way to make work on light mode. If I had a similar idea for e.g. promoting the LW books, or LessOnline or some specific review winner, I probably would have done something similar.
@David Matolcsi There is now a button in the top right corner of the frontpage you can click to disable the whole banner!
If I open LW on my phone, clicking the X on the top right only makes the top banner disappear, but the dark theme remains.
Relatedly, if it’s possible to disentangle how the frontpage looks on computer and phone, I would recommend removing the dark theme on phone altogether: you don’t see the cool space visuals on the phone anyway, so the dark theme is just annoying for no reason.
Yep, this is on my to-do list for the day, was just kind of hard to do for dumb backend reasons.
This too is now done.
it’s pretty neat how much lighter it is than normal, while still being quite dark!
Have you A/B tested dark mode on new users? I suspect it would be a better default.
Makes it much harder to see what specific part of a comment a react is responding to, when you hover over it.
That seems like a straightforward bug to me. I didn’t even know that feature was supposed to exist :p
This has been nagging at me throughout the promotion of the book. I’ve preordered for myself and two other people, but only with caveats about how I haven’t read the book. I don’t feel comfortable doing more promotion without reading it[1] and it feels kind of bad that I’m being asked to.
I talked to Rob Bensinger about this, and I might be able to get a preview copy if it were a crux for a grand promotional plan, but not for more mild promotion.
What are examples of things that have previously been promoted on the front page? When I saw the IABIED-promo front page, I had an immediate reaction of “What is the LW team thinking? This promo goes far beyond anything they’ve done or that I expected they would do.” Maybe I’m forgetting something, or maybe there are past examples that feel like “the same basic thing” to you, but feel very different to me.
Some things we promoted in the right column:
LessOnline (also, see the spotlights at the top for random curated posts):
LessOnline again:
LessWrong review vote:
Best of LessWrong results:
Best of LessWrong results (again):
The LessWrong books:
The HPMOR wrap parties:
Our fundraiser:
ACX Meetups everywhere:
We also either deployed for a bit, or almost deployed, a PR where individual posts that we have spotlights for (which is just a different kind of long-term curation) get shown as big banners on the right. I can’t currently find a screenshot of it, but it looked pretty similar to all the banners you see above for all the other stuff, just promoting individual posts.
To be clear, the current frontpage promotion is a bunch more intense than this!
Mostly this is because Ray/I had a cool UI design idea that we could only make work in dark mode, and so we by default inverted the color scheme for the frontpage, and also just because I got better as a designer and I don’t think I could have pulled off the current design a year ago. If I could do something as intricate/high-effort as this all year round for great content I want to promote, I would do it (and we might still find a way to do that; I still want to permanently publish the spotlight replacement where posts get highlighted on the right with cool art).
It’s plausible things ended up in too intense of a place for this specific promotion, but if so, that was centrally driven by wanting to do something cool that explores some UI design space, and I don’t think it was much correlated with this specific book launch.
Yeah, all of these feel pretty different to me than promoting IABIED.
A bunch of them are about events or content that many LW users will be interested in just by virtue of being LW users (e.g. the review, fundraiser, BoLW results, and LessOnline). I feel similarly about the highlighting of content posted to LW, especially given that that’s a central thing that a forum should do. I think the HPMOR wrap parties and ACX meetups feel slightly worse to me, but not too bad given that they’re just advertising meet-ups.
Why promoting IABIED feels pretty bad to me:
It’s a commercial product—this feels to me like typical advertising that cheapens LW’s brand. (Even though I think it’s very unlikely that Eliezer and Nate paid you to run the frontpage promo or that your motivation was to make them money.)
The book has a very clear thesis that it seems like you’re endorsing as “the official LW position.” Advertising e.g. HPMOR would also feel weird to me, but substantially less so, since HPMOR is about rationality more generally and overlaps strongly with the Sequences, which are centrally LW content. In other words, it feels like you’re implicitly declaring “P(doom) is high” to be a core tenet of LW discourse in the same way that e.g. truth-seeking is.
I would feel quite sad if we culturally weren’t able to promote off-site content. Like, not all the best content in the world is on LW, indeed most of it is somewhere else, and the right sidebar is the place I intentionally carved out to link and promote content that doesn’t fit into existing LW content ontologies, and doesn’t exist e.g. as LW posts.
It seems clear that if any similar author was publishing something, I would want to promote it as well. If someone similarly respected by relevant people published something off-site, whether it’s a fancy beige standalone website, or a book, or a movie, or an audiobook, or a video game, and it seems like the kind of thing that LW readers are obviously interested in reading, and I can stand behind it quality-wise, then it would seem IMO worse for me culturally to have a prohibition against promoting it just because it isn’t on-site (not obviously, since there are benefits to everything promoted going through the same mechanisms of evaluation and voting and annual review, but overall, all things considered, it seems worse to me).
Yeah, I feel quite unhappy about this too, but I also felt like we broke that Schelling fence with both the LessOnline tickets and the LW fundraiser (both of which I was quite sad about). I really would like LW to not feel like a place that is selling you something, or is Out To Get You, and additional marginal things in that space are costly (which is where a lot of my sadness about this is concentrated). I really wish the book was just a goddamn freely available website like AI 2027, though I am also in favor of people publishing ideas in a large variety of mediums.
(We did also sell our own books using a really very big frontpage banner, though somehow that feels different because it’s a collection of freely available LW essays, and you can just read them on the website, though we did put a big “buy” button at the top of the site)
I don’t really buy this part. We frequently spotlight and curate posts and content with similarly strong theses that I disagree with in lots of different ways, and I don’t think anyone thinks we endorse that as the “official LW position”.
I agree the promotions for that have been less intense, but I mostly hope to change that going forward. Most of the spotlights we have on the frontpage every day have some kind of strong thesis.
FWIW I also feel a bit bad about it being both commercial and also not literally a LW thing. (Both or neither seems less bad.) However, in this particular case, I don’t actually feel that bad about it—because this is a site founded by Yudkowsky! So it kind of is a LW thing.
Curating and promoting well-executed LW content—including content that argues for specific theses—feels totally fine to me. (Though I think it would be bad if it were the case that content that argues for favored theses was held to a lower standard.) I guess I view promoting “best of [forum]” content to be a central thing that a forum should do.
It seems like you don’t like this way of drawing boundaries and just want to promote the best content without prejudice for whether it was posted to LW. Maybe if LW had a track record of doing this, such that I understood promoting IABIED to be part of a general ethos for content promotion, then I wouldn’t have reacted as strongly. But from my perspective this is one of the first times that you’ve promoted non-LW content, so my guess was that the book was being promoted as an exception to typical norms because you felt it was urgent to promote the book’s message, which felt soldier-mindsetty to me.
(I’d probably feel similarly about an AI 2027 promo, as much as I think they did great work.)
I think you could mitigate this by establishing a stronger track record of promoting excellent off-LW content that is less controversial (e.g. not a commercial product or doesn’t have as strong or divisive a thesis). E.g. you could highlight the void (and not just the LW x-post of it).
Even with the norm having already been broken, I think promoting commercial content still carries an additional cost. (Seems like you might agree, but worth stating explicitly.)
I think this is kind of fair, but also, I don’t super feel like I want LW to draw such harsh lines here. Ideally we would do more curation of off-site content, and pull off-site content more into the conversation, instead of putting up higher barriers we need to pass to do things with external content.
I do also really think we’ve been planning to do a bunch of this for a while, and mostly been bottlenecked on design capacity, and my guess is within a year we’ll have established more of a track record here that will make you feel more comfortable with our judgement. I think it’s reasonable to have at least some distrust here.
Yep, agree.
Fwiw, it feels to me like we’re endorsing the message of the book with this placement. Changing the theme is much stronger than just a spotlight or curation, not to the mention that it’s pre-order promotion.
To clarify here, I think what Habryka says about LW generally promoting lots of content being normal is overwhelmingly true (e.g. spotlights and curation), and this book is completely typical of what we’d promote to attention, i.e. high-quality writing and reasoning. I might say promotion is equivalent to an upvote, not to an agree-vote.
I still think there are details in the promotion here that make inferring LW agreement and endorsement reasonable:
lack of disclaimers around disagreement (absence is evidence) together with a good prior that LW team agrees a lot with Eliezer/Nate view on AI risk
promoting during pre-order (which I do find surprising)
that we promoted this in a new way (I don’t think this is as strong evidence as it might seem; mostly it’s that we’ve only recently started doing this for events and this is the first book to come along, and we might well do it for others). But maybe we wouldn’t have done it, or done it as high-effort, absent agreement.
But responding to the OP, rather than motivation coming from narrow endorsement of thesis, I think a bunch of the motivation flows more from a willingness/desire to promote Eliezer[1] content, as (i) such content is reliably very good, and (ii) Eliezer founded LW and his writings make up the core writings that define so much of site culture and norms. We’d likely do the same for another major contributor, e.g. Scott Alexander.
I updated from when I first commented, thinking about what we’d do if Eliezer wrote something we felt less agreement over, and I think we’d do much the same. My current assessment is that the book placement is something like 80-95% neutral promotion of high-quality content the way we generally do it, not because of endorsement, but maybe there’s a 5-20% chance it got extra effort/prioritization because we in fact endorse the message; hard to say for sure.
and Nate
I wonder if we could’ve simply added to the sidebar some text saying “By promoting Soares & Yudkowsky’s new book, we mean to say that it’s a great piece of writing on an important+interesting question by some great LessWrong writers, but are not endorsing the content of the book as ‘true’.”
Or shorter: “This promotion does not imply endorsement of object level claims, simply that we think it’s a good intellectual contribution.”
Or perhaps a longer thing in a hover-over / footnote.
Would you similarly promote a very high-quality book arguing against AI xrisk by a valued LessWrong member (let’s say titotal)?
I’m fine with the LessWrong team not being neutral about AI xrisk. But I do suspect that this promotion could discourage AI risk sceptics from joining the platform.
Yeah, same as Ben. If Hanson or Scott Alexander wrote something on the topic I disagreed with, but it was similarly well-written, I would be excited to do something similar. Eliezer is of course more core to the site than approximately anyone else, so his authorship weight is heavier, which is part of my thinking on this. I think Bostrom’s Deep Utopia was maybe a bit too niche, but I am not sure, I think pretty plausible I would have done something for that if he had asked.
I’d do it for Hanson, for instance, if it indeed were very high-quality. I expect I’d learn a lot from such a book about economics and futurism and so forth.
Personally, I don’t have mixed feelings, I just dislike it.
I was also concerned about this when the idea first came up, and think it good & natural that you brought it up.
My concerns were assuaged after I noticed I would be similarly happy to promote a broad class of things by excellent bloggers around these parts that would include:
A new book by Bostrom
A new book by Hanson
HPMOR (if it were ever released in physical form, which to be clear I don’t expect to exist)
A Gwern book (which is v unlikely to exist, to be clear)
UNSONG as a book
Like, one of the reasons I’m really excited about this book is the quality of the writing, because Nate & Eliezer are some of the best historical blogging contributors around these parts. I’ve read a chunk of the book and I think it’s really well-written and explains a lot of things very well, and that’s something that would excite me and many readers of LessWrong regardless of topic (e.g. if Eliezer were releasing Inadequate Equilibria or Highly Advanced Epistemology 101 as a book, I would be excited to get the word out about it in this way).
Another relevant factor to consider here is that a key goal with the book is mass-market success, in a way that isn’t true of any of the other books I listed, and so I think it’s more likely that they make this ask. I think it would be somewhat unfortunate if this was the only content that got this sort of promotion, but I hope that this helps promote to others’ attention that we’re actually up for this for good bloggers/writers, and means we do more of it in the future.
(Added: I view this as similar to the ads that Scott put on the sidebar of SlateStarCodex, which always felt pretty fun & culturally aligned to me.)
As one of the people who worked on the IABIED banner: I do feel like it’s spending down a fairly scarce resource (LW not being a place with ads, and some adjacent things). I also agree, somewhat contra habryka, that overly endorsing object-level ideas is somewhat wonky. We do it with curation, but we also put some effort into using that to promote a variety of ideas of different types, and we sometimes curate things we don’t fully agree with if we think they’re well argued, and I think it comes across that we are trying to promote “idea quality” there more than a particular agenda.
Counterbalancing that: I dunno man I think this is just really fucking important, and worth spending down some points on.
(I tend to be more hesitant than the rest of the LW team about doing advertisingy things, if I were in charge we would have done somewhat less heavy promotion of LessOnline)
Interesting. To me LessWrong totally does not feel like a neutral space, though not in a way I personally find particularly objectionable. As a social observation, most of the loud people here think that x-risk from AI is a very big deal and buy into various clusters of beliefs, and if I did not buy into those, I would probably be much less interested in spending time here.
More specifically, from the perspective of the Lightcone team, some of them are pretty outspoken and have specific views on safety in the broader ecosystem, which I sometimes agree with and often disagree with. I’m comfortable disagreeing with them on this site, but it feels odd to consider LessWrong neutral when the people running it have strong public takes.
Though maybe you mean neutral in the specific sense of “not using any hard power as a result of running the site to favour viewpoints they like”? Which I largely haven’t observed (though I’m sure there’s some of this in terms of which posts get curated, even if they make an effort to be unbiased), and I agree this could be considered an example of that.
A major factor for me is the extent that they expect the book to bring new life into the conversation about AI Safety. One problem with running a perfectly neutral forum is that people explore 1000 different directions at the cost of moving the conversation forward. There’s a lot of value in terms of focusing people’s attention in the same direction such that progress can be made.
lesswrong is not a neutral common space.
(I downvoted this because it seems like the kind of thing that will spark lots of unproductive discussion. Like in some senses LessWrong is of course a neutral common space. In many ways it isn’t.
I feel like people will just take this statement as some kind of tribal flag. I think there are many good critiques about both what LW should aspire to in terms of neutrality, and what it currently is, but this doesn’t feel like the start of a good conversation about that. If people do want to discuss it I would be very happy to talk about it though.)
Here are some examples of neutral common spaces:
Libraries
Facebook (usually)
Community center event spaces
Here are some examples of spaces which are not neutral or common:
The alignment forum
The NYT (or essentially any newspaper’s) opinions column
The EA forum
Lesswrong
This seems straightforwardly true to me. I’m not sure what tribe it’s supposed to be a flag for.
This is not straightforward to me:
I can’t see how Lesswrong is any less of a neutral or common space than a taxpayer-funded, bureaucratically governed library, or an algorithmically served news feed on an advertiser-supported platform like Facebook, or “community center” event spaces that are biased towards a community and common only to that community. I’m not sure what your idea of neutrality or commonality is.
Different people will understand it differently! LW is of course aspiring to a bunch of really crucial dimensions of neutrality, and discussions of neutrality make up like a solid two-digit percentage of the LessWrong team’s internal discussions. We might fail at them, but we definitely aspire to them.
Some ways I really care about neutrality and think LessWrong is neutral:
If the LW team disagrees with someone we don’t ban them or try to censor them, if they follow good norms of discourse
If the LW team thinks a conclusion is really good for people to arrive at, we don’t promote it beyond the weight of the arguments for that conclusion
We keep voting anonymous to allow people to express opinions about site content without fear of retribution
We try really hard culturally to avoid party lines on object-level issues, and try to keep the site culture focused on shared principles of discussion and inquiry
I could go into the details, but this is indeed the conversation that I felt like wouldn’t go well in this context.
Okay, this does raise the question of why the “if anyone builds it, everyone dies” frontpage?
I think that the difference in how we view this is because to me, lesswrong is a community / intellectual project. To you it’s a website.
The website may or may not be neutral, but it’s obvious that the project is not neutral.
I agree that the banner is in conflict with some aspects of neutrality! Some of which I am sad about, some of which I endorse, some of which I regret (and might still change today or tomorrow).
Of course LessWrong is not just “a website” to me. You can read my now almost full decade of writing and arguing with people about the principles behind LessWrong, and the extremely long history of things like the frontpage/personal distinction which has made many many people who would like to do things like promote their job ads or events or fellowships on our frontpage angry at me.
Look, the whole reason why this conversation seemed like it would go badly is because you keep using big words without defining them and then asserting absolutes with them. I don’t know what you mean by “the project is not neutral”, and I think the same is true for almost all other readers.
Do you mean that the project is used for local political ends? Do you mean that the project has epistemic standards? Do you mean that the project is corrupt? Do you mean that the project is too responsive to external political forces? Do you mean that the project is arbitrary and unfair in ways that aren’t necessarily caused by what any individual wants, but still have too much noise to be called “neutral”? I don’t know; all of these are reasonable things someone might mean by “neutrality” in one context, and I don’t really want to have a conversation where people just throw around big words like this without at least some awareness of the ambiguity.
I don’t think Cole is wrong.
Lesswrong is not neutral because it is built on the principle that a walled garden ought to be defended from pests and uncharitable principles, where politics can kill minds. Out of all possible distributions of human interaction we could have on the internet, we pick this narrow band because that’s what makes for high-quality interaction. It makes us well calibrated (relative to baseline). It makes us more willing to ignore status plays and disagree with our idols.
All these things I love are not neutrality. They are deliberate policies for a less wrong discourse. Lesswrong is all the better because it is not neutral. And just because neutrality is a high-status word, conjuring the image of an impartial judge, doesn’t mean we should lay claim to it.
FWIW I do aspire to things discussed in Sarah Constantin’s Neutrality essay. For instance, I want it to be true that regardless of whether your position is popular or unpopular, your arguments will be evaluated on their merits on LessWrong. (This can never be perfectly true but I do think it is the case that in comments people primarily respond to arguments with counterarguments rather than with comments about popularity or status and so on, which is not the case in almost any other part of the public internet.)
Fair. In Sarah Constantin’s terminology, it seems you aspire to “potentially take a stand on the controversy, but only when a conclusion emerges from an impartial process that a priori could have come out either way”. I… really don’t know if I’d call that neutrality in the sense of the normal daily usage of neutrality. But I think it is a worthy and good goal.