Feedback Requested! Draft of a New About/Welcome Page for LessWrong

Context for Draft / Request for Feedback

The LessWrong team is hoping to soon display a new About/Welcome page which does an improved job of conveying what LessWrong is about and how community members can productively use the site.

However, LessWrong is a community site and I (plus the team) feel it's not appropriate for us to unilaterally declare what LessWrong is about. So here's our in-progress draft of a new About/Welcome page. Please let us know what you think in the comments. Please especially let us know if you think LessWrong is actually about something else. Or even just what it means to you.


<3 Ruby



The tl;dr

LessWrong is a community blog devoted to the art of human rationality.

We invite you to use this site for any number of reasons, including, but not limited to: learning valuable things, being entertained, sharing and getting feedback on your ideas, and participating in a community you like. However, fundamentally, this site is designed for two main uses:

  • As a place to level up your rationality

  • As a place to apply your rationality to important real-world problems

The primary things to do on LessWrong are:

Leveling up your rationality

First off, what is rationality?

Rationality is a term which can have different connotations to different people. On LessWrong, we mean something like the following:

  • Rationality is thinking in ways which systematically arrive at truth.

  • Rationality is thinking in ways which cause you to achieve your goals.

  • Rationality is trying to do better on purpose.

  • Rationality is reasoning well even in the face of massive uncertainty.

  • Rationality is making good decisions even when it's hard.

  • Rationality is being self-aware, understanding how your own mind works, and applying this knowledge to thinking better.

What rationality is not:

  • Forsaking all human emotion and intuition to embrace Cold Hard Logic.

Why should I care about rationality?

One reason to care about rationality is that you intrinsically care about having true beliefs. You might also care about rationality because you care about anything at all. Our ability to achieve our goals depends on 1) our ability to understand and predict the world, 2) having the skills to make good plans, and 3) having the self-knowledge and self-mastery to avoid falling into common pitfalls of human thinking. These core topics of rationality are of interest to anyone with non-trivial goals, from curing persistent insomnia and having fulfilling relationships to performing groundbreaking research or curing the world's greatest ills.

See also Why truth? And...

How does LessWrong help me level up my rationality?

A repository of rationality knowledge

LessWrong has an extensive Library containing hundreds of essays on rationality topics. You can get started on the Library page or from the homepage. Among the newer material, we particularly recommend Curated posts.

The writings of Eliezer Yudkowsky and Scott Alexander comprise the core readings of LessWrong. As part of the founding of LessWrong, Eliezer Yudkowsky wrote a long series of blog posts, originally known as The Sequences and more recently compiled into an edited volume, Rationality: From AI to Zombies.

Rationality: From AI to Zombies is a deep exploration of how human minds can come to understand the world they exist in—and all the reasons they so often fail to do so. Eliezer covers these topics and many more through allegory, anecdote, and scientific theory. He tests these ideas by applying them to debates in artificial intelligence (AI), physics, metaethics, and consciousness.

Eliezer also wrote Harry Potter and the Methods of Rationality (HPMOR), an alternative-universe version of Harry Potter in which Harry's adoptive parents raised him with Enlightenment ideals and the experimental spirit. This work introduces many of the ideas from Rationality: A-Z in a gripping narrative.

Scott Alexander's essays on how good reasoning works, how to learn from the institution of science, and the different ways society has been and could be organized have been made into a collection called The Codex. The Codex contains such exemplary essays as:

Members of LessWrong rely on many of the ideas from these writers in their own posts, so we advise reading at least a little of each author to get up to speed on LessWrong's background knowledge and culture.

Truth-seeking norms and culture

We are proud of the LessWrong community not just for its study of rationality, but also for how much these ideals and skills are put into practice. Unlike many social spaces on the modern Internet, LessWrong is a place where changing your mind, charity, scholarship, and many other virtues are cherished. LessWrong helps you improve your rationality by providing a space where healthy epistemic and conversational norms are encouraged and enforced.

Social support and reinforcement

Beyond culture and norms, it's easier to learn, change, and grow when you're not alone on your path. Find solidarity on your quest for greater rationality with the LessWrong community. You can participate in the conversations online (via the comments or by writing posts which build on the posts of others). Or attend a local in-person meetup, conference, or community celebration. In the last twelve months, there have been 461 meetups in 32 countries.

Opportunities to practice your rationality

See the next section.

Applying your rationality to important problems

Feedback and practice are crucial for mastery of skills. If you're not using your skills to do anything real, how do you even know whether you're on the right track? For this reason, LessWrong is a place where rationality is both trained and put to use.

Plus, it's nice to accomplish real things.

Ways to apply your rationality on LessWrong

Participate in discussions aimed at truth-seeking and self-improvement

On LessWrong, you can converse with others with the real goal of exchanging beliefs and converging on the truth. You can delight in dialogue which isn't about Being Right, but about actually clarifying the matter at hand. And you can work together with others, each of you providing your own understanding and background knowledge to figure out how reality really is. This is not Internet discussion as you know it.

While rationality, self-improvement, and AI are the most frequently discussed topics on the site, there are also commonly discussions of psychology, philosophy, decision theory, mathematics, computer science, physics, biology, history, sociology, meditation, and many other topics.

Core to LessWrong is that we want our online conversations to be productive, constructive, and oriented around determining what is true. Our Frontpage commenting guidelines ask members to:

Aim to explain, not persuade. Write your true reasons for believing something, not what you think is most likely to persuade others. Try to offer concrete models, make predictions, and note what would change your mind.
Present your own perspective. Make personal statements instead of statements that try to represent a group consensus (“I think X is wrong” vs. “X is generally frowned upon”). Avoid stereotypical arguments that will cause others to round you off to someone else they’ve encountered before. Tell people how you think about a topic, instead of repeating someone else’s arguments (e.g. “But Nick Bostrom says…”).
Get curious. If I disagree with someone, what might they be thinking; what are the moving parts of their beliefs? What model do I think they are running? Ask yourself—what about this topic do I not understand? What evidence could I get, or what evidence do I already have?

Once you’ve read some of LessWrong’s core material and read through some past comment-section discussions to get a sense of how we communicate around here, you’re ready to participate in a LessWrong discussion.

Post your valuable ideas

Our collective knowledge and skills are solidified by members writing posts. By writing posts, you benefit the world by sharing your knowledge and benefit yourself by getting feedback from an audience. Our audience will hold you to high standards of reasoning, yet in a cooperative and encouraging manner.

Posts on practically any topic are welcomed. We think it’s important that members can “bring their entire selves” to LessWrong and are able to share their thoughts, ideas, and experiences without fearing whether they are “on topic”. Rationality is not restricted to only specific domains in one’s life, and neither should LessWrong be.

However, to maintain its overall focus, LessWrong classifies posts as either Personal blogposts or as Frontpage posts. The latter have more visibility by default on the site.

All posts begin as personal blogposts. Authors can grant permission to LessWrong’s moderation team to give a post Frontpage status if it i) has broad relevance to LessWrong’s members, ii) is timeless, i.e. not tied to current events, and iii) primarily attempts to explain rather than persuade.

The not-perfectly-named category of “Personal” blogposts is suitable for everything which doesn’t fit on the Frontpage. It’s the right classification for discussions of niche topics, personal interests, current events, community concerns, potentially divisive topics, and just about anything else you want to write about.

See more in Site Guide: Personal Blogposts vs Frontpage Posts

Contribute to LessWrong’s Open Questions research platform

Open Questions was built to help apply the LessWrong community’s rationality and epistemic skills to humanity’s most important problems.