I’m curious why the section on “Applying Rationality” in the About page you cited doesn’t feel like an answer.
Applying Rationality
You might value Rationality for its own sake; however, many people want to be better reasoners so they can form more accurate beliefs about topics they care about and make better decisions.
Using LessWrong-style reasoning, contributors have written essays on an immense variety of topics, each time approaching the topic with a desire to know what’s actually true (not just what’s convenient or pleasant to believe), being deliberate about processing the evidence, and avoiding common pitfalls of human reasoning.
Beyond that, The Twelve Virtues of Rationality includes “scholarship” as the 11th virtue, and I think that’s a deep part of LessWrong’s culture and aims:
The eleventh virtue is scholarship. Study many sciences and absorb their power as your own. Each field that you consume makes you larger. If you swallow enough sciences the gaps between them will diminish and your knowledge will become a unified whole. If you are gluttonous you will become vaster than mountains. It is especially important to eat math and science which impinge upon rationality: evolutionary psychology, heuristics and biases, social psychology, probability theory, decision theory. But these cannot be the only fields you study. The Art must have a purpose other than itself, or it collapses into infinite recursion.
I would think it strange, though, if one could get better at reasoning and believing true things without actually trying to do that on specific cases. Maybe you could sketch out in more detail what you expect LW content to look like.
Thank you for your response. On reflection, I realize my original question was unclear. At its core is an intuition about the limits of critical thinking for the average person. If this intuition is valid, I believe some members of the community should, rationally, behave differently. While this kind of perspective doesn’t seem uncommon, I feel its implications may not be fully considered. I also didn’t realize how much this intuition influenced my thinking when writing the question. My thoughts on this are still unclear, and I remain uncertain about some of the underlying assumptions, so I won’t argue for it here.
Apologies for the confusion. I no longer endorse my question.