By the way, I have a slightly disturbing feeling that too little of the newer material here is actually devoted to refining the art of human rationality.
Part of that mission is to help people overcome the absurdity heuristic, and to help them think carefully about topics that normally trigger a knee-jerk reflex of dismissal on spurious grounds; it is in this sense that cryonics and the like are more than tangentially related to rationality.
I do agree with you that too much of the newer material keeps returning to those few habitual topics that are “superstimuli” for the heuristic. This perhaps prevents us from reaching out to newer people as effectively as we could. (Then again, as LW regulars we are biased in that we mostly look at what gets posted, when what may matter more for attracting and keeping new readers is what gets promoted.)
A site like YouAreNotSoSmart may be more effective in introducing these ideas to newcomers, to the extent that it mostly deals with run-of-the-mill topics. What makes LW valuable which YANSS lacks is constructive advice for becoming less wrong.
As for overcoming the absurdity heuristic, it would be more helpful to illustrate its inappropriateness using ideas that seem absurd yet have plenty of data proving them right, rather than predictions like the Singularity, which are mostly based on … just different heuristics.
Thanks for the link; I didn’t know about YANSS.