This is a general point concerning Robin’s and Eliezer’s disagreement. I’m posting it in this thread because this thread is the best combination of relevance and recency.
It looks like Robin doesn’t want to engage with simple logical arguments if they fall outside of established, scientific frameworks of abstractions. But those arguments could even be damning critiques of (hidden assumptions in) those very abstractions. If Eliezer were right, how could Robin come to know that?
It’s impossible for me to imagine a tiered system that wouldn’t degenerate into a status competition. Can you think of examples of one that hasn’t?
The main blog and the community site shouldn’t be seen as different tiers of the same thing; rather, they should be seen as serving different purposes.
I think the community should emphasize that it is a community of wannabe-rationalists whose goal is to self-improve, overcome bias, and learn the secrets of the Bayes or whatever, just as this blog does. I think that would weed out the likes of the Rational Response Squad, which is an example of what we don’t want a rationalist community to be like!
At the same time I don’t think that a rationalist community should be any more elitist than necessary. It should fulfill part of the same purpose as your planned popular book.
If you are constantly surprised by solutions that are high in your preference ordering but low in your search ordering, that is a problem with your search ordering. If your search ordering is correct, creativity is useless.
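The point above can be made concrete with a toy sketch (my own illustration, not from the original comment): call a solution a "surprise" if it is preferred to everything the search has turned up before it. If the search order exactly matches the preference order, the very first solution is already the best, and nothing found later can surprise you.

```python
# Toy illustration (hypothetical): a "surprise" is a solution that is
# preferred to every solution the search examined before it.
def surprises(search_order, preference_rank):
    """Return the solutions that beat everything searched before them."""
    best_rank = float("inf")
    out = []
    for s in search_order:
        if preference_rank[s] < best_rank:  # strictly better than all so far
            out.append(s)
            best_rank = preference_rank[s]
    return out

prefs = {"a": 0, "b": 1, "c": 2, "d": 3}  # lower rank = more preferred

# Misaligned search: preferred solutions turn up late, so every step surprises.
print(surprises(["d", "c", "b", "a"], prefs))  # ['d', 'c', 'b', 'a']

# Aligned search: the first candidate is already the best; no later surprises.
print(surprises(["a", "b", "c", "d"], prefs))  # ['a']
```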
The concept of different epistemological magisteria. Eliezer gave an example of it in this post (and also in the post about scientists outside the laboratory), but his example is just the tip of the iceberg. This failure of rationality doesn’t manifest itself explicitly most of the time, but it is engaged in implicitly by almost everybody I know who isn’t into hardcore rationality.
It’s definitely engaged in by people who are into, or at least cheer for, science and (traditional) rationality and/or philosophy. It’s the double standard between the epistemological standards you explicitly endorse and the actual beliefs on the basis of which you act: acting as if the sun will rise tomorrow even though you endorse radical scepticism, or accepting what Richard Dawkins says on his authority while seeking out refutations for creationist arguments.

I think one big reason for this is that people who are interested in this sort of thing are exposed too much to deductive reasoning and hardly at all to rigorous inductive reasoning. Inductive reasoning is the practical form of reasoning that actually works in the real world (many fallacies of deductive reasoning are actually valid probabilistic inferences), and we all have to engage in it, explicitly or implicitly, to cope in the world. But having been exposed only to the “way” of deductive rationality, and warned against its fallacies, people may come to experience a cognitive dissonance between the epistemological techniques that are useful in real life and the ones they think they ought to be using, and therefore come to see science, rationality and philosophy as disconnected from real life: things to be cheered for, and entertaining diversions.

Such people don’t hold every part of their epistemological selves to the same level of scrutiny, because implicitly they believe that their methods of scrutinizing are imperfect. I recognize my past self in this, but not my present self, who knows about evolutionary psychology, inductive reasoning etc., has seen that these methods actually work, and can therefore criticize his own epistemological habits using the full force of his own rationality...
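The parenthetical about deductive fallacies being valid probabilistic inferences can be made precise with a sketch (the numbers are made up for illustration): "affirming the consequent" (if H then E; E is observed; therefore H) is deductively invalid, but by Bayes' theorem, observing E genuinely raises the probability of H whenever H predicts E more strongly than its negation does.

```python
# Hypothetical numbers: H predicts E strongly; not-H predicts E weakly.
p_h = 0.3              # prior P(H)
p_e_given_h = 0.9      # P(E | H)
p_e_given_not_h = 0.2  # P(E | not H)

# Total probability of the evidence, then Bayes' theorem.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)
p_h_given_e = p_e_given_h * p_h / p_e

print(round(p_h_given_e, 3))  # posterior exceeds the 0.3 prior
```

The deductive form is a fallacy because the posterior never reaches 1, but the inference is still evidence: the update is in the right direction, which is the kind of reasoning that actually works in the real world.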
This might concern well-meaning but mistaken people more than the actual Dark Side, but it seems to me to be an important point anyway.