AlexU, could you re-phrase your comment to give a more descriptive discussion of the consequences you want to avoid, or of the evidence that leads you to disagree with ciphergoth? Right now, your comment mostly reads as “I really really want to express how aligned I/we are with niceness/tolerance/etc., and how socially objectionable ciphergoth’s comment is.” If you can think through the reasons for your response, and include more testable descriptions and less connotation, the conversation will probably head more useful places.
ETA: I have the same suggestion concerning your other two recent comments.
The site is about rationality, not dogma—I think. Posts should be judged on the strength and clarity of their ideas, not the beliefs of the individual posters who espouse them. To categorically exclude an entire class of people—some of whom are very good rationalists and thinkers—simply because they don’t subscribe to some LW party line is not only short-sighted but, perversely, seems to run entirely counter to the spirit of a site devoted to rationality.
The consequences, I imagine, would be a less interesting, less broad discussion, with a narrowing of perspective and a tendency to attract the same fairly narrow range of people who want to talk about the same fairly narrow range of topics. It would select not for good rationalists per se, but for some mix of people who overly fancy themselves good rationalists, along with the standard transhumanism/Singularity crowd that’s here because of EY.
Observe: although this post reaches the same conclusion, because it offers different arguments it is voted up, while comments by the same poster with similar conclusions but different arguments are voted down. (I agree with this judgment; this is how it is supposed to be.) Those wondering exactly what it takes to get voted up or voted down have here a good example before them.
“To categorically exclude an entire class of people—some of whom are very good rationalists and thinkers—”
But that’s the point. No one who belongs to that class is a good rationalist. I’m sure there are people who belong to that class who in limited contexts are good rationalists, but speaking globally, one cannot be a rationalist of any quality and exempt some assertion from the standards of rationality.
This isn’t about the perfect being the enemy of the good. It’s about minimum standards, consistency, and systematic honesty.
If you possess evidence that shows theism to be rationally justifiable, present it.
“speaking globally, one cannot be a rationalist of any quality and exempt some assertion from the standards of rationality.”
You can’t speak globally when it comes to the human brain.
Sure, if brains had any sort of global consistency or perfect internal software reuse, you could say that being a rationalist rules out believing in irrational things.
But as a practical matter, you can’t insist on consistency when someone might simply not have thought of applying the same logic to all their beliefs… especially since MOST of the beliefs we have are not perceptible as beliefs in the first place. (They just seem like “reality”.)
In addition, our brains are quite capable of believing contradictory things at the same time, with one set controlling discourse and the other controlling behavior. In work with myself and with others who have no conscious religious beliefs, I’ve often discovered, mid-mindhack, that some behavior of the person is being driven by an unconscious desire to go to heaven or to avoid being sent to hell. So even someone who thinks they’re an atheist can believe in silly things without even knowing it.
So, IMO, it makes as much sense to ban people with supernatural beliefs as it does to ban people who have idiotic beliefs about brains being consistent.
Actually, come to think of it, the belief that people’s brains must be consistent IS a supernatural belief, as there’s no physical mechanism in the brain that allows O(1) updating of belief structures that don’t share common components. To insist that the moment one becomes a rationalist, one must then become an atheist, is to insist on a miracle inconsistent with physics, biology, and information science.
So, if we are going to exclude people with inconsistent or supernatural beliefs, let’s start with the people who insist that the brain must be supernaturally consistent. (This is actually pretty reasonable, since such an error in thinking arises from the same mind-projection machinery that gives rise to theism, after all...)
I would expect the potential commentariat at Less Wrong to be terribly small if anyone holding a firm belief that is not rationally justifiable were banned.
I am highly skeptical that I have fully purged myself of all beliefs where I have been presented with correct damning evidence against them. If anything, reading here has raised my estimate of how many such beliefs I might hold. Even as I purge many false propositions, I become aware of more biases to which I am demonstrably subject. Can anyone here who is aware of the limitations of our mental hardware say otherwise?
I am not as convinced as most posters here that all possible versions of theism are utterly wrong and deserve to be accorded effectively zero probability, but in any case, it’s clear that the LW (and OB) communities generally wish to consider the case against theism closed. To the extent that the posters do not attack theism or theists in an obviously biased way, I have respected that decision, and I post and vote accordingly, including downvoting people who try to reopen the argument in inappropriate places.
I also intend to make a habit of downvoting those who waste time denouncing theism in inappropriate contexts or for specious reasons having more to do with signaling tribal bonds than with bringing up any new substantive argument against theism.
I don’t recall who suggested that we need another canonical example of irrationality, but I agree wholeheartedly. In fact, I’d suggest we need a decent short list to rotate through, so that no one topic gets beaten up so consistently as to encourage an affective death spiral around it.
I would rather emphasize Raising the Sanity Waterline. If we bar theists outright, we miss the opportunity to discuss rationality with them in different contexts. We don’t get to learn what insights they might have when their biases are not in the way. We don’t get to teach them about all the biases using nonreligious examples, so that they might, on their own, think to check for those same biases in their theistic beliefs. If we allow theists, we still have the karma system to bury any obnoxious comments they make in discussions of religion, and the same karma system will encourage them to participate in the areas where they will get the most benefit.
Are you so confident in your perfect, unerring rationality that you’ll consider that particular proposition completely settled and beyond questioning? I’m about as certain that there is no God as one can get, but that certainty is still less than 100%, as it is for virtually all things I believe or know. Part of maintaining a rational outlook toward life, I’d think, would be keeping an attitude of lingering doubt about even your most cherished and long-held beliefs.
Yes, that will always be technically true—no belief can be assigned a probability of 100%. Nevertheless, my utility calculations recognize that the expected benefit of questioning my stance on that issue is so small (because of its infinitesimal probability) that almost anything else has a higher expected value.
Why then should I question that, when there is so much else to ask?
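To make the expected-value reasoning above concrete, here is a minimal sketch in Python; the probabilities, payoffs, and costs below are purely hypothetical numbers chosen for illustration, not figures anyone in this thread actually gave.

    # Hypothetical expected-value comparison: is re-opening a near-settled
    # question worth the time, versus spending that time on another question?
    p_wrong        = 1e-9   # assumed probability the settled belief is mistaken
    gain_if_wrong  = 1e6    # assumed utility of correcting it, arbitrary units
    cost_of_review = 10.0   # assumed cost (time/effort) of re-opening the question

    ev_requestion = p_wrong * gain_if_wrong - cost_of_review
    ev_other      = 0.2 * 100 - 10.0   # an ordinary question: 20% chance of a modest payoff

    print(ev_requestion)   # -9.999 -> negative: not worth re-opening under these assumptions
    print(ev_other)        #  10.0  -> positive: the ordinary question wins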
Where are you getting the idea that Annoyance said this?
Why isn’t this comment voted higher? (I presume it is because it is relatively new.) This is exactly the kind of comment that makes it easier on new/shy people. This sort of feedback is phenomenal. It may be harsh, but it (a) gives specific criticisms without being vindictive, (b) offers a reinterpretation of the original post, and (c) offers suggestions on how to proceed in the future.
I feel that AlexU’s response was a vast improvement and is evidence of the value of AnnaSalamon’s comment.
Since you and I have voted it up, I guess two people have voted it down. That seems strange to me too.
Really must set up my LessWrong dev environment so I can add a patch to show both upvotes and downvotes!
Indeed. If that is the only change to this site’s system or ethic that comes out of this discussion, it will have been worth it.
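As a rough sketch of what the display side of such a patch might compute: this is a hypothetical illustration in plain Python, not the actual LessWrong (Reddit-fork) codebase, and the vote representation below is an assumption made purely for the example.

    # Hypothetical sketch: given a list of votes on a comment (+1 for up, -1 for down),
    # report upvotes and downvotes separately instead of only the net score.
    # The vote-list format is an assumed stand-in, not LessWrong's real schema.

    def vote_breakdown(votes):
        ups = sum(1 for v in votes if v > 0)
        downs = sum(1 for v in votes if v < 0)
        return ups, downs

    def format_score(votes):
        ups, downs = vote_breakdown(votes)
        # e.g. "5 up / 2 down (net +3)" rather than just "3 points"
        return "%d up / %d down (net %+d)" % (ups, downs, ups - downs)

    print(format_score([1, 1, 1, 1, 1, -1, -1]))  # 5 up / 2 down (net +3)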