People on LW like to insist that there is a litmus test for rationality: certain things any rationalist believes, or does not believe, as a necessary condition of being rational. This post makes this pretty explicit (see ‘slam-dunks’).
However, I wish that the LW community would make more of a distinction between rational beliefs based on really good epistemological foundations (i.e., esoteric philosophical stuff) and rational beliefs that are rational because they actually matter—because they’re useful and help you win.
I’m someone who is interested in philosophy, but I still measure whether something is rational based on evidence of whether it works. Given my understanding of Less Wrong Rationality, I think my view of rationality is the more Less-Wrong-Rational view—what counts as ‘rational’ should be evidence-based, not based on the ability to spin the nicest-sounding argument.
Accordingly, I think that someone who denies medical treatment due to religious beliefs (and dies) is much less rational (on a completely different level) than the theist who cannot locate any material way that his beliefs compromise his ability to achieve his terminal values.
The same argument goes for many worlds—and it’s crazy that I might now be charting into even more heretical ground. Instead of believing that many worlds is a slam-dunk, I have no belief about it and think many worlds is an esoteric non-issue—because nothing rests upon it. If I were fascinated by the question, I would read some papers about it, and might form an opinion or might continue to appreciate the aesthetic beauty of arguments on both sides.
However, as soon as the issue had a demonstrated material consequence—for example, some experiment that could be done or some money pump that could be designed—then I trust I would get to the bottom of it. For one thing, the fact that it materially mattered would mean that there would be some kind of evidence either way.
I just realized that while this is my argument for why I don’t think theists are categorically irrational, it doesn’t mean that any of them would belong here. Less Wrong obviously values having an accurate map not just to the extent that it facilitates “winning”, but also for its own sake, because the community values truth. So finally I would qualify that the argument against having theists here isn’t that they’re necessarily irrational, but that theism conflicts with the value of having an accurate map. Likewise, Less Wrong might value certain epistemological foundations, such as Occam’s razor (obviously) and any others that lead to choosing many worlds as the natural hypothesis.
I just forgot (while composing the message above) that ‘Less Wrong’ represents a combination of instrumental rationality AND VALUES. I usually think of these values as valuing human life, but they also include valuing epistemic rationality. While Less Wrong is generally much more tolerant of differing values than of wrong beliefs, it’s justifiably not so tolerant of differing values about beliefs.
I think that my comment above should have been down-voted more than it was, since it doesn’t represent the community norm of valuing truth for its own sake. I’m not valuing truth so much these days because I’m going through a value-nihilistic phase that, ironically, I blame on Less Wrong. But ‘you guys’ who care about truth might down-vote a comment arguing that there is no value to beliefs beyond their effectiveness in achieving goals.
It seems to me like you’re creating an artificial dichotomy between the value of truth itself and the material relevance of truth. To me, these ideas are closely coupled, and I would up-vote your first post for the same reason I would up-vote your second post.
In other words, to me, “valuing truth for its own sake” includes valuing truth for its importance, testability, relevance, etc. in other areas.