Hello!
I think I may have posted on a welcome thread before, but I still consider myself pretty new, so I'm saying hi again.
I’ve long thought rational thought is underrated. I find LW very interesting but quite difficult to get into.
Things I’d like to see:
Better introductory content.
Things I find particularly interesting:
Discussion of akrasia and strategies for avoiding it.
Buddhism: is it compatible with rationality? Personally I think some aspects yes, some aspects no.
Further comments, which I’m making in the safe haven of this topic rather than the wilds of the rest of LW:
I’m moderately sympathetic to all the cryonics / singularity stuff that’s often talked about here, but also suspicious. I haven’t come up with a properly argued response (or even read all the very long posts about it!), but LW in general gives me a feeling of twisting things to fit already-chosen conclusions on these topics.
Cryonics: I view it as a long-shot option with a possible big payoff. What I have doubts about is the sense I get that it’s seen as a particularly good long shot, one that’s important to focus on.
Singularity stuff: This has all very possibly been discussed at length in a long post I haven’t read, and I’m quite happy to get references. Two areas of this make me uncomfortable:
For me a key problem seems to be the rate at which people can adapt to new technologies. I’m sure I’ve seen this raised either in Marooned in Realtime (http://en.wikipedia.org/wiki/Marooned_in_Realtime) or in very standard commentary on it, so it has presumably been addressed somewhere. This seems likely to me to stop acceleration in technology once we reach the stage of significant change within a human lifetime.
Someone still has to do all the thinking. Assuming the singularity happens, and as-yet-undefined entities can solve major problems in short timespans, this will be because they are thinking very fast. They will be operating on a much faster time scale, and to them the apparent rate of progress won’t be much greater. The singularity will only appear to solve all our problems by handwaving from the point of view of the un-accelerated, a state which around here seems to be viewed as unpleasant, to be escaped as soon as the technology is available.
I think it would be possible to dump the mystical elements of Buddhism, and combine the rest with Bayesianism. I could see the ideal of optimal enlightenment.
I see some very promising trends in some of the Western Zen stuff, e.g. Brad Warner (http://hardcorezen.blogspot.com/) (before anyone says it, I also see big problems with him!)
There’s a lot of dumping of mysticism, and some of the more unfortunate bits like gods and reincarnation.
And there are Buddha quotes like:
“Be lamps unto yourselves. Be refuges unto yourselves. Take yourself no external refuge. Hold fast to the truth as a lamp. Hold fast to the truth as a refuge.”
(intermediate source: http://www.sapphyr.net/buddhist/buddhist-quotes.htm; I’m pretty sure there are primary sources, but I’m too lazy to dig them up)
Which I think is very compatible with rationalism.
And a lot of Buddhism seems to me to make nicely testable claims: “do these things and you will experience a greater frequency of desirable mental states,” for example.
However, there’s also other stuff I’m somewhat sympathetic to but have doubts about, which seems to suggest giving up on rational thought.