First of all, thank you so much for posting this. I’ve been contemplating composing a similar post for a while now but haven’t because I did not feel like my experience was sufficiently extensive or my understanding was sufficiently deep. I eagerly anticipate future posts.
That said, I’m a bit puzzled by your framing of this domain as “arational.” Rationality, at least as LW has been using the word, refers to the art of obtaining true beliefs and making good decisions, not following any particular method. Your attitude and behavior with regard to your “mystical” experiences seem far more rational than both the hasty enthusiasm and the reflexive dismissal that are more common. Most of what my brain does might as well be magic to me. The suggestion that ideas spoken to you by glowing spirit animals should be evaluated in much the same way as ideas that arise in less fantastic (though often no less mysterious) ways seems quite plausible and worthy of investigation. You seem to have done a good job of keeping your eye on the ball by focusing on the usefulness of these experiences without accepting poorly-thought-out explanations of their origins.
It may be the case that we have the normative, mathematical description of what rationality looks like down really well, but that doesn’t mean we have a good handle on how best to approximate this using a human brain. My guess is that we’ve only scratched the surface. Peak or “mystical” experiences, much like AI and meta-ethics, seem to be a domain in which human reasoning fails more reliably than average. Applying the techniques of X-Rationality to this domain with the assumption that all of reality can be understood and integrated into a coherent model seems like a fun and potentially lucrative endeavor.
So now, in traditional LW style, I shall begin my own contribution with a quibble and then share some related thoughts:
Many of them come from spiritual, religious or occult sources, and it can be a little tricky to tease apart the techniques from the metaphysical beliefs (the best case, perhaps, is the Buddhist system, which holds (roughly) that the unenlightened mind can’t truly understand reality anyway, so you’d best just shut up and meditate).
As far as I understand it, the Buddhist claim is that the unenlightened mind fails to understand the nature of one particular aspect of reality: its own experience of the world and relationship to it. One important goal of what is typically called “insight meditation” seems to be to cause people to really grok that the map is not the territory when it comes to the category of “self.” What follows is my own, very tentative, model of “enlightenment”:
By striving to dissect your momentary experience in greater and greater detail, the process by which certain experiences are labeled “self” and others “not-self” becomes apparent. It also becomes apparent that the creation of this sense of a separate self is at least partially responsible for the rejection of, or “flinching away” from, certain aspects of your sensory experience, and that this is one of the primary causes of suffering (which seems to be something like “mental conflict”). I understand “enlightenment” as the final elimination (rather than mere suppression) of this tendency to “shoot the messenger.” This possibility is extremely intriguing to me because it seems like it should eliminate not only suffering but what might be the single most important source of “wireheading” behaviors in humans. People who claim to have achieved it say it’s about as difficult as getting an M.D. Seems worthy of investigation.
Rather than go on and on here, I think it’s about time I organized my experience and research into a top-level post.
I’m a newly registered member of LW (long-time lurker) and was thinking of posting about this very topic. Like many in the community, I have a background in science / math / philosophy, but unlike many, I have also spent many years working to understand what Jasen calls the “Buddhist claim” experientially (i.e. through meditation) and being involved with the contemporary traditions that emphasize attaining that understanding. I see myself as an “insider” straddling both communities, well-situated to talk about what Buddhists are going on about regarding “self” and “not-self” and enlightenment in a way that would be highly comprehensible to people who frame the world in a contemporary scientific way.
Specifically, I was considering a three-part series along these lines:
1) Highly abridged history of Buddhist thought concerning “insight” meditation and the insight into “no-self”; overview of contemporary secular traditions focusing on attaining this insight. Risks and benefits of pursuing it.
2) Case study: Have 1500 years of Buddhist tradition yielded a novel testable model of an aspect of human psychological development?
3) How science has dropped the ball concerning research on meditation and “spirituality”; how some communities of meditators have come to know more about meditation than scientists do; some thoughts on why; some thoughts on how this could be changed.
However, I don’t want to pre-empt anyone’s post (in particular Jasen’s, since he mentioned it), and also, I don’t know the extent to which this is an interesting topic to LW-ers, or what the community norms are for newly-registered members initiating new posts. So I’d like to get some sense of whether people here would like to see posts on this topic, and in particular, what Jasen thinks about the prospect of me posting.
You are allowed to write a top-level article once you have at least twenty karma. You should write a top-level article if you have twenty karma, you have an important point to make about rationality, and you’re familiar enough with the sequences that you don’t think you’re making a simple mistake.
These are some interesting points. I meant “arational” in the sense that our actions are arational—rationally motivated, perhaps, but it would be incorrect to say that the action itself is either rational or irrational, hence it’s arational. What intrigues me is the fact that these arational phenomena are deeply embedded in the way our minds are structured, and therefore can perhaps inform and augment the process of rationality. Indeed, some of them may be extremal states of the same systems that allow us to be rational in the first place.
I’d definitely like to see this post on Buddhism; you seem to have an excellent grasp on it.
I agree with Jasen. I don’t think Skatche’s (fascinating) story is an account of much arationality. Rationality is about having true beliefs and achieving your goals, not about acting like Spock.
Welcome!
Please do this. I’m interested.
From the FAQ:
Thanks for pointing out the karma rule to me. I’ve got 10 points so far, 10 more to go...
Awesome, I’m very interested in sharing notes, particularly since you’ve been practicing meditation a lot longer than I have.
I’d love to chat with you on Skype if you have the time. Feel free to send me an email at jasen@intelligence.org if you’d like to schedule a time.
Great, I’ll send you an email in a day or two (things are rather busy on my end, apologies) and we’ll work something out!
I’m very interested too!