I recently realized that I don’t remember seeing any LW posts questioning if it’s ever rational to give up on getting better at rationality, or at least on one aspect of rationality that a person is just having too much trouble with.
There have been posts questioning the value of x-rationality, and posts examining the possibility of deliberately being irrational, but I don’t remember seeing any posts examining if it’s ever best to just give up and stop trying to learn a particular skill of rationality.
For example, consider someone who is extremely risk-averse, experiences severe psychological discomfort in situations involving risk, and has spent years trying to overcome this problem with no success. Should this person keep trying to overcome the risk aversion, or just give up, never leave their comfort zone, and focus instead on strategies for avoiding situations involving risk?
Yes, the “someone” I mention above is myself.
And yes, I am asking this hoping that the answer gives me an excuse to be lazy.
I’m surprised that no one has given the obvious answer yet, which is:
If overcoming the problem really is hopeless, then give up and focus on more productive things, otherwise keep trying.
If it isn’t obvious whether it’s hopeless or not, then do a more detailed cost/benefit analysis.
Still, I don’t remember seeing any LW post that even mentioned that sometimes giving up is an acceptable option. Or maybe I just forgot, or didn’t notice.
http://lesswrong.com/lw/gx/just_lose_hope_already/ ?
Yes, that link is relevant and helpful, thanks.
It’s not specifically about giving up on overcoming a particular irrational behaviour, but I guess the same advice applies.
This is random and for all sorts of reasons possibly a bad idea- but have you ever thought about anti-anxiety medication? It might have side effects that turn you off of it but it could help you deal with high risk situations.
(I should disclaim: I’m not a doctor, my knowledge doesn’t extend past personal experience and a cog sci minor. Obviously, not medical advice, etc.)
I personally didn’t suggest it because it seemed like it’s obvious to you, so the only interesting response would be to deny it for some good reason.
I would note that you shouldn’t give up permanently. Maybe wait a year or a few, then see if you’ve grown in other ways that would make a new attempt more fruitful.
upvoted. good advice. thanks.
It’s been hinted at a few times, usually in terms of “how to pick goals” rather than “when to give up on goals”. AFAIK, never a top-level post of “maybe you should give up and do something easier and/or more productive toward other goals”. I think it’d be valuable.
I was hoping this would get more of a response—Peer and I have spent a considerable amount of time talking about this, and it’s gotten to the point where other perspectives would be useful.
My opinion is that it is, at a minimum, appropriate for someone in Peer’s situation to accept the fact that they are nearly guaranteed to be overwhelmed by emotion, to the point of becoming dangerously irrational, in certain situations, and to take that fact into account in deciding what problems to try to tackle. And, I see it as irrational to feel guilty or panicky about not being able to do more.
Part of the problem, though, is that the risky situations Peer mentioned are SIAI-related, and he seems to see doing anything less than his theoretical best (without taking psychological issues into account) in that context as not just lazy but immoral in some sense.
Peer’s comment is too vague and general for any meaningful response, and your comment doesn’t add clarity (“Risky situations Peer mentioned are SIAI-related”?).
“Risk aversion”? In one interpretation it’s a perfectly valid aspect of preference, not something that needs overcoming. For example, one can value a sure $1000 more than an 11% chance at $10,000, even though the gamble has the higher expected dollar value ($1,100).
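A minimal numerical sketch of that point (using a hypothetical log utility function and an assumed starting wealth, not anyone’s actual preferences): with a concave utility function, preferring the sure $1000 maximizes expected utility even though the gamble has the higher expected dollar value.

```python
import math

# Illustrative sketch only: a hypothetical agent with concave (log) utility
# over total wealth. Concavity is what makes "valuing the sure thing more"
# consistent with expected-utility maximization. Starting wealth is assumed.
wealth = 10_000

def utility(w):
    return math.log(w)

# Option A: a sure gain of $1000.
u_sure = utility(wealth + 1000)

# Option B: an 11% chance of $10,000 (and an 89% chance of nothing).
ev_gamble = 0.11 * 10_000
u_gamble = 0.11 * utility(wealth + 10_000) + 0.89 * utility(wealth)

print(round(ev_gamble))   # 1100: the gamble wins on expected dollars...
print(u_sure > u_gamble)  # True: ...but the sure thing wins on expected utility
```

This is the standard von Neumann–Morgenstern sense in which risk aversion over money can be a coherent preference rather than a bias.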
I’m trying not to say anything here that’s more Peer’s business than mine, so I don’t want to use real examples, and I’m not certain enough that I know the details of what’s going on in Peer’s head to make up examples. But it doesn’t appear to be risk aversion by that definition that’s the problem. It’s that when he’s in what appears to him to be a high-stakes situation, he panics and winds up doing things that make the issue worse in some way—usually in the form of wasting a lot of energy going around in circles and then eventually backing out of dealing with the situation at all. (‘What appears to him to be’ is very relevant there: this isn’t a calculated response as far as I can tell, and being told by, for example, Michael Vassar that the risk in some situation is worth the reward is nearly useless.)
Is this what’s referred to as “choking under pressure”?
Yes, that seems like a reasonably accurate summary.
Everything Adelene has said so far is accurate.
Sorry, but I still haven’t thought of a good example that wouldn’t take too long to explain.
Another topic that Ade and I have been discussing is the difference between my idealized utility function (in which a major component is “maximize the probability that the Singularity turns out okay”) and whatever it is that actually controls my decisions (in which a major component is “avoid situations where my actions have a significant probability of making things worse”).
(I think there was at least one LW post on the difference between these two utility functions, but I didn’t find it after a quick search.)
So to answer Vladimir’s question, in my idealized utility function, certainty is not inherently valuable, and I know that when faced with a choice between certainty and uncertainty, I should shut up and multiply. However, my actual utility function has a paralyzing inability to deal with uncertainty.
Other relevant details are:
* severe underconfidence
* lack of experience, common sense, and general sanity
* fear of responsibility
* an inability to deal with (what appear to be) high-stakes situations. A risk of losing $1000 is already enough to qualify as “paralyzingly high stakes”.
Hmmm… Yeah, anxiety sucks.
You know, physiologically, fear and excitement are very similar. My Psychology 101 textbook mentioned an experiment in which subjects who met a young woman in a scary environment (a narrow bridge over a deep chasm) rated her as more attractive than subjects who met her in a neutral setting. Many people are afraid of public speaking or otherwise performing before an audience. I’m something of an exception, because I find it exciting instead of scary. Maybe some practice at turning fear into excitement could help? I don’t know exactly how to do that, but you could try watching scary movies, or riding roller coasters, or playing games competitively, or something like that.
Also, perhaps another possible way to deal is to not care as much about the outcome? Always look on the bright side of life, and all that. Maybe I’ve just read too much fiction and played too many video games, but it seems like things usually do tend to work out okay. After all, humanity did survive the Cold War without blowing itself up. I don’t know how to do this, but if you think you could try to take a more abstract and less personal perspective on whatever is scaring you, it might help.
Well, one way I do nothing is by reading LessWrong and other blogs, and posting comments. I tend to be hesitant to give authoritative advice about dealing with personal issues, as I’m probably more screwed up than average, but I can still make suggestions. I find it hard to imagine myself as a counselor of any kind, though.
As for “better than video games”, sometimes yes, sometimes no. It depends a lot on the particular video game.
I feel it’s a curiosity stopper to think of browsing the Internet as “doing nothing”. You learn, you communicate, you help, you signal your expertise. Develop a better understanding of what actually motivates you, and turn that into a sustainable plan for driving your day-to-day activity (in particular, for making some money).
It’s not so much “doing nothing” as “something I do for no other reason than it’s become part of my standard routine”. I think I’ve become very much driven by habit; I have a tendency to keep playing a video game even after I’ve decided I don’t like it very much and have plenty of others I could be playing.
Sometimes I play through my videogames repeatedly trying to set time records. (OK, I’ve only really done that once, for a couple weeks.)
To quote a friend of mine, ‘it’s pointless to doubt yourself. It only reduces what you can do.’
My meta-suggestion is to find things that you enjoy or care about (not the same thing) enough to put effort into handling them better. Giving advice in general doesn’t seem to fall into that category—I don’t remember seeing you do it regularly, which is the only measure I really have access to—but you seemed pretty engaged in this case, so there may be an aspect of this situation that you care about more than you would care about a run-of-the-mill situation. If there is, and if you can figure out what it is, you can use that information to find more things of that type, which is likely to be useful—you run into that ‘having something to protect’ effect.
Well, given that I don’t know what you’ve actually tried, it’s hard to say whether I think you’ve exhausted your options (though it sounds like this sort of thing might be best served by professional therapy). But sure, if the situation is really that bleak (assuming you have outside confirmation of this), then yeah, give it up. Work on something else. Does your psychological discomfort come with any risk, or just when particular kinds of things are at risk?
Relatedly, has anyone thought about the relationship between rationality and psychotherapy? It just occurred to me that there might be a lot there.
It puts the ‘R’ in REBT.
Huh? You mean, like, psychotherapists are unusually irrational people? Or maybe that no rationalist would give any significant credence to clinical psychology theory? Or maybe that a good rationalist will rarely need psychotherapy because their deduction skills are much better than most therapists’? Please explain.
To be less snide, I find it quite unlikely that therapy would help PI significantly. (Of course, I know little of his/her specific circumstances.) I think a more fruitful course of action, if PI does want to overcome the problem*, would be to keep trying to overcome it directly, and meanwhile to continue forming new relationships with a variety of trusted people, free of charge, and see if they can help at all by providing insight or emotional support. Social networks are better than the yellow pages at finding people with relevant insights. And good friends are better than good therapists at emotional support.
* Which isn’t to say that PI should keep trying.
It is possible that therapy isn’t usually cost-effective, but I don’t know of any study which suggests the therapist market is uniquely distorted. People pay a lot of money for a good therapist, and therapists build their practices by way of references. I don’t think I have to endorse Freudian psychoanalysis in order to think that talking to an experienced stranger about your problems might be helpful in ways that talking to friends wouldn’t be. I don’t know the details of Peer’s problem (and sorry, Peer, for hijacking this), but his risk aversion might extend to fear of losing social capital and being embarrassed. If that’s the case, telling him to go make more friends and tell them about his problems seems to miss the point.
What I meant by a relationship between rationality and psychotherapy is that therapy often involves getting people to be happier by having them behave more rationally. It seems to me that some of the methods and ideas discussed and used here could bear on therapeutic practice. Frankly, better than talking to friends for free (therapy from people you have other relationships with is always going to be more complicated, since there are all sorts of signaling and status issues that will get in the way of an honest dialog) would be talking to rationalist strangers for free. I imagine the Bayesian cult leaders of Eliezer’s fiction could charge a nice fee for talking to people and helping them make life decisions free from bias and overcome akrasia. We’ve all recognized that a lot of the material that gets discussed here looks like less useless self-help. To me, that means this material might also be less useless other-help.
I sort of doubt it, but it would be great to know whether there are any practicing therapists or social workers who read Less Wrong.
Certainly. I didn’t get the impression from his comment that his risk aversion extends to fear of social embarrassment, but perhaps it does.
My main beef with therapy is that it’s ineffective at this. (Not in all cases, but more likely in the case of LW members.) It’s certainly a noble goal.
I think you’re saying here that you don’t have to endorse any particular methodology in order to think that talking to an experienced stranger would help. I agree with the conditional, but I somewhat disagree with the consequent.
I write about my personal experience with therapy on my blog, which certainly informs my writings here.
I more or less agree with this. I was smarter than my therapist too but it was still helpful for three reasons. First, it forced me to recite my motives, reasons and feelings out loud which made me more conscious of them so that I could actually analyze and evaluate them. Second, the questions she asked prompted new thoughts that I wouldn’t have had. Even if the premise of her questions was silly (she wasn’t a Freudian but had a tendency to bring up my mother at inopportune times) it still brought forth helpful thoughts. Third, while she was behind me in IQ she had enough experience and knowledge of patterns of behavior to call me on my bullshit. In my experience (and as I understand it, in studies) intelligent people are especially good at rationalizing away behavior and channeling emotional reactions in weird, unhelpful directions.
Anyway, that’s what I got out of it. Eventually I think I reached a point of diminishing returns on it (once I could recognize patterns in my behavior, paying money to have someone else do it did seem useless). I still have a problem of putting my conclusions about my own unhealthy, irrational behavior to good use, but that doesn’t seem like the kind of thing anyone will be able to help me with.
You’re definitely right that therapy is overall too ineffective, which is why I think it could benefit from the insights of this site. I actually think I could get a fair amount out of therapy with an extreme rationalist, and reading your blog, it seems like your problem with therapists is that they’re not enough like your average Less Wrong poster.
Hmm. Maybe I was born unusually introspective, because my therapists never deepened my analysis or called me on bullshit. My experience may be more atypical than I thought.
I haven’t heard of those studies. I’d be interested in any references you have. I’m familiar with the correlation between intelligence and kookiness, but this sounds a bit different, though probably related.
Heh. Well, sort of. That and, maybe, that I’m just not cut out for therapy.
This doesn’t look like a hijack to me. I haven’t suggested therapy to Peer, probably because I’m pretty strongly biased against doing so, but now that I think about it, it may be useful to at least consider it.
Carry on. :)
I agree. There are things that are part of you, but that you pretty much have to treat as external facts. Some of those are qualities of your utility function, such as risk aversion. I would not even try to change those.
Others are about abilities, like emotional behaviour, or akrasia of various kinds. Those you can try to change, but sometimes that is not possible, or would cost more than it is worth, and then you just accept them and concentrate on other things.