I’m through with truth.
I never had a scientific intuition. In college, I once saw a physics demonstration with a cathode ray tube—moving a magnet bent the beam of light that showed the path of the electrons. I had never seen electrons before and it occurred to me that I had never really believed in the equations in my physics book; I knew they were the right answers to give on tests, but I wouldn’t have expected to see them work.
I’m also missing the ability to estimate. Draw a line on a sheet of paper; put a dot where 75% is. Then check if you got it right. I always get that sort of thing wrong. Arithmetic estimation is even harder. Deciding how to bet in a betting game? Next to impossible.
Whatever mechanism it is that matches theory to reality, mine doesn’t work very well. Whatever mechanism derives expectations about the world from probability numbers, mine hardly works at all. This is why I actually can double-think. I can see an idea as logical without believing in it.
A literate person cannot look at a sentence without reading it. But a small child, just learning to read, can look at letters on a page without reading, and has to make an extra effort to read them. In the same way, a bad rationalist can see that an idea is true, without believing it. I can read about electromagnetism and still not expect to see the beam in the cathode ray tube bend. I spent ten years or so thinking “Isn’t it odd that the best arguments are on the atheist side?” without once wondering whether I should be an atheist.
Should I break down that barrier? I’m not sure. I’d do it if it would allow me to make money, I think. But not if it came at the cost of some kind of screaming Cthulhu horror.
You know what I really wish I had? Team spirit. Absolute group loyalty. Faith. Patriotism. The sense of being in the right. In Hoc Signo Vinces. I have fleeting glimpses of it but it doesn’t last. I want it enough that I keep fantasizing about joining the Army because it might work. I always wanted to be a fanatic, and my brain would never do it. But I’m starting to wonder if that’s hackable; I’m sure enough sleep deprivation and ritual would do it.
Absolute group loyalty is much more likely to lead you to a screaming Cthulhu horror than the pursuit of truth is. Especially if it comes from a combination of ritual and sleep deprivation.
Ok, worth thinking about.
I still want it. At times I really want victory, not just a normal life. Even though “normal” is all a person should really expect.
Intuitively connecting mathy physics to reality isn’t the default; you need to watch demonstrations and conduct thought experiments to make those connections. Your intuition got better that day.
Why would you expect it to come at the cost of some kind of screaming Cthulhu horror?
I’m not sure. It’s just that if it did I wouldn’t go for it.
I know one person who’s really well calibrated with probability, due to a lot of practice with poker and finance. When something actually is an x% probability, he actually internalizes it—he really expects it to happen x% of the time. He’s 80% likely to be right about something if he says he has an 80% confidence.
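Calibration like his is also checkable: keep a log of stated confidences and outcomes, group the predictions by confidence, and compare. A minimal sketch of that bookkeeping (the prediction log below is invented for illustration):

```python
from collections import defaultdict

# Hypothetical log of (stated confidence, whether the claim turned out true).
log = [(0.8, True), (0.8, True), (0.8, False), (0.8, True), (0.8, True),
       (0.6, True), (0.6, False), (0.6, True), (0.9, True), (0.9, True)]

buckets = defaultdict(list)
for confidence, outcome in log:
    buckets[confidence].append(outcome)

# Well calibrated means: claims made at confidence c come true about c of the time.
for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"stated {confidence:.0%}: right {hit_rate:.0%} of the time (n={len(outcomes)})")
```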
He doesn’t seem too bad off. Busy and stressed, yes, but not particularly sad. Cheerful, even.
You talk about belief the way popular culture talks about love: as some kind of external influence that overcomes your resistance.
And belief can be like that, sure. But belief can also be the result of doing the necessary work.
I realize that’s an uncomfortable idea. But it’s also an important one.
Relatedly, my own thoughts on the value of truth: when the environment is very forgiving and even suboptimal choices mostly work out to my benefit, the cost of being incorrect a lot is mostly opportunity cost. That is, things go OK, and even get better sometimes. (Not as much better as they would have gotten had I optimized more, but still: better.)
I’ve spent most of my life in a forgiving environment, which makes it very easy to adopt the attitude that having accurate beliefs isn’t particularly important. I can go through life giving up lots of opportunities, and if I just don’t think too much about the improvements I’m giving up I’ll still be relatively content. It’s emotionally easy to discount possible future benefits.
Even if I do have transient moments of awareness of how much better it can be, I can suppress them by thinking about all the ways it can be worse and how much safer I am right where I am, as though refusing to climb somehow protected me from falling.
The thing is: when the environment is risky and most things cost me, the cost of being incorrect is loss. That is, things don’t go OK, and they get worse. And I can’t control the environment.
It’s emotionally harder to discount possible future losses.
I was always under the impression that a sort of “work” can lead you to emotionally believe things that you already know to be true in principle. I suspect that a lot of practice in actually believing what you know will eventually cause the gap between knowing and believing to disappear. (Sort of the way that practice in reading eventually produces a person who can’t look at a sentence without reading it.)
For example, I imagine that if you played some kind of betting game every day and made an effort to be realistic, you would stop expecting that wishing really hard for low-probability events could help you win. Your intuition/subconscious would eventually sync up with what you know to be true.
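(If it helps to see how a drill like that grinds the lesson in, here is a minimal sketch of a made-up game whose odds are fixed no matter how hard you wish. Over enough rounds the observed win rate settles near the true one, and that settling is the thing your intuition eventually starts to expect.)

```python
import random

random.seed(0)
win_probability = 0.25   # the game's actual odds; wishing doesn't enter into it
rounds = 1000

wins = sum(random.random() < win_probability for _ in range(rounds))
print(f"won {wins}/{rounds} = {wins / rounds:.1%} (true odds {win_probability:.0%})")
```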
(nods) That’s been my experience.
Similarly: acting on the basis of what I believe, even if my emotions aren’t fully aligned with those beliefs (for example, doing things I believe are valuable even if they scare me, or avoiding things I believe are risky even if they feel really enticing), can often cause my emotions to change over time.
But even if my emotions don’t change, my beliefs and my behavior still do, and that has effects.
This is particularly relevant for beliefs that are strongly associated with things like group memberships, such as in the atheism example you mention.
I strongly associate this with Eliezer’s description of the brain as a cognitive engine that needs to do a certain amount of thermodynamic work to arrive at a given level of certainty—and that reasoned, logical conclusions that you ‘know’ fail to produce belief (enough certainty to act on the knowledge) because they don’t make your brain do enough work.
I imagine that forcing someone to deduce bits of probability math from earlier principles and observations, then having them use it to analyze betting games until they can generalise to concepts like expected value, would be enough work to make them believe probability theory.
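(A toy instance of the sort of exercise I have in mind, numbers made up: a bet that pays 3-to-1 on an event with probability 0.2 has expected value 0.2 × 3 − 0.8 × 1 = −0.2 per unit staked. Grind through enough of those by hand and ‘negative expected value’ starts to feel like ‘don’t take it’ rather than just a label.)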
Not to other-optimise, but yes.
As far as I can tell, the chances of encountering a true idea that is also a Lovecraftian cosmic horror are below the vanishing point for human brains. (There aren’t neurons small enough to accurately represent chances that tiny, etc.)
It will also help you make money. Example: I received a promotion for demonstrating my ability to make more efficient rosters. This ability came from googling “scheduling problem” and looking at some common solutions, recognising that GRASP-type (page 7) solutions were effective and probably human-brain-computable—and then, when I tried rostering, I intuitively implemented a pseudo-GRASP method.
That “intuitively implemented” bit is really important. You might not realise how much you rely on your intuition to decide for you, but it’s a lot. It sounds like taking a lot of theory and jamming it into your intuition is the hard part for you.
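(For anyone curious what that looks like made explicit, here is a rough toy sketch of the general GRASP idea, not my actual rosters: build a roster greedily, but pick each assignment at random from among the few best candidates, repeat, and keep the best roster found. The workers, shifts, and scoring below are invented for illustration, and the local-search phase that a full GRASP adds is left out for brevity.)

```python
import random

workers = ["A", "B", "C", "D"]   # invented example data
shifts = list(range(12))         # twelve shifts, one worker per shift

def cost(roster):
    # Toy objective: spread the shifts as evenly as possible across workers.
    counts = [sum(1 for w in roster.values() if w == x) for x in workers]
    return max(counts) - min(counts)

def greedy_randomized_roster(rcl_size=2):
    roster = {}
    for shift in shifts:
        # Rank candidates by how good the roster would be with them added...
        ranked = sorted(workers, key=lambda w: cost({**roster, shift: w}))
        # ...then pick at random from the restricted candidate list (the best few).
        roster[shift] = random.choice(ranked[:rcl_size])
    return roster

def grasp(iterations=50):
    best = greedy_randomized_roster()
    for _ in range(iterations):
        candidate = greedy_randomized_roster()  # a full GRASP would also local-search here
        if cost(candidate) < cost(best):
            best = candidate
    return best

print(cost(grasp()))   # usually 0: three shifts each
```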
Tangentially, how do you feel about the wisdom of age and the value of experience in making decisions?
I think wisdom and experience are pretty good things—not sure how that relates though.
And “screaming Cthulhu horror” was just a cute phrase—I don’t literally believe in Lovecraft. I just mean “if rationality results in extreme misery, I’ll take a pass.”
Some people I have encountered struggle with my rationality because I often privilege general laws derived from decision theory and statistics over my own personal experience—like playing tit-for-tat when my gut is screaming defection rock, or joining in mutual fantasising about lottery wins while refusing to buy ‘even one’ lottery ticket. I have found that certain attitudes towards experience and age-wisdom can limit a person’s ability to tag ideas as ‘true in the real world’: for them, reason and logic can only reach ‘true, but not actually applicable in the real world’. It was a possibility I thought I should check.
I assumed it was a reference to concepts like Roko’s idea. As for regular extreme misery, yes, there is a case for rationality being negative. You would probably need some irrational beliefs (that you refuse to rationally examine) that prevent you from taking paths where rationality produces misery. You could probably get a half-decent picture of what paths these might be from questioning LessWrong about it, but that only reduces the chance—still a consideration.
Congratulations. You’re just like most humans.
Well, then why does he say self-delusion is impossible? It’s not only possible, it’s usual.
I am under the impression that many of Eliezer Yudkowsky’s early sequence posts were written based on (a) theory and (b) experience with general-artificial-intelligence Internet posters. It’s entirely possible that his is a correct deduction only for that weird WEIRD group.
I wasn’t talking about that aspect (although I think he’s wrong there also) but just about the aspect of not doing a good job at things like estimating or mapping probabilities to reality.
I think it’s really the same thing. Mapping probabilities to reality is sort of the quantitative version of matching degree of belief to amount of evidence.
Possibly taboo self-delusion? I’m not sure that’s what he means. Self-delusion in this context seems to mean something closer to deliberately modifying your confidence in a way that isn’t based on evidence.
This sounds like worrying about tripping over a conceptual basilisk. They really are remarkably rare unless your brain is actually dysfunctional or you’ve induced a susceptibility in yourself. Despite the popularity of the motif of harmful sensation in fiction, I know of pretty much no examples.
I tried that one and got it just about spot on. If you had asked me to estimate 67%, now that might have been tricky. Estimating by halving twice in your head is kind of easy.
Move your estimation point until half the big side is the same as the little side. (Although I’ve practiced enough to do halves, thirds, and fifths pretty well, so I might just be overgeneralizing my experience.)
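(Why that works: if the dot sits at fraction x along the line, the big side is x and the little side is 1 − x, so “half the big side equals the little side” means x/2 = 1 − x, i.e. x = 2/3. The same kind of equation handles other fractions; for instance x/4 = 1 − x puts the dot at 4/5.)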
Damn. I chose two random numbers and made a probability out of them. It seems I picked one of the easy ones too! :)
And yes, that algorithm does seem to work well for thirds. I lose a fair bit of accuracy but it isn’t down to ‘default human estimation mode’ level.