Have you observed my discussions elsewhere on this website, and come to the conclusion that I’m way too confident in that way in general, or are you referring only to this particular exchange?
Only this particular exchange; I haven’t seen any of your other discussions.
It’s not you clearly signaling you think I’m obviously wrong that I anticipate difficulties with; I was being imprecise. Rather, it’s a specific emotion/attitude (exasperation?) that I detect and that stresses me out a lot, because it imposes a moral obligation on me to act in good faith to show you that the kind of reasoning you’re engaged in, in my experience, often leads to terrible consequences that will look in retrospect as if they could easily have been avoided. On the one hand I want to try to help you, on the other hand I want to avoid blame for not having tried to help you enough, and there’s no obvious solution to that double bind, so the easiest solution is to simply bail out of the discussion. (Not necessarily your blame, just someone’s, e.g. God’s.)
And it’s not you thinking that I’m obviously wrong; apologies for being unclear. It’s people in general. You say you “usually can’t even tell what the hell most religious people are talking about from an epistemic or clear communication standpoint”, and yet you’re very confident they are wrong. People who haven’t practiced the art of analyzing people’s decision policies in terms of signaling games, Schelling points, social psychology &c. simply don’t have the skills necessary to determine whether they’re justified in strongly disagreeing with someone. Confidently assuming that your enemies are stupid is what basically everyone does, and they’re all retarded for doing it. LessWrong is no exception; in fact, it’s a lot worse than my high school friends, who weren’t fooled into thinking that their opinions were worth something ’cuz of a superficial knowledge of cognitive science and Bayesian statistics.
It’s not that I don’t think you’d update. If I took the time to lay out all my arguments, or had time to engage you often in conversation, as I have done with many folk from the SingInst community, then I’m sure I would cause you to massively update towards thinking I’m right and that LessWrong has gaping holes in its epistemology. It’s happened many times now. People start out thinking I’m crazy or obviously wrong or just being contrarian, I talk to them for a long time, they realize I have very good epistemic habits and kick themselves for not seeing it earlier. But it takes time, and LessWrong isn’t worth my time; the only reason I comment on LessWrong is because I feel a moral obligation to, and the moral obligation isn’t strong enough to compel me to do it well.
Also, I generally don’t like talking about object level beliefs; I prefer to discuss epistemology. But I’m too lazy to have long, involved discussions about epistemology, so I wouldn’t have been able to keep up our discussion either way.
Rather, it’s a specific emotion/attitude (exasperation?) that I detect and that stresses me out a lot, because it imposes a moral obligation on me to act in good faith to show you that the kind of reasoning you’re engaged in, in my experience, often leads to terrible consequences that will look in retrospect as if they could easily have been avoided.
I just don’t understand. I see why you may detect a level of exasperation in my replies, but I don’t get why that specifically would be what would impose that sort of moral obligation on you. You’re saying that what I’m doing may lead to terrible consequences, which sounds bad and like maybe you should do something about it, but I’m utterly confused about why my attitude is what confers that on you.
In other words, wouldn’t you feel just as morally obligated (if not more) to help me avoid such terrible consequences if I had handled this discussion with a higher level of respect or grace? Why does me (accidentally or not) signaling exasperation or annoyance lead to that feeling of moral obligation, rather than the simple fact that you consider it in your power to help somebody avoid (or lower the likelihood of) whatever horrible outcome you have in mind?
When I was first reading your reply and had only reached up to where you said “stresses me out a lot”, I thought you were just going to say that me acting frustrated with you or whatever was simply making it uncomfortable or like you would get emotionally attached such that it would be epistemically hazardous or something, which I would have understood, but then you transitioned to the whole moral obligation thing and I sort of lost you.
On the one hand I want to try to help you
Just for reference, I should probably tell you what (I think) my utility function is, so you’re in a (better) position to appraise whether what you have in mind really would be of help to me.
I’m completely and utterly disinterested in academic or intellectual matters unless they somehow directly benefit me in the more mundane, base aspects of my life. Unless a piece of information is apt to make me better at parkour, lifting, socializing, running, etc., or enable me to eat healthier so I’m less likely to get sick or come down with a terrible disease, or something like that, it’s not useful to me.
If studying some science or learning some new esoteric fact or correcting some intellectual error of mine could help me get to sleep on time, make it less likely for me to die anytime soon, make it less probable for me to suffer from depression, help me learn how to handle social interaction more effectively, tame my (sometimes extreme) akrasia, enable me to contribute to reducing the possibility of civilization-wide catastrophe in my lifetime, etc., then I’m interested. Otherwise I’m not.
I’m telling you this simply so you know what it means to help me out. If whatever you have in mind can’t be of use for me in my everyday life, then it’s not helpful. I hang out on this website, and engage in intellectual matters quite regularly, but I do so only because I think it’s the best way to fulfill my rather mundane utility function. We’re not designed properly for our current environment, and the only way to compensate is to engage in some pretty deep introspection and spend a lot of time and energy working through plenty of intellectual matters.
So what do you have that could help me? I want to live a healthy, happy human life, not have it cut short by some societal collapse, and also hopefully be around for when (or if) we put an end to aging and make it so we don’t have to die so young anymore. I also don’t want to suffer an eternity burning in Hell, that is, if such a place exists.
And it’s not you thinking that I’m obviously wrong; apologies for being unclear. It’s people in general. You say you “usually can’t even tell what the hell most religious people are talking about from an epistemic or clear communication standpoint”, and yet you’re very confident they are wrong.
Oh sorry. I should have been more precise. I don’t think anything past what you quoted of me. If by “wrong”, you mean anything incompatible with not having any idea what they’re talking about, or rather just not being able to interpret what they’re saying as serious attempts at clear communication, then I certainly don’t think they’re wrong. I just think they’re either really bad at communicating, or else engaged in a different activity.
So yeah. In that sense, I don’t think they’re wrong, and I don’t think you’re wrong. I just don’t know what they’re attempting to communicate. Or rather, it seems pretty obvious to me that most religious people aren’t even trying to communicate at all, at least in the sense of intellectual discourse, or in terms of epistemic rationality. It seems pretty clear to me that they’re just employing a bunch of techniques to get themselves to believe certain things, or else they’re just repeating certain things because of some oddity in human brain design.
But there are a ton of different religions, and a ridiculous amount of variation from person to person, so I can’t really criticize them all at once or anything, nor would it matter. And as for you, at this point I really just have no idea what you believe. It’s not that I think you’re wrong about whatever beliefs you have. It’s that I still don’t know what those beliefs are, and also that I’m under the impression that you’re not doing a very good job with your attempts to communicate them to me.
In most discussions like this, the issue isn’t that somebody has a clear map that doesn’t fit the territory. It’s almost always just a matter of a communication failure or a set of key misinterpretations, or something like that. Likewise in this discussion. It’s not that I think what you believe is wrong; it’s that I don’t even know what you believe.
People who haven’t practiced the art of analyzing people’s decision policies in terms of signaling games, Schelling points, social psychology &c. simply don’t have the skills necessary to determine whether they’re justified in strongly disagreeing with someone.
I can’t tell whether you’re implying that I specifically don’t have those skills, or whether you’re just making some general observation or something.
Confidently assuming that your enemies are stupid is what basically everyone does, and they’re all retarded for doing it.
I certainly don’t do that. When you disagree with somebody, there’s no getting around thinking that they’re making an error (because it’s like that by definition), but considering them “stupid” is nothing more than an empty explanation. Much more useful would be saying that your opponent thinks X because he’s operating under some bias Y, or something like that.
In other words, I probably engage in plenty of discussions where I consider my opponent to be making a serious error, or not very good at managing the inferential distance properly, or ridiculously apt to make word-based errors, or whatever, but I never settle for the thought-terminating explanation that they’re just stupid, or at least I don’t think I do. Or do I?
There’s just no getting around appraising the level of intellectual ability your opponent is operating on, just like I would never play a match of tennis with somebody who sucks without acknowledging that to myself. It’s saying “he sucks” without even considering why exactly that is the case that’s the problem. When I engage in intellectual discussions, I try to stick to observations like “he doesn’t define his terms precisely” rather than just “he’s an idiot”.
LessWrong is no exception; in fact, it’s a lot worse than my high school friends, who weren’t fooled into thinking that their opinions were worth something ’cuz of a superficial knowledge of cognitive science and Bayesian statistics.
Is this aimed at me also, or what?
It’s not that I don’t think you’d update. If I took the time to lay out all my arguments, or had time to engage you often in conversation, as I have done with many folk from the SingInst community, then I’m sure I would cause you to massively update towards thinking I’m right and that LessWrong has gaping holes in its epistemology.
If the received opinion on Less Wrong really does have gaping holes in its epistemology, then I’d like to be first in line to hear about it.
That said, I alone am not this entity we call “Less Wrong”. You’re telling me I’d update massively in your direction, which means you think I also have these gaping holes in my epistemology, but do you really know that through just these few posts back and forth we’ve had here?
It’s happened many times now. People start out thinking I’m crazy or obviously wrong or just being contrarian, I talk to them for a long time, they realize I have very good epistemic habits and kick themselves for not seeing it earlier.
with many folk from the SingInst community
Who are these people who updated massively in your direction, and would any of them be willing to explain what happened, or have they in some series of posts? You’re telling me that I could revolutionize my epistemic framework if I just listened to what you had to say, but then you’re leaving me hanging.
But it takes time, and LessWrong isn’t worth my time; the only reason I comment on LessWrong is because I feel a moral obligation to, and the moral obligation isn’t strong enough to compel me to do it well.
Are you sure you’re not just doing more harm than good by being so messy in your posts? And of course the important implication here is that I personally am not worth your time, or you would talk to me long enough to actually explain yourself.
I’m just left wondering why you’re still here, just as many other people probably have been, and of course also left wondering what sort of revolutionary idea you may be hiding.
People who haven’t practiced the art of analyzing people’s decision policies in terms of signaling games, Schelling points, social psychology &c. simply don’t have the skills necessary to determine whether they’re justified in strongly disagreeing with someone.
I can’t tell whether you’re implying that I specifically don’t have those skills, or whether you’re just making some general observation or something.
I think I can translate, a bit:

As far as I can tell, Will’s a stronger epistemic majoritarian than most nerds, including us LW nerds. If a bunch of people engage in a behavior, his default belief is that behavior is adaptive in a comprehensive enough context, when examined at a meta-enough level.
Will spends a lot of time practicing model-based thinking. Even with that specific focus, he doesn’t consider his own skills adequate to declare the average person’s behavior stupid and counterproductive. I’m an average LW’ian: I’ve read The Strategy of Conflict, the Sequences, and Overcoming Bias, and I’ve had a few related insights in my daily life. I don’t have enough skill to dissolve the question and write out a flowchart that shows why some of the smartest and most rational people in the world are religious. So Will’s not going to trust me when I say that they’re wrong.
And as for you, at this point I really just have no idea what you believe.
I’m just left wondering why you’re still here, just as many other people probably have been, and of course also left wondering what sort of revolutionary idea you may be hiding.
He suspects himself of prodromal schizophrenia, due to symptoms like continuing to post here.
Some of my majoritarianism is in some sense a rationalization, or at least it’s retrospective. I happened to reach various conclusions, some epistemic, some moral, and learned various things that happened to line up much better with Catholic dogma than with any other system of thought. Some of my majoritarianism stems from wondering how I could have reached those conclusions earlier or more reliably, without the benefit of epistemic luck, which I’ve had a lot of. I think the policy that pops out isn’t actually majoritarianism so much as harboring a deep respect for highly evolved institutions, à la Nick Szabo. There’s also Chesterton’s idea of orthodoxy as democracy spread over time. On matters where there’s little reason to expect great advancement of the moderns over older cultures, like in spirituality or morality, it would be foolish to adopt a modern-majoritarian position that ignored the opinions of those older cultures. I don’t actually have all that much respect for the “average person”, but I do have great respect for the pious and the intellectually humble. I honestly see more rationality in the humble creationist than in the prototypical yay-science boo-religion liberal.
He’s a prospective modal Catholic—replace each instance of “amen” with “or so we are led to believe.”
Though I think my actually converting is getting less likely the more I think about the issue and study recent Church history.
He suspects himself of prodromal schizophrenia, due to symptoms like continuing to post here.
More due to typical negative symptoms and auditory hallucinations and so on most prominent about six months ago, among a few other reasons. But perhaps it’s more accurate to characterize myself as schizotypal.