Hello, I’m the person who said Double Crux seems like an attempt to solve a problem that almost never happens. More specifically, the disagreements I see happening between reasonable people are almost always either too easy or too hard for Double Crux to be useful.
On questions like “what is the longitude of Tokyo” or “who starred in the original Star Wars,” two people could agree that looking up the answer on Wikipedia would convince both of them, which would technically fulfill the formal rules of Double Crux, but that hardly seems like a special “rationality technique” or something CFAR can take credit for inventing.
On the other hand, on a question that hinges on value differences like your examples, I can see one of three things happening: either the disputants compromise their honesty by agreeing on a crux which appears relevant but isn’t actually connected to the real motivations behind their disagreement (“if spanking is statistically correlated with a decrease in lifetime earnings, p<0.05, then it is bad, otherwise it is good”), or they maintain their honesty but commit themselves to solving longstanding open problems in metaethics and/or changing genetically mediated personality differences through verbal argument, or they end up using other negotiation techniques and falsely calling it Double Crux.
Double Crux does seem applicable to questions where the answer can’t simply be looked up, where the disagreement is strictly confined to the empirical level and doesn’t touch on value differences or epistemological questions in any way, yet also where the evidence is ambiguous enough to allow for reasonable disagreement. But those are rare in my experience.
I note there’s something in here that I’m reading as a pseudofallacy—it’s the same reason why MythBusters is terrible, and it goes like “I can only think of these three outcomes, and therefore those are the most likely outcomes.”
This thread and the original Double Crux thread on LessWrong (plus the ~1000 or so CFAR alumni) are full of people saying that Double Crux does indeed work to solve discourse problems that crop up a lot.
That absolutely does not erase your personal experience of a) not seeing those problems and b) not seeing Double Crux solve them. Your personal experience is valid and real and definitely counts as data.
But there’s a particular sort of … audacity? … in taking one’s own, personal experience, and using it to trump the experiences of others, and concluding with fairly strong confidence “this thing that a lot of smart people say is useful just isn’t.”
In your shoes, I’d say something like what I said in my Focusing post, which is “this thing that is useful for a lot of people isn’t useful for me or the people around me.” That seems more solidly justified and epistemically sound, and enriches an onlooker’s understanding of the situation rather than creating crosswise narratives.
In particular, as I tried to do with Focusing, I’d make a genuine attempt to learn Double Crux (from the people who know what they’re talking about and can point out your mistakes and scaffold your understanding) before writing it off. I weakly predict that you haven’t done A + B + C where A is attend a CFAR workshop or one of their Double Crux instruction sessions at e.g. EA Global, B is talk directly to somebody who’s skilled in Double Crux and ask them to help you overcome the standard failure modes, and C is go out and really actually try to follow the real actual steps for five very different sorts of disagreements with real actual humans.
(By the way, it’s completely fine to have not done A + B + C. People have higher priorities. But I personally think that in a rationalist community like Less Wrong, we have a responsibility to not claim things are false or useless or stupid until we’ve actually attempted to falsify them, not just scanned through our own experiences for confirming evidence. If I were in your shoes and I didn’t think Double Crux was useful and I also didn’t intend to do A + B + C, I’d caveat my suspicions of its relative uselessness heavily by pointing out that I was using Stereotypes rather than Rigor, and I want people on Less Wrong to call for and socially reinforce that sort of standard.)
Will probably add that to my list of posts to write this month.
Also, am willing to do the thing that’s been suggested over and over in this thread, and do a Double Crux with you on the usefulness/uselessness of Double Crux, including doing the motions unilaterally while you do whatever you feel like. I could use more practice with Double Cruxing in a not-fully-cooperative environment, since it seems like a plurality of the important debates happen with people who aren’t willing to enter the Double Crux frame anyway.
You accuse me of using Stereotypes rather than Rigor, but I in turn accuse you of using Social Proof rather than Rigor, which I consider far more dangerous, because it leads to self-reinforcing information cascades. By reflexively characterizing all skepticism as hostile, you further reinforce this dynamic by creating a with-us-or-against-us atmosphere.
Yes, I don’t actually believe that ~1000 or so CFAR alumni self-reports represent enough evidence to overturn my initial opinion. There are also many thousands of smart people, including even ones with medical degrees, who endorse homeopathy, but I wonder if you would as forcefully reject a similar Stereotype-based dismissal of that. I’d be very happy to see some real rigor, but I’m not aware of any such from CFAR that I would actually trust to bring back a negative result if the same procedure were used on homeopathy enthusiasts. (And by the way, in 2014 Anna Salamon said CFAR was “supposed to be doing better science later,” meaning better than self-reports and personal impressions. How much later is later?)
I never gave any indication that my comment represented anything but my own personal impression, or that it somehow trumps the experiences of others. But I’m going to keep pointing out that I see the emperor wearing fewer clothes than he claims for as long as I continue to see it that way, and I consider this to be an explicitly prosocial act. I don’t gain anything personally by this, and these contentious posts are actually fairly stressful for me to write, but I consider it worth it to try to push back against your open advocacy of credulousness and protect a rationalist community like Less Wrong from evaporative cooling.
I have not in fact attended a CFAR workshop and don’t intend to, for reasons that might get me in trouble with the “Sunshine Regiment” if I were to explain, but I have read the posts explaining Double Crux and have even found it useful once or twice. I’m happy to try it with you if you’d like.
I disagree with your claim that I “reflexively characterized all skepticism as hostile.” I have reread my own comment and I do not think that’s a fair or accurate synopsis.
I believe you are overstating your claim that “there are also many thousands of smart people, including even ones with medical degrees, who endorse homeopathy” and disagree with the attempt to draw an equivalency there (I both do not think the situations are analogous and don’t think you could actually find thousands of people in the intersection of “smart” and “endorses homeopathy”).
My main point is that it looks to me like you are skeptical of everything but your own impressions, and that Less Wrong should be the sort of place where people actually take heuristics and biases literature seriously, and take the Sequences seriously, and are aware of how fallible their own thinking and impression-making mechanisms are, and how likely it is that they’re being influenced by metacognitive blindspots, and take deliberate and visible steps to compensate for all of that by practicing calibration, using reference class forecasting, taking the outside view, making concrete predictions, seeking falsification rather than confirmation, etc. etc. etc.
In short, I wasn’t asking you to be less skeptical, I was asking you to add one more person to your list of people you’re skeptical of—yourself.
I’m attempting to point out that your claim “Double Crux seems like an attempt to solve a problem that almost never happens” seems to have been outright falsified—even if your homeopathy analogy holds, homeopaths aren’t necessarily hypochondriacs, and I would trust the reports of homeopaths who are saying “I am experiencing this-or-that physiological distress which requires some form of treatment” or “I am having this-or-that medical problem which is lowering my quality of life” without reference to their thoughts on what would fix it. It does not seem that you are updating away from “the problems that Double Crux purports to solve are rare” and toward “those problems are rare in my experience but reliably common for large numbers of people.”
I’m attempting to point out that your statement “I can see one of three things happening” was made in such a way as to imply that there are no other likely things that might happen, and that you’re considering your ability to generate hypotheses or scenarios or predictions to be sufficient and near-complete. It’s like when MythBusters says “Well, we failed to recreate claim X, and therefore claim X is impossible!” That whole paragraph was setting up strawmen and false dichotomies and ignoring giant swaths of possibility.
I didn’t feel like you really addressed any of the thrust of my previous reply, which was something like “If I, clone of saturn, were wrong about Double Crux, how would I know? Where would I look to find the data that would disconfirm my impressions?”
It does not look, based on your comments thus far, like you’re sincerely asking that question. Again, that’s fine—it could simply be that it’s not worth your time. Or it could be that you’re asking that question and I just haven’t noticed yet, and that’s fine because it’s in no way your job to appease some rando on the internet, and my endorsement is not your goal.
But the issue I have, at least, has nothing to do with your opinion on Double Crux. It has to do with the public impression you’re leaving, of how you’re forming and informing it. You’re laying claim to explicitly prosocial behavior on the basis of continuing skepticism, and I simply don’t believe you’re living up to the ideals you think you are. I think Less Wrong has (or ought to have) a higher standard than the one you’re visibly meeting. The difference between solving the Emperor’s Clothes problem and just being a contrarian is evidence and sound argument.
Is this ad hominem? Reasonable people could say that clone of saturn values ~1000 self-reports way too little. However it is not reasonable to claim that he is not at all skeptical of himself, and not aware of his biases and blind spots, and is just a contrarian.
“If I, clone of saturn, were wrong about Double Crux, how would I know? Where would I look to find the data that would disconfirm my impressions?”
Personally, I would go to a post about Double Crux, and ask for examples of it actually working (as Said Achmiz did). Alternatively, I would list the specific concerns I have about Double Crux, and hope for constructive counterarguments (as clone of saturn did). Seeing that neither of these approaches generated any evidence, I would deduce that my impressions were right.
What makes you think describing why you personally won’t go to a workshop would get you in trouble?
I suspect I’m already being more confrontational than you’d prefer, and I don’t want to further wear out my welcome, or take the risk of causing unnecessary friction, by bringing up any other potentially negative points not directly related to CFAR’s rationality content or Double Crux. Should I take it that I was being unnecessarily cautious?