Is it correct to compare CFAR with religions and mumbo jumbo?
I think it is. CFAR could be just a more sophisticated type of mumbo jumbo, tailored to appeal to materialists. Just because they are not talking about gods or universal quantum consciousness doesn’t mean that their approach is any more grounded in evidence. Maybe it is, but I would like to see some replicable study about it. I’m not going to give them a free pass just because they display the correct tribal insignia.
Just because they are not talking about gods or universal quantum consciousness doesn’t mean that their approach is any more grounded in evidence.
This comment confuses the heck out of me. Of course that’s not why their approach is more grounded in evidence. The reason the methods they’re promoting are grounded in evidence is that the particular brand of self-help they’re peddling is specifically about rationality, which is itself, to a large degree, about what is grounded in evidence.
Maybe it is, but I would like to see some replicable study about it.
What are you looking for, exactly? The cognitive biases that CFAR teaches about and tries to help mitigate have been well known in social psychology and cognitive science for decades.
If you’re worried that trying to tackle these kinds of biases could actually make things worse, then yes, we know that knowing about biases can hurt people, and obviously that’s something CFAR tries to avoid. If you’re worried that trying to improve rationality isn’t actually a particularly effective form of self-help, then yes, we know that extreme rationality isn’t that great, and trying to find improvements that result in real practical benefits is part of what CFAR tries to do. But the fact that they’re specifically tackling areas of self-improvement that stem from well-documented, genuinely real-world phenomena like cognitive biases makes them clearly different from approaches centred around less sound ideas, for example, gods or universal quantum consciousness. Or at least, I would have thought so anyway.
I think what V_V is saying is: show me the evidence.
Show me the evidence that CFAR workshop participants make better decisions after the workshop than they do before.
Show me the evidence that the impact of CFAR instruction has higher expected humanitarian benefit dollar-for-dollar than an equivalent donation to SENS, or pick-your-favorite-charity.
I don’t think they do, but I don’t think we were comparing CFAR to SENS or other charities endorsed by effective altruists. I was contesting the claim that CFAR is comparable to religions and mumbo jumbo:
Is it correct to compare CFAR with religions and mumbo jumbo?
I think it is.
I mean, they’re literally basing their curricula on cognitive science. If you look at their FAQ, they give examples of the kinds of scientifically grounded, evidence-based methods they use for improving rationality:
While research on cognitive biases has been booming for decades, we’ve spent more time identifying biases than coming up with ways to evade them.
There are a handful of simple techniques that have been repeatedly shown to help people make better decisions. “Consider the opposite” is a name for the habit of asking oneself, “Is there any reason why my initial view might be wrong?” That simple, general habit has been shown to be useful in combating a wide variety of biases, including overconfidence, hindsight biases, confirmation bias, and anchoring effects [see Arkes, 1991; Arkes, Faust, Guilmette, & Hart, 1988; Koehler, 1994; Koriat, Lichtenstein, & Fischhoff, 1980; Larrick, 2004; Mussweiler, Strack, & Pfeiffer, 2000].
Most of us sometimes fall prey to the planning fallacy, where we underestimate the amount of time it’s going to take us to complete a project. But one strategy that’s been shown to work, and which we teach in our workshops, is “reference class forecasting,” which entails asking yourself how long it’s taken you, or people you know, to complete similar tasks. [see Buehler, Griffin, & Ross (2002)]
A third technique which has strong empirical backing (though it is not typically classified under “de-biasing” research) is cognitive therapy, which has successfully improved participants’ depression and anxiety by the use of rational thinking habits like asking oneself, “What evidence do I have for that assumption?” Cognitive therapy in particular is an encouraging demonstration that simple rational thinking techniques can become automatic and regularly used, to great effect.
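As an aside, reference class forecasting, as described in the quoted FAQ, is simple enough to sketch in code: instead of estimating a new task from scratch, you forecast from the empirical distribution of how long similar past tasks actually took. Here is a toy illustration (all task durations and quantile choices are made up for the example):

```python
# Toy sketch of reference class forecasting: forecast a new task's
# duration from the empirical distribution of similar past tasks,
# rather than from an inside-view guess. All numbers are invented.

def reference_class_forecast(past_durations, quantile=0.5):
    """Return the given quantile of past task durations (in days)."""
    ordered = sorted(past_durations)
    # Simple nearest-rank quantile; fine for a small reference class.
    index = min(int(quantile * len(ordered)), len(ordered) - 1)
    return ordered[index]

# Durations (in days) of similar projects you or your peers completed.
past = [4, 6, 7, 9, 14]

median_estimate = reference_class_forecast(past, quantile=0.5)
pessimistic_estimate = reference_class_forecast(past, quantile=0.9)

print(median_estimate)       # the typical case
print(pessimistic_estimate)  # a buffer against the planning fallacy
```

The point of the technique is precisely that the outside-view numbers, not your optimistic inside-view estimate, drive the forecast.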
So I just don’t know where someone is coming from when they suggest that CFAR’s methods are no better grounded in evidence than religion and mumbo jumbo, when CFAR grounds its methods in evidence as a rule, while religion and mumbo jumbo are grounded in other things, like universal quantum consciousness and gods.
What are you looking for, exactly? The cognitive biases that CFAR teaches about and tries to help mitigate have been well known in social psychology and cognitive science for decades.
It’s also well known in cognitive science that mitigating biases is hard. Having studies that prove that CFAR interventions work is important for the long term.
Keith Stanovich (who’s a CFAR advisor) received a million-dollar grant ($999,376, to be exact) from the John Templeton Foundation to create a rationality quotient test that measures rationality the way IQ tests measure intelligence.
If CFAR workshops work to increase the rationality of their participants, that score should go up.
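If such a test existed, checking whether “that score should go up” is a standard paired pre/post comparison. A minimal sketch of what that measurement looks like (the scores are invented, and a real study would also need a control group to rule out retest effects):

```python
# Minimal sketch of a paired pre/post comparison for workshop scores.
# The scores below are invented for illustration; a real study would
# also need a control group to rule out retest and selection effects.

def mean_paired_difference(pre_scores, post_scores):
    """Average per-participant change in score (post minus pre)."""
    assert len(pre_scores) == len(post_scores)
    diffs = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(diffs) / len(diffs)

pre = [98, 105, 110, 95, 102]    # hypothetical pre-workshop scores
post = [101, 104, 115, 99, 103]  # hypothetical post-workshop scores

print(mean_paired_difference(pre, post))
```

A positive mean difference alone isn’t proof, of course; you’d still want a significance test and a control group before concluding the workshop caused the change.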
trying to find improvements that result in real practical benefits is part of what CFAR tries to do.
The fact that you try doesn’t mean that you succeed, and various people in the personal development field also try to find improvements that result in real practical benefits.
But the fact that they’re specifically tackling areas of self-improvement that stem from well-documented, genuinely real-world phenomena like cognitive biases makes them clearly different from approaches centred around less sound ideas, for example, gods or universal quantum consciousness.
In his book Willpower, psychology professor Roy Baumeister argues that the idea of God is useful for raising willpower. Mormons have also been found to be healthier.
If CFAR workshops work to increase the rationality of their participants, that score should go up.
That is an extremely misleading sentence. CFAR cannot give Stanovich’s test to their students because the test does not yet exist.
It’s only misleading if you take it out of context.
I do argue in this thread that CFAR is a promising organisation. I didn’t say that CFAR is bad because they haven’t provided this proof.
I wanted to illustrate that meaningful proof of effectiveness is possible and should happen in the next few years.
The fact that CFAR has been unable to provide such proof so far, because the test isn’t yet available, doesn’t mean that there’s proof that CFAR manages to raise rationality.
I also don’t know the exact relationship between Stanovich and CFAR, or to what extent his involvement in CFAR is more than having his name on the CFAR advisor page. Giving CFAR participants a bunch of questions that he considers potentially useful for measuring rationality could be part of his effort to develop a rationality test.
The test being publicly available isn’t a necessary condition for a version of the test being used inside CFAR.