I think what V_V is saying is: show me the evidence.
Show me the evidence that CFAR workshop participants make better decisions after the workshop than they do before.
Show me the evidence that the impact of CFAR instruction has higher expected humanitarian benefit dollar-for-dollar than an equivalent donation to SENS, or pick-your-favorite-charity.
I don’t think they do. But we weren’t comparing CFAR to SENS or other charities endorsed by effective altruists; I was contesting the claim that CFAR is comparable to religions and mumbo jumbo:
Is it correct to compare CFAR with religions and mumbo jumbo?
I think it is.
I mean, they’re literally basing their curricula on cognitive science. If you look at their FAQ, they give examples of the kinds of scientifically grounded, evidence-based methods they use for improving rationality:
While research on cognitive biases has been booming for decades, we’ve spent more time identifying biases than coming up with ways to evade them.
There are a handful of simple techniques that have been repeatedly shown to help people make better decisions. “Consider the opposite” is a name for the habit of asking oneself, “Is there any reason why my initial view might be wrong?” That simple, general habit has been shown to be useful in combating a wide variety of biases, including overconfidence, hindsight biases, confirmation bias, and anchoring effects [see Arkes, 1991; Arkes, Faust, Guilmette, & Hart, 1988; Koehler, 1994; Koriat, Lichtenstein, & Fischhoff, 1980; Larrick, 2004; Mussweiler, Strack, & Pfeiffer, 2000].
Most of us sometimes fall prey to the planning fallacy, where we underestimate the amount of time it’s going to take us to complete a project. But one strategy that’s been shown to work, and which we teach in our workshops, is “reference class forecasting,” which entails asking yourself how long it’s taken you, or people you know, to complete similar tasks [see Buehler, Griffin, & Ross, 2002].
A third technique which has strong empirical backing (though it is not typically classified under “de-biasing” research) is cognitive therapy, which has successfully improved participants’ depression and anxiety by the use of rational thinking habits like asking oneself, “What evidence do I have for that assumption?” Cognitive therapy in particular is an encouraging demonstration that simple rational thinking techniques can become automatic and regularly used, to great effect.
So I just don’t know where someone can be coming from when they suggest that CFAR’s methods are no better grounded in evidence than religion and mumbo jumbo. CFAR grounds its methods in evidence as a rule, while religion and mumbo jumbo are grounded in other things, like universal quantum consciousness and gods.