Thanks for writing this, this is an interesting area, and improving decision
making is a worthwhile goal.
However, I am somewhat skeptical about the extent to which people actually want
to improve their decision-making process. In LW circles I can see that desire,
but in the world at large it often seems more important to win the argument than
to be right; there is some work to do before most people would even want to use
these tools.
If we focus on LW readers, who are (hopefully) more interested in truth
seeking, it would be interesting to see whether there has been a discussion where
one of these tools could really have made a difference. Personally, I usually
don't have much trouble following the causal steps—the difficulty usually lies
in missing the background knowledge needed to weigh the arguments.
As a programmer I know the urge to come up with a solution to a problem,
preferably in the form of an algorithm. But is there evidence that these
tools actually help in realistic situations?