Heuristics and Biases are the ways human reasoning differs from a theoretical ideal agent, due to reasoning shortcuts that don’t always work (heuristics) and systematic errors (biases).
See also: Affect Heuristic, Confirmation Bias, Fallacies, Predictably Wrong, Rationality, Your Intuitions Are Not Magic, Bias, Heuristic
“Cognitive biases” are those obstacles to truth which are produced, not by the cost of information, nor by limited computing power, but by the shape of our own mental machinery. For example, our mental processes might be evolutionarily adapted to specifically believe some things that aren’t true, so that we could win political arguments in a tribal context. Or the mental machinery might be adapted not to particularly care whether something is true, such as when we feel the urge to believe what others believe in order to get along socially. Or the bias may be a side-effect of a useful reasoning heuristic. The availability heuristic is not itself a bias, but it gives rise to them; the machinery uses an algorithm (give things more evidential weight if they come to mind more readily) that does some good cognitive work but also produces systematic errors.
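The availability heuristic's algorithm can be sketched in a few lines of toy code. All the names and numbers below are hypothetical illustrations, not real statistics: the point is only that judging frequency by ease of recall works when recall tracks reality, and fails systematically when something else (such as vivid media coverage) drives recall.

```python
# Toy sketch of the availability heuristic (all numbers are invented).
# True annual frequencies of two hazards (illustrative only):
true_frequency = {"shark attack": 1, "falling furniture": 30}

# Vivid news coverage makes some events far easier to call to mind:
recall_ease = {"shark attack": 50, "falling furniture": 2}

def availability_estimate(event):
    """Judge how common an event is by how easily examples come to mind.
    Cheap and often good enough -- but systematically wrong whenever
    ease of recall is driven by something other than actual frequency."""
    return recall_ease[event]

# The heuristic ranks shark attacks as the bigger danger...
assert availability_estimate("shark attack") > availability_estimate("falling furniture")
# ...even though the (toy) ground truth is the reverse.
assert true_frequency["shark attack"] < true_frequency["falling furniture"]
```

The systematic error here is not random noise: every judgment made through `availability_estimate` is skewed in the same direction, which is exactly what distinguishes a bias from mere imprecision.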
Our brains are doing something wrong, and after a lot of experimentation and/or heavy thinking, someone identifies the problem verbally and concretely; then we call it a “(cognitive) bias.” Not to be confused with the colloquial “that person is biased,” which just means “that person has a skewed or prejudiced attitude toward something.”
A bias is an obstacle to our goal of obtaining truth, and thus in our way.
We are here to pursue the great human quest for truth: for we have desperate need of the knowledge, and besides, we’re curious. To this end let us strive to overcome whatever obstacles lie in our way, whether we call them “biases” or not.
It’s also useful to know the kinds of faults human brains are prone to, in the same way it’s useful to know that your car’s brakes are a little gummy (so you don’t sail through a red light and into an 18-wheeler).
The sequence Predictably Wrong offers an excellent introduction to the topic for those who are not familiar with it.
Wait a minute… fallacies, biases, heuristics… what’s the difference?
While a bias is always wrong, a heuristic is merely a shortcut that may or may not give you an accurate answer. Just because you know complex mathematical methods for precisely calculating the flight of objects through space doesn’t mean you should use them to play volleyball. Which is to say, heuristics are necessary for actually getting anything done. But because they are only approximations, they frequently produce biases, and that is where the problem lies. On LessWrong, “fallacy” is often used to mean much the same thing as “bias,” though in traditional usage a fallacy is an identifiable error in an argument, while a bias is an error-producing tendency of the mind itself.
A good example of a heuristic is the affect heuristic—people tend to guess unknown traits about people or things based on the perceived goodness or badness of known traits. In some circumstances this is a useful shortcut—you may like to assume, for instance, that people who are good singers are more likely to be good dancers, too. However, it also frequently produces (unconscious) biases—a bias towards believing that people who are tall and good-looking have better moral character, for instance.
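That mechanism can be made concrete with a toy model. The function and numbers below are hypothetical: the sketch simply averages the "goodness" of known traits to guess an unknown one, ignoring whether the traits are actually correlated—which is precisely how a halo-effect bias arises.

```python
# Toy model of the affect heuristic / halo effect (invented numbers).

def affect_guess(known_traits):
    """Predict an unknown trait score (0-10 scale) as the mean of the
    known trait scores -- a shortcut that pays no attention to whether
    the known traits actually predict the unknown one."""
    return sum(known_traits.values()) / len(known_traits)

# Height and looks are irrelevant to moral character...
person = {"height": 9, "looks": 8}
guess = affect_guess(person)
print(guess)  # 8.5 -- ...yet they inflate the guess all the same.
```

When the known and unknown traits really do correlate (singing and dancing, perhaps), the shortcut earns its keep; when they don't, the same code produces a systematic, directional error.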
So if I learn all the biases, I can conquer the world with my superior intellect?
Well, no. If it were that easy we wouldn’t need a community initially dedicated to overcoming bias (Overcoming Bias being the name of the blog which this website grew out of). Unfortunately, learning about a bias alone doesn’t seem to improve your ability to avoid it in real life. There’s also the (major) issue that knowing about biases can hurt people. So instead of being purely focused on removing negative habits, there is now a major focus at LessWrong on implementing positive habits. These are skills such as how to update (change your mind) the correct amount in response to evidence, how to resolve disagreements with others, how to introspect, and many more.
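"Updating the correct amount" has a precise benchmark: Bayes' rule. The sketch below is a minimal illustration with invented numbers—it shows how a prior belief combines with the strength of the evidence to give a posterior that is neither a total conversion nor a stubborn non-update.

```python
# Minimal sketch of updating "the correct amount" via Bayes' rule.
# All probabilities below are illustrative, not from the article.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a hypothesis after seeing evidence."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# A hypothesis starts at 10%; we then see evidence that is four times
# as likely if the hypothesis is true (0.8) as if it is false (0.2).
posterior = bayes_update(prior=0.10, p_evidence_if_true=0.8, p_evidence_if_false=0.2)
print(round(posterior, 3))  # 0.308 -- a substantial but not total update
```

The correct update lands at about 31%, not 80%: the evidence is strong, but the low prior still matters. Over- or under-shooting this figure is exactly the kind of miscalibration the positive-habit skills aim to train away.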