Logical Fallacies vs Cognitive Biases

Because skeptics often talk about cognitive biases alongside logical fallacies and other thinking errors, it’s easy to get the idea that a bias is just another mistake that you can learn to avoid.  This is a false impression that your brain will happily accommodate by ignoring its own biases while recognizing them in other people.  You may then get frustrated at others when you notice them succumbing to biased thinking, which may lead you to view them as ignorant, stupid or lazy.

It’s important to realize that a cognitive bias is different from a logical fallacy. With practice, we can learn to recognize and completely avoid mistakes of logic. This is not true of biases. Faults of logic come from how we think, so we can simply change our thinking to be more logical; biases, by contrast, arise from the very cognitive machinery that allows us to think. In short, we can’t process information without them.

A cognitive bias is not necessarily a thinking error. Biases can manifest as a sort of prejudice, but it’s best to think of them as thinking tendencies. Biases slant our thinking toward certain avenues and conclusions, and often they are useful. They are a result of the mechanisms the brain uses to quickly make sense of information and experiences. Each of us, every day, is in situations where quickly making sense of the world is a very useful ability. But, as with many things in life, quickness comes at the price of quality. While biases may offer us a quick and useful view of reality, that view will inevitably be distorted and incomplete. Biases are also subject to our brain’s penchant for being self-serving and overprotective. So, while biases may be practical in many situations, if your situation calls for important decisions, fair assessments, or accurate conclusions, they can be a big problem.

What makes biases particularly problematic is their insidiousness. Because biases are seamlessly ingrained into our cognitive architecture, they often do not feel to us like tendencies or prejudices. On the contrary, they often feel like wisdom or enlightenment. Your brain is set up to allow biases to masquerade as rationality, and as such, personal introspection is useless in helping us recognize them. A cognitive bias is a blind spot in your thinking and, just like the blind spot in your vision, it’s very hard to notice without it being pointed out. Further, once you do notice, there isn’t much you can do to avoid it. At best, you can try to recognize situations that are likely to trigger biased thinking, understand the mistake a bias is likely to lead to, the information it is preventing you from having, or the perspective it obscures, and attempt to mitigate the damage. This is easier to do with some biases than others, but preventing bias altogether is not an option. Indeed, being inclined to think that you can avoid a bias because you are aware of it is itself a bias: a bias within a bias.

Consider this quote from Daniel Kahneman, who has studied these biases and their effects probably more than any other researcher: “Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely…. And I have made much more progress in recognizing the errors of others than my own.”

So why learn about biases if we cannot prevent them? The answer is simple: to keep yourself from embracing them. While it may be very difficult, if not impossible, to see past our biases, we can resist the urge to run with them and eagerly allow them to distort our view. We can stop ourselves from falling back on them as a defense when our positions are challenged. We do not learn about biases so we can exile them from our thoughts, but so we can recognize them as a source of humility rather than fuel for arrogance. The more precisely we understand biases, the more effectively we can apply critical thinking and skepticism.
Now, all this doesn’t mean we are completely helpless and have no hope of ever clearly understanding the world. It just means that we have to reach outside of ourselves to do it effectively. This is one more reason why the scientific method is necessary if we want accurate answers about how reality operates. We need an outside process that attempts to avoid and account for the biases that human intellect cannot. One of the key functions of science is quality control, and this is a primary reason why anecdotal information and personal experience can never be used to trump scientific answers. If you disagree with something science says because your personal experience was different, then the proper channel for that disagreement is more science. Simply put, if you’re not disagreeing with science by using more science, then there is no way to know whether your disagreement is due to bias. This may seem unfair, but it is a rule enforced by nature, not by science.

It’s no coincidence that many effects and phenomena disappear when we view them through the lens of science. It’s because they never existed in the first place, and were merely an artifact of bias.