Cognitive biases are systematic ways in which we all act irrationally. There's a lot packed into that sentence, so let's unpack it.
What is rational behaviour?
A lot of people expect rational behaviour to be something you see on Star Trek: cold, emotionless, never falling in love, and being a buzzkill to all things fun. Maybe that’s rational for some people, but that’s not what rational means when economists use it. Rational just means doing whatever is most consistent with your values (which probably isn’t being cold, never falling in love or buzzkilling the fun things). In the prisoner’s dilemma discussed previously, it’s only rational to defect, thereby screwing over your buddy, when: 1) you’re purely self-interested and don’t give a damn about them, 2) you’re unlikely to be in a similar situation again, and 3) there aren’t other constraints (like a mob boss who’ll have you hit if you rat out your friend). These conditions mostly aren’t true in real life.
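Condition (1) can be made concrete with a toy payoff table. The sentence lengths below are illustrative assumptions chosen for this sketch, not figures from any source; the point is just that a purely self-interested player who minimises only their own sentence ends up defecting no matter what their buddy does.

```python
# A minimal sketch of the one-shot prisoner's dilemma, with illustrative
# payoff numbers (years in prison, so lower is better for you).

# payoffs[(my_move, their_move)] = (my_years, their_years)
payoffs = {
    ("cooperate", "cooperate"): (1, 1),  # both stay silent: light sentences
    ("cooperate", "defect"):    (3, 0),  # I stay silent, they rat me out
    ("defect",    "cooperate"): (0, 3),  # I rat them out, they stay silent
    ("defect",    "defect"):    (2, 2),  # we both rat each other out
}

def best_selfish_move(their_move):
    """A purely self-interested player minimises only their own sentence."""
    return min(["cooperate", "defect"],
               key=lambda my_move: payoffs[(my_move, their_move)][0])

# Whatever the other player does, defecting gives the selfish player a
# shorter sentence, which is why defection only counts as 'rational'
# under the narrow conditions listed above.
for their_move in ["cooperate", "defect"]:
    print(their_move, "->", best_selfish_move(their_move))
# prints: cooperate -> defect
#         defect -> defect
```

Change either assumption (care about your buddy's sentence too, or play the game repeatedly) and defection stops being the obvious choice.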
This brings us to misconception number two: rational does not mean purely self-interested. It just means acting in a way that’s consistent with what you value. If you’re a psychopath and only care about yourself, it would be rational for you to screw over your friends for personal gain. But, most people aren’t psychopaths, so it’s rational to help your friend out.
What are cognitive biases?
The word ‘bias’ in this context has nothing to do with how your fourth grade dance teacher would always cast her daughter in the lead role. A cognitive bias is a systematic error in thinking - part of our brain's hardwiring - that causes us to act irrationally in predictable, repeatable ways. Most of us are unaware of these subconscious biases, yet because they're so widespread, we often all end up making the same irrational mistakes.
Some cognitive biases presumably served our hunter-gatherer ancestors well. They likely enabled faster decision-making when speed was more valuable than accuracy, as we saw in the heuristics entry. Biases and heuristics are like two sides of the same coin, and bias carries the more negative connotations of when a heuristic goes wrong.
Examples of cognitive biases
We’ve already seen a few examples in previous weeks, like optimism bias and hindsight bias, but I saved the best for this section.
Confirmation bias is, perhaps, the mother of all biases. It’s our tendency to seek out and selectively interpret information that confirms our prior beliefs, while burying our heads in the sand when the evidence contradicts them.
Conservatism bias is our inclination to stick to our guns instead of sufficiently revising our beliefs when presented with new information.
The mere exposure effect is the tendency for repeated exposure to an idea or thing to make us more likely to accept or react favourably to it, even without any additional information. This is why we tend to grow fonder of the same advertisements the more often we see them.
Omission bias is the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions). So taking $5 from a kid seems a lot worse than deciding to keep $5 for yourself instead of giving it to him. This bias is also one of the reasons why it can be a good idea to commit to something before you have to. University students seem to find it easier than people who are already earning to take the Giving What We Can Pledge to donate 10% of their future income to effective charities, because they don’t ‘miss’ what they never had. Taking the pledge once you’re used to a certain income feels like a much more drastic sacrifice. See also the Copenhagen Interpretation of Ethics.
Ingroup bias is our tendency to treat people we view as similar to us more favorably and to treat ‘outsiders’ with prejudice. This manifests itself not just as sexism, racism and other prejudices, but also as intolerance of, say, political opponents.
Social desirability bias is our tendency to respond in the way we think others want us to, rather than in a way that reflects what we actually think or do.
Identifiable victim bias is our irrational tendency to be more moved by a story about one identifiable person than by statistics describing a similar effect on a large number of people.
The Peltzman effect describes how we take more risks when we feel safer. After seat belts were first introduced, motorists actually drove faster and closer to the car in front of them.
The plot thickens: it’s only our friends and colleagues who are biased; me personally, no way! This is called the bias blind spot (where we can point out everyone’s biases except our own), and it’s one way that learning about biases can actually harm someone. Armed with knowledge of all these biases, you’ll be tempted to become fallacy (wo)man (but don’t do that!). You might also fall into the trap of using them to better rationalize the decisions you’ve already made, rather than to make better decisions or get closer to the truth.
And there are, like, hundreds more. Check out this massive list on Wikipedia.