You, Your Biases, and The Undoing Project

Consider this description.

“Linda is 31 years old, single, outspoken and very bright.  She majored in philosophy.  As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.”

Which of the descriptions below do you think is more likely?
1. Linda is a bank teller
2. Linda is a bank teller and is active in the feminist movement

If you selected number 2 – that Linda is a bank teller and is active in the feminist movement – then you would be like the vast majority of people, regardless of their education level. You would also be wrong.

If you look at the options without dwelling on the description that preceded them, you will quickly see that the idea that option 2 could be more probable than option 1 is totally illogical.  Expressed as a Venn diagram, “Linda as a bank teller” is a huge circle, and “Linda as a bank teller and an active feminist” can only be a smaller circle within it.  It’s impossible for option 2 to be more probable than option 1.
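The conjunction rule behind that Venn diagram can be checked with a quick simulation. The probabilities below are invented purely for illustration – the point is only that the “and” category can never end up larger than its parent category:

```python
import random

random.seed(0)

# Hypothetical population: each person is (is_bank_teller, is_feminist).
# The 5% and 30% figures are made up purely for illustration.
population = [
    (random.random() < 0.05, random.random() < 0.30)
    for _ in range(100_000)
]

tellers = sum(1 for teller, _ in population if teller)
feminist_tellers = sum(1 for teller, feminist in population if teller and feminist)

p_teller = tellers / len(population)
p_both = feminist_tellers / len(population)

print(f"P(bank teller)              = {p_teller:.4f}")
print(f"P(bank teller AND feminist) = {p_both:.4f}")

# The conjunction can never be more probable than either conjunct alone.
assert p_both <= p_teller
```

Whatever numbers you plug in, every “feminist bank teller” is counted inside the “bank teller” circle, so the assertion at the end always holds.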

This common mistake is one of many cognitive quirks we humans have that cause us to make errors in judgment.  This particular one is an example of the representativeness heuristic, where we judge probability by how closely something matches a mental stereotype – a shortcut to a conclusion that seems likely, but is wrong. Like many groundbreaking cognitive bias insights, this one was the work of the psychology world’s dynamic duo: Amos Tversky and Daniel Kahneman, whose years of collaboration are the subject of the latest book by Michael Lewis, The Undoing Project.

I have probably read a number of articles that referenced Kahneman and Tversky’s research over the years without really registering their names. However, when Kahneman wrote a book, Thinking, Fast and Slow, that made much of their academic research accessible to the general-interest reader (me!) a few years ago, I took notice. Here’s a video that highlights some of the main themes in Kahneman’s book.

Michael Lewis is one of the rare authors whose written grocery store list I would consider purchasing.  He doesn’t shy away from complicated topics – you might know him from some of his books that have been turned into big movies – Moneyball, The Blind Side, and The Big Short are examples.  His writing is smart, sharp, occasionally hilarious (I particularly recommend his book “Boomerang”) and fun to read.

Tversky and Kahneman were both odd – something not altogether uncommon among the brilliant. Each was considered an intellectual giant by the academics around him, and each was prone to asking annoying questions of the two-year-old variety: “Why?”  Their pairing would not have been obvious, but each brought something the other lacked and needed.

Many of their insights have become so well known that they seem like common knowledge at this point. We all understand that a pollster can get radically different results simply by changing how a question is “framed”.  We also know that our minds default to the information that is most “available” to us, and in the process make stupid mistakes.  Here’s an example of how Daniel and Amos demonstrated this “availability heuristic” at work (from Lewis’s book, as was the “Linda” example above):

“In four pages of a novel (about 2,000 words), how many words would you expect to find that have the form _ _ _ _ i n g (seven-letter words that end with “ing”)?  Indicate your best estimate by circling one of the values below:

0   1-2   3-4   5-7   8-10   11-15   16+

Then they put to those same people a second question:  How many seven-letter words appeared, in that same text, of the form _ _ _ _ _ n _?  Of course (of course!) there had to be at least as many seven-letter words with n in the sixth position as there were seven-letter words ending in ing, as the latter was just one example of the former.  People didn’t realize that, however.  They guessed, on average, that the 2,000-word text contained 13.4 words ending in ing and only 4.7 words with n in the sixth position.  And they did this, Amos and Danny argue, because it was easier to think of words ending in “ing”.  Those words were more available.  People’s misjudgment of the problem was simply the availability heuristic in action.”
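The logic of that quiz can be checked mechanically. Here’s a short Python sketch (the sample sentence is invented for illustration): every seven-letter word ending in “ing” automatically has an n in the sixth position, so the second count can never be smaller than the first.

```python
# A made-up sample sentence, used only to demonstrate the subset relation.
text = (
    "The meeting was running late and nothing was working, so the "
    "panther abandoned its element and went walking instead."
)

words = [w.strip(".,").lower() for w in text.split()]
seven_letter = [w for w in words if len(w) == 7]

# Words of the form _ _ _ _ i n g (end in "ing").
ending_ing = [w for w in seven_letter if w.endswith("ing")]

# Words of the form _ _ _ _ _ n _ (n in the sixth position).
n_in_sixth = [w for w in seven_letter if w[5] == "n"]

print(ending_ing)  # every one of these also appears in the list below
print(n_in_sixth)  # the "ing" words, plus others such as "element"
```

Because the first set is contained in the second, `len(n_in_sixth)` is always at least `len(ending_ing)` – which is exactly what the quiz-takers failed to see.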

Biases such as availability, representativeness, and confirmation (where we focus only on data that confirms our previously held beliefs, rather than data that contradicts them) are clearly at play in today’s discussions about “fake news” and information bubbles.  Over the past few millennia, we humans have also developed a hyper-focus on threats – which was a good thing back when we were naked and living among wild animals, but which now leads us to overestimate the frequency of airplane crashes or terrorist attacks based on the news.

As Tversky and Kahneman showed, our minds often find ruts and shortcuts that divert us away from the truth rather than leading toward it.  Staying aware of these biases takes constant vigilance.

Good luck!