Be More Effective by Letting Your Cognitive Bias Run Free

A cognitive bias is a recurring illogical pattern of thought or judgment that appears in most people. That's a fancy way of saying it's something that people always seem to think about the wrong way, or a common mistake. An easy cognitive bias to observe is “the bandwagon effect”, which is believing something is reasonable or true simply because many other people think that it's reasonable or true.

There is a strong, but rarely explicitly made, suggestion regarding cognitive biases: we should attempt to eliminate them from our thinking. On the surface this seems like a very reasonable idea. After all, biases are illogical and lead to incorrect results. I have grave concerns about this approach to cognition because, in my experience, the correct result is not always the optimal one. For example, people consistently perceive terrain slopes as much steeper than they actually measure. It's a well documented cognitive bias1, and it's also present for a good reason: walking up even moderate slopes is generally a bad idea, and the bias keeps us safe.

Before we think of a particular cognitive bias as a bad thing or try to correct it, it's worth applying the principle of Chesterton's Fence: “...let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, ‘I don’t see the use of this; let us clear it away.’ To which the more intelligent type of reformer will do well to answer: ‘If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.’”2

Human minds are very effectively evolved. Medicine has slowly learned that cutting out seemingly unused organs, such as the appendix3 or the post-menopausal uterus4, is a bad idea. Our parts play helpful roles, even while they may also cause trouble. There is a pretty strong indication that we have a cultural bias towards fiddling with our bodies and natures.5 I think that the vast majority of our biases, like organs, are there for good reason, and that we owe them the same diligence as Chesterton's fence before attempting to correct them.

One example that has always bothered me is the Stanford marshmallow experiment. In this experiment, a child was offered a choice between one small reward provided immediately or two small rewards if they waited for a short period, approximately 15 minutes, during which the tester left the room and then returned. (The reward was sometimes a marshmallow, but often a cookie or a pretzel.) In follow-up studies, the researchers found that children who were able to wait longer for the preferred rewards tended to have better life outcomes, as measured by SAT scores, educational attainment, body mass index (BMI), and other life measures.6

This is a pretty straightforward case of hyperbolic discounting7, the tendency to prefer things that happen in the present over things that happen in the future. The question, of course, is “why do we have this bias?” In this case, there is a pretty obvious answer: the future is uncertain and unreliable. The old adage “A bird in the hand is worth two in the bush” is wired into us biologically because the future is never guaranteed.
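For readers who like the mechanics: the standard model of hyperbolic discounting values a reward A delayed by time D at A / (1 + kD), where k is an impatience constant. Here's a minimal sketch in Python; the value k = 0.1 is an arbitrary number picked purely for illustration, not anything measured.

```python
# Minimal sketch of the standard hyperbolic discounting model, V = A / (1 + k * D).
# The impatience constant k = 0.1 is an arbitrary value chosen for illustration.

def hyperbolic_value(reward: float, delay: float, k: float = 0.1) -> float:
    """Perceived present value of `reward` received after `delay` minutes."""
    return reward / (1 + k * delay)

# One marshmallow now versus two marshmallows after a 15 minute wait:
one_now = hyperbolic_value(reward=1.0, delay=0.0)     # 1.0
two_later = hyperbolic_value(reward=2.0, delay=15.0)  # 2 / 2.5 = 0.8
print(one_now, two_later)  # the delayed pair feels less valuable, so the child eats now
```

Notice that a child who has learned to expect broken promises is, in effect, carrying a larger k, and waiting stops being worth it even for short delays.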

What's interesting is that in a follow-up study, we learn that children who are given hints that the second marshmallow may not reliably materialize are much less likely to wait for it.8 Hyperbolic discounting actually scales itself with how trustworthy the experimenters appear to be. When experimenters are not artificially manipulating their displayed trustworthiness, I suspect the overwhelming determinant of how much hyperbolic discounting a child displays is how trustworthy their environment has been in the past, specifically with regard to the fulfillment of promises made by adults. I hypothesize a strong correlation between children being given a dependable, structured environment and children being given legitimate opportunities for success. Regardless of whether the promise of ice cream went unfulfilled because of poor communication, lack of respect for the child, neglect, poverty, or a sudden emergency, there is a correlation to be drawn with negative life outcomes. Parents who break promises to their children are either providing, or part of, a broader environment that is likely to correlate with worse life outcomes.9 An interesting variant on the marshmallow study would be to investigate how well a child's performance correlates with their parents already having started a college savings account for them.

A second unfairly maligned cognitive bias is “the bandwagon effect”; the usual advice is that one should not simply follow the crowd.10 The thing is, most of your friends probably exercise judgment you respect. If they do something, even something that doesn't make obvious sense to you, there is a good chance that they have a compelling reason. It goes further than that, though: there is a well studied concept called “the wisdom of crowds”11, in which groups seem to display a form of emergent intelligence on many subjects that is better than that of experts. It appears we are biologically wired to take advantage of this sort of emergent intelligence.

Fascinatingly, the bandwagon effect tells most of us exactly how to set our hyperbolic discounting bias. If you live in a developed country right now, the message from the crowd is “about 7% per year”. The entire society has decided that a reasonable amount to demand later, in return for forgoing value now, is about 7% more per year. If you know a reason why the thing might be riskier, you demand more.
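To put rough numbers on that (7% is this essay's round figure, not a precise market rate), compounding at 7% per year works out like this:

```python
# A quick sketch of compound discounting at the crowd's ~7% per year.
# The rate is the essay's round number, not a precise market figure.

RATE = 0.07

def future_value(amount_now: float, years: float) -> float:
    """What you demand later in exchange for forgoing `amount_now` today."""
    return amount_now * (1 + RATE) ** years

def present_value(amount_later: float, years: float) -> float:
    """What a payment `years` away is worth to you today."""
    return amount_later / (1 + RATE) ** years

print(future_value(100, 1))    # ~107.00: give up $100 now, demand ~$107 in a year
print(present_value(100, 10))  # ~50.83: $100 a decade out feels like about $50 today
```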

There are two other biases that interlock into something interesting, and they are key to the nature of motivation. One is “effort shock”12, the realization, usually arriving only after partial completion, that a task is much harder than it looked at the outset. The bigger the task, the more drastically one's estimate typically undershoots the effort required.

Effort shock is a very real cognitive bias, probably a blend of “the optimism bias” and “the overconfidence effect”. Anyone who professionally estimates the time and cost of projects will tell you that they actively account for it.13 For run-of-the-mill work, a seasoned pro who knows the pitfalls can often correct for it. When working on larger projects, or in unfamiliar territory, estimates are always going to be even more (cough) optimistic.

What's the upside of drastically underestimating how hard a new, large project is going to be? Why do we have this bias? I suspect we have it because it gets us up and moving, involved in projects. A truly clear-sighted view of things is horribly demotivating; effort shock helps to correct for a “fear of failure” bias. I know that I, at least, would set out to tackle fewer large goals if I honestly, and in my gut, felt just how big they were.

The good news is that once effort shock gets us started, and we are a few weeks or months into our project, a second cognitive bias kicks in: “the sunk cost fallacy”14. This bias aids persistence in general, and many people instinctively try to use it to their advantage. All those people who sign up for a full-year gym membership are hoping that having already paid will invoke the sunk cost fallacy and make them want to work out.

This combination of biases works remarkably well for getting things done. If you can get 20% of the way into a project, particularly one that doesn't have good partial returns, on the delusion that you will be done pretty soon, then not wanting to waste all the work you've put in will carry you farther into the project. Near the beginning you are likely still suffering from a reduced version of effort shock (“Well, that was pretty hard, I bet I'm 90% of the way there”), and towards the end the sunk cost fallacy becomes very powerful indeed.

There's more good news at the end, too. In addition to effort shock, there is often “reward shock”15, the discovery that what you built is even more valuable than you thought it would be. I'll leave deciding whether reward shock is a bias at all, and if so, why it might be a useful one, as an exercise for the reader.

Many folks try to eliminate as many cognitive biases as they can.16 The implicit model is that there is a Truth with a capital “T” out there, and that it is obscured or distorted through the lens of our mind because of cognitive bias. The idea is that if we could just grind the cognitive biases down into a perfect lens, we could see the Truth as it really is and all of our problems would be solved. Or, at the very least, we would be able to act optimally.

There is a slight possibility that this is true within specific subsets of science, like physics. It's even conceivable that applying the rule “always polish away cognitive bias” might aid in the pursuit of a very specific goal, like, say, manufacturing paperclips.17 I'm pretty sure, however, that even a paperclip maximizer would be more effective with some biases.18

When one turns to broad decisions such as “How should I live my life?”, or really to any form of goal-setting that isn't just a step in execution, the elimination of biases becomes a truly absurd proposition. A wise man once said19 of the way the mind works, “It's cognitive bias all the way down”. I'm pretty sure he was right: the entire thing we call consciousness, the entire process of thinking, is just a massive interlocked pile of cognitive biases jostling with each other.