Last week, we talked about the planning fallacy, a cognitive bias that renders us unable to correctly estimate how long our plans will take. This week, we're taking a look at six other cognitive biases that affect how we work and how we react to team members.

A cognitive bias is a deviation in judgement that makes us draw conclusions about other people and situations in an illogical fashion. This happens when people create their own so-called "subjective social reality" based on their own perceptions. In turn, this perception, and not the objective events, shapes our behaviour in the social world, including the workplace.

Reactive devaluation

Reactive devaluation makes us undervalue a proposal when it originates from someone we see as an adversary.

It was proposed in 1991, when Lee Ross and Constance Stillinger conducted an experiment asking pedestrians whether they would stand behind a drastic bilateral nuclear arms reduction program. It turned out that when subjects thought the proposal came from Ronald Reagan, 90% said it would be positive or neutral for the U.S. However, when they thought the proposal came from Mikhail Gorbachev, only 44% thought it was good for the U.S.

Even when subjects thought the proposal came from policy analysts, fewer of them (80%) believed the arms reduction program was favourable than when they thought it came from Ronald Reagan.

Reactive devaluation makes us judge a proposal based on its originator. And while in most workplaces things are not as intense as the Cold War, reactive devaluation can make us overlook some great ideas. Often, it might not be because we necessarily see the originator as an adversary, but because we're not as amicable with that team member as we are with others. This naturally makes us more open to the ideas of those we see as friends. Pay attention to this bias and its close cousin, confirmation bias.

Bandwagon effect 

The bandwagon effect tells us that beliefs and actions spread among people: the more people have already adopted a belief, the more probable it is that another individual will adopt it too. In other words, as more people believe or do something, others are more likely to 'hop on the bandwagon', regardless of the evidence.


Solomon Asch's conformity experiments show us that people jump on the bandwagon either because they directly prefer to conform, or because they derive information from others. In his 1951 experiments, Asch had groups of people take part in a number of perceptual tasks. However, only one person in each group was an actual subject; the rest were actors. In some of the tasks, the actors would all give the same wrong answer, making the subject, who always answered last, feel pressured and question himself. At the end of the experiment, the subject was interviewed, giving us more insight into what happens when someone yields to the group and when someone doesn't.

Asch's results are in fact stunning: in the tasks where the actors wouldn't all give the same answer, the error rate was less than 1%. But in the tasks where all actors would give the same wrong answer, one third of all responses were incorrect, with 75% of participants giving an incorrect answer to at least one question. These incorrect answers often matched the answers of the actors.

The ugly truth is that most people are lazy. We don't want to think for ourselves, and we often think that if someone else has already adopted something, it can't be bad. What happens is that either our perception or our judgment is derailed. This causes an inner conflict that is detrimental to our well-being. In Asch's interviews with the subjects, those who yielded to the group said they had suspected the right answer, but tried to forget about it, or could not summon the confidence to go against the majority. On the other hand, the independent subjects, although they experienced doubt, were happy and relieved at the end of the experiment.

The bandwagon effect causes us to not stand up for our decisions, for what we believe in. What's more, it actually alters what we believe in. It makes us conform and not want to take the road less travelled. It curbs originality and proclaims death to innovation. It makes us forget why we do the things we do.

Functional fixedness

Functional fixedness is why we can't see objects past their obvious use. It limits us to seeing an object only in the way it is traditionally used. We see a hammer as an object for banging in nails, but when we need a paperweight, we can't see the hammer as a potential paperweight.

It was demonstrated in the candle problem experiment, wherein subjects were asked to fix a candle to the wall so that wax wouldn't drip on the table below. They were given a box of thumbtacks, a book of matches, and, of course, a candle. While some subjects tried out some quite creative approaches, many were not able to see that they could tack the box to the wall and then place the candle in it. That is, until the thumbtacks were taken out of the box, making it easier to recognize the box as a holder for the candle, and not just for the thumbtacks.


The interesting thing, though, is that five-year-olds don't suffer from functional fixedness, because at that age any goal achieved with the use of an object is equal to any other goal. By age seven, children tend to treat the intended purpose of an object as special. And while we might think that this is a result of an industrial, object-focused society, it turns out that functional fixedness is cross-cultural.

Functional fixedness doesn't fully kill creativity. As we saw with the candle problem, some of the subjects came up with quite creative ideas, including using melted wax to attach the candle to the wall. What it does is kill our efficiency, by making us blind to the ways we can use the objects around us to achieve our goals. So, what can you achieve with the paper clips on your desk?

Ambiguity effect 

The ambiguity effect affects decision making when there is a lack of information. People tend to select options for which the probability of a favourable outcome is known over options where that probability is unknown. As a rule of thumb, people avoid options where information is missing.

An example is the way people invest money. Even though volatile investments such as stocks and funds are likely to provide a significantly higher return, investors might prefer the safer investment in bonds, for which the return is known.

This bias doesn't only make you risk-averse (and if you're not taking risks, you're probably not learning much); it might also make you reluctant to adopt new practices in the workplace.

Negativity bias 

According to the negativity bias, people are more likely to recall unpleasant memories than pleasant ones, and they act in ways that will help them avoid such events. A study by John Cacioppo at Ohio State University found that our brains react more strongly to negative stimuli, as shown by higher electrical activity in the cerebral cortex when subjects were presented with negative images.

Another study, by Prof. Teresa Amabile from Harvard Business School, asked more than 200 professionals from different industries and companies to fill out daily diaries over a number of months, describing one thing that stood out during their day. After studying more than 12,000 entries, Amabile found that the negative effect of a setback on happiness was more than twice as strong as the positive effect of making a step forward on meaningful work. The effect of a setback on increasing frustration was also over three times stronger than the effect of progress on decreasing frustration.

This can affect your work in two ways. First, it will make you remember your bad days more than your good days. While creative discontent is useful and necessary in making any kind of work better than it is, focusing on the negative can also block you from making progress. And let's be honest, no one likes to be grumpy all the time!

The second way this can hurt your work is that you will act in ways that will make you avoid negative events. Let's say one of your university professors tactlessly put down one of your ideas. If you felt really bad about it, you might become less inclined to take the risk of openly sharing your ideas. This can hurt you in the long run because of your inability to see past that one negative event.

Bias blind spot 

If you read this article thinking that all these biases don't apply to you, you might suffer from bias blind spot. This bias makes us think that while biases do apply to others, we are immune to them.


The term was coined by Emily Pronin, a social psychologist at Princeton University, and her colleagues. In their experiments, Pronin and her colleagues asked subjects to make judgments about themselves and about other subjects. The subjects demonstrated standard biases. However, when the biases were explained to them and they were asked how the biases might have affected their judgement, the subjects rated themselves as less susceptible to bias than others.

Pronin and her colleagues' explanation is that when people assess themselves for bias, they look inward, searching their own thoughts and feelings for bias. But biases operate unconsciously, so people take this introspection as reliable evidence that they, unlike others, are immune to bias. Which is why you might be able to see the above biases in your team members, but not in yourself.

How can we get rid of a cognitive bias? 

The key to getting rid of these biases is oh so hard, and yet so simple: consciousness and practice. Since biases mostly occur due to automatic processing, debiasing aims to reduce them by encouraging people to use controlled processing through increased awareness.

Simply being told about one's biases rarely actually helps to overcome them. For example, people who exhibit overconfidence when reporting above-average performance exhibit the same bias when asked how good they are at overcoming overconfidence.

There is also the growing field of cognitive bias modification and Cognitive Bias Modification Therapy (CBMT), a sub-group of therapies based on modifying cognitive processes without medication or talk therapy. These techniques are technology-assisted and don't require clinician support.

Have you caught yourself being guilty of any of these cognitive biases, or only of the bias blind spot? We'd love to hear from you in the comments below.