Executives are becoming increasingly aware of how cognitive biases – non-conscious influences and mental quirks – shape thinking and decision-making. Biases are often helpful: they enable people to make quick, efficient judgments and decisions with minimal cognitive effort. But they can also prevent executives from looking at an issue objectively and from considering valuable options and perspectives, with the result that poor decisions get made. In response, more organisations have been trying to educate their executives about cognitive biases in the hope of making them more ‘self-aware’ on the job. Unfortunately, there is very little evidence that educating individuals about biases does anything to reduce their influence – simply because biases occur outside conscious awareness, so people are literally unaware of them as they happen.
This was the subject of an article entitled ‘Beyond Bias’ in the Autumn 2015 issue of strategy + business (authors: Heidi Grant Halvorson and David Rock, both consultants attached to the NeuroLeadership Institute and Columbia Business School in the US). The article makes the interesting point that, because the negative effects of bias cannot easily be overcome by individuals, it falls to executives collectively – meaning teams, departments and whole organisations – to deal with the problem. More specifically, based on their research, the authors present a useful model – the SEEDS model – which groups the 150 or so known common biases into five categories according to their underlying cognitive nature: Similarity, Expedience, Experience, Distance, and Safety. Their central message is that organisations and senior executives need to be aware of where and when the different types of bias can occur, and to put deliberate strategies in place proactively to compensate for their effects, so allowing executives to make more effective decisions.
Here’s a quick summary of the most common types of bias within the five categories proposed in the article:
i) Similarity biases: – Ingroup bias – perceiving people who are similar to you more positively; and Outgroup bias – perceiving people who are different from you more negatively. The best way to mitigate similarity biases is deliberately to widen the range of people you associate with, and to seek out and emphasise any connections with those who appear different.
ii) Expedience biases: – These are mental short-cuts that help people make quick and convenient decisions, but which have distorting effects. As pointed out in Daniel Kahneman’s classic book Thinking, Fast and Slow, the human brain has two parallel decision-making systems: ‘System 1’, which evaluates things quickly based on intuition and easily recalled personal experience but often leads to poor decisions, and ‘System 2’, which examines things at greater length through extended, conscious and systematic thinking. Dealing with Expedience biases is about getting people to use more System 2 thinking. Common examples of Expedience bias are:
Confirmation bias (seeking and finding evidence that confirms your beliefs and ignoring evidence that does not); Availability bias (making a decision based on the information that comes to mind most quickly); Anchoring bias (relying heavily on the first piece of information given – the anchor – when considering a decision); Halo effect (letting someone’s positive qualities in one area influence overall perception of that individual); and Representativeness bias (believing that something that is more representative is necessarily more prevalent).
Expedience biases tend to crop up in decisions that need concentrated effort, and tend to be exacerbated when people are in a hurry, tired or under pressure. To mitigate them, organisations need a culture that encourages or prompts people to step off the easier cognitive path and challenge themselves and others with more System 2 thinking. It also helps to break a problem down into its component parts, to routinely require a wider group of people – including outsiders – to be involved in making a decision, and to use techniques such as a mandatory ‘cooling off’ period (e.g. an overnight pause) before a final decision is made.
iii) Experience biases: – These are about people tending to assume that what they themselves have experienced or seen is all there is to consider, and that all of it is accurate. Common examples include: Blind spot (identifying weaknesses/biases in other people but not in yourself); Fundamental attribution error (believing your own errors or failures are due to external circumstances, but that others’ errors are due to intrinsic factors like character); Hindsight bias (seeing past events as having been predictable in retrospect); Illusion of control (overestimating your influence over external events); Egocentric bias (weighing information about yourself disproportionately in making decisions); and False consensus effect (overestimating the universality of your own beliefs or opinions).
Experience biases are particularly pernicious when they breed misunderstandings among people who work together, so they respond well to deliberate organisational steps to mitigate them. For example, it is a good idea to set up practices for routinely seeking opinions from people who are not on the team or project. Other techniques include revisiting ideas after a break so as to see them in a fresh, more objective light, or setting aside time to look at yourself and your message through other people’s eyes.
iv) Distance biases: – These involve attaching greater importance to things that are nearer in terms of space, time or perceived ownership. Examples: Endowment effect (valuing something you own more highly than you otherwise would – expecting others to pay more for it than you would pay yourself); and Affective forecasting (judging your future emotional states on the basis of how you feel now).
v) Safety biases: – This type of bias involves weighting the avoidance of a bad result or loss more heavily than the achievement of a gain. Some examples: Loss aversion (making a risk-averse choice when the expected outcome is positive, but a risk-seeking choice to avoid negative outcomes); Framing effect (basing a judgment on whether a decision is presented as a gain or as a loss, rather than on objective criteria); and Sunk costs (finding it hard to give up on something after investing time or money in it, even though the investment cannot be recovered).
In conclusion, the article suggests four interesting principles for managing bias: i) Bias is universal – it is a general human predisposition to make judgments quickly and easily; ii) It is difficult to manage bias in the moment you are making a decision – meaning that leaders need to design compensating practices and processes in advance; iii) When designing bias-countering processes, involve in particular people who are naturally more systematic or analytical, rather than those who rely more on gut instinct or intuition; and iv) Individual cognitive effort is not enough: there needs to be an organisation-wide culture in which people continually remind one another that the brain’s default setting is egocentric, and recognise that better decisions come from stepping back to seek out a wider variety of perspectives and views.