From the archive: Cultivating the craft of judgment

One of the benefits of being a TJ subscriber is full access to our decades-long archive of content – here we look back to a piece on unconscious bias from March 2013.

Jean Gomes explains the science behind unconscious bias.

As the speed, complexity and volatility of our world increase, the pressure on leaders to think effectively has moved to a new level. Thirty years of neuroscience and psychological research is helping to unravel the complexity of perception and decision-making, leading to an astonishing series of breakthroughs in understanding how we make choices. The implications of unconscious bias for helping leaders make better decisions are deeply significant, but applying them is more art than science.

Two groups of senior executives from the same leadership team are undergoing a short exercise that we’ve conducted across the world. Individuals in one group begin taking turns to recount an event in the previous few weeks that gives them reason to feel gratitude. Topics characteristically range from the achievements of their children and appreciation of their colleagues, to holidays.

Nearby, in a different room, the other group has been given a grubby-looking Tupperware box and asked to open it carefully. Inside is a teeming mass of maggots. Most of the group flinch, occasionally someone will let out a scream and, except for the occasional angler in the team, they all feel a degree of disgust. They are then invited, but not forced, to hold a handful of the fly larvae for 20 seconds. At least one or two people in most groups will do it, but even the mere act of confronting the maggots is enough to elicit strong feelings of revulsion.

We then switch without pause to another exercise. Each group is given the same briefing sheet. It describes a crime in which a five-year-old girl is permanently paralysed in the crossfire of a hold-up. Two young men have been found guilty and have shown no remorse. A victim impact statement from the girl’s mother is included.

A sentence range of three to 16 years is given. The groups are asked to act as the judge and pass sentence. Around halfway through the session, an additional piece of information is given: the men had been victims of abuse in a children's home.

We silently observe the discussion.

The bottom line result of our experiments is that the maggot groups give sentences that are between 20 and 30 per cent harsher than those of the gratitude groups. The qualitative nature of their discussion is significantly less reflective or questioning. They typically discount the information about abuse as being immaterial or even fail to discuss it. The ratio of statements to questions is much higher and they typically arrive at their judgement 30 to 40 per cent faster. Group consensus is also considerably stronger.

So what’s going on? It seems that a completely unrelated experience, such as touching, or just contemplating touching, the maggots, has created a need that the brain is unconsciously trying to meet. That need is to justify our feelings of physical revulsion – our gut reactions. It fulfils that need by sentencing the criminals more harshly than if the group had been mulling over the facts in a neutral mental or emotional state. Even mild feelings of revulsion can have a significant impact on our judgment.

The cultural norm in business is that we’re largely rational animals; that our judgment has been honed through our education, through decision support systems, technology and processes. The reality is that irrationality and emotions play a huge role in our choices. A wide range of research across brain science and psychology into self-deception and irrationality has been grouped together under the term ‘unconscious bias’. Increasingly, these findings are solidifying into a practical body of ideas and tools to help us reconcile the paradoxes of the conscious and unconscious minds.

Getting executives to move from being mildly interested to a state of ‘this is critical to understand’ means going into some difficult territory. The findings of unconscious bias highlight a multitude of ways in which we can make bad choices and have a negative impact on others, but an underlying mechanism – self-deception – frequently cuts in to prevent us from accepting them. Self-deception can be described as true information being preferentially excluded from our consciousness and replaced by false information. The impact on a leader’s performance is considerable, covering a wide scope of issues:

  • how we assess risk
  • the acceptance of advice
  • the ability to process facts objectively
  • giving or withholding support to others
  • how we recruit or overlook talent
  • how we promote people or hold them back.

How our mental short-cuts kill high-performance choices

One of the best-established performance findings of recent years is the positive influence of team diversity on success. Across problem-solving, conflict resolution and creativity, teams with diverse racial, ethnic and cultural backgrounds are, on almost every measure, more effective than homogeneous groups. But even when leadership teams confront these facts openly and seek to increase the variety of talent, many still fail.

The challenge may lie in understanding the difference between unwillingness to change and inability to do so. Issues of diversity are tough to bring into the open. Confronting prejudice of any kind in oneself provokes feelings of shame and defensiveness. The bias effects of self-deception and stereotyping may be playing a substantial role, holding team diversity at bay.

Our brains continually seek to automate decision-making processes in order to maintain efficiency. Stereotyping is one way in which the brain does this, using an instantaneous, unthinking generalisation filing process: ‘that’s one of those people, situations or things’. At its root is the belief that certain positive or negative traits and behaviours are inherent in certain types of people.

For example, we might unknowingly link a thin body to disciplined thinking, or assume that nationality predicts motivation or personality traits: an Asian candidate as inherently disciplined, or a German one as intrinsically rational. ‘Parataxis’ is the effect on our judgment of meeting someone who seems familiar. If you meet a new person who strongly reminds you of somebody you either liked or loathed, the chances are you will project those characteristics onto the stranger.

We assume our organising principles are accurate and, under pressure, we may believe that our stereotypes are reality because our thoughts and words breathe life into them.

Another part of this equation can be self-deception. If asked ‘how much do you drink?’, someone who consumes a bottle of wine every evening may intentionally report having only two glasses because they are ashamed to acknowledge the true amount. Or they may simply not answer the question, regarding it as a personal matter. These are examples of being unwilling to disclose an acknowledged fact. But it is also possible that a person who drinks a bottle of wine a day will report drinking only two glasses because they genuinely believe that is all they drink. This is self-deception, and it is pervasive in how we perceive our own choices.

Awareness – the operating system of better judgment

The unwilling-unable difference is the distinction between deliberately hiding something from others and unconsciously hiding something from yourself.

Awareness is the starting point for evolving past the inherent limitations in judgment that unconscious bias creates. One way to do this is to take the Implicit Association Test developed by researchers Brian Nosek, Anthony Greenwald and Mahzarin R. Banaji at Harvard. Go to the following website to try one of the several available: https://implicit.harvard.edu/implicit/demo/selectatest.html

The Implicit Association Test makes it possible to pierce both of these types of hiding. It measures implicit attitudes and beliefs that people are either unwilling or unable to report.

Self-deception is an evolutionary trait that enables us to make rapid decisions. Its role is central to our fight-or-flight response as well as to the habit-forming processes of the brain and the divide between our rational judgments and irrational reactions.

Lying and self-deception are, it seems, inextricably linked. If you accept that lying gives you an advantage, it’s a logical next step to see that being able to deceive oneself makes lying more effective. It reduces our chances of being caught – the social equivalent of camouflage. Self-deception releases more brainpower to focus on our goals of evasion or getting what we want, because we don’t have to deal with the contradiction of the truth. Of course, when we believe our own lies, we also avoid confronting their consequences. A man might say ‘I love you’ to achieve a short-term goal but blank out the longer-term responsibilities of commitment. And when we’re unaware of our lies, we’re freer to develop convincing counter-arguments, unencumbered by the truth.

Don’t believe your own arguments

Do you ever find yourself in the middle of a conversation thinking ‘I’m making a great argument here – but I don’t know if I actually believe what I’m saying’? Researchers Hugo Mercier and Dan Sperber believe that human reasoning evolved to help us argue and get what we want, rather than to find the truth.1

In their account, reasoning isn’t about improving knowledge or making better decisions; it’s about persuasion. They think of our ability to argue as an evolutionary advantage.

As we started to use language, we developed the ability to argue convincingly. Since the most convincing lines of reasoning aren’t always the most logical (or true), self-deception co-evolved – to be able to lie to ourselves so we can lie better to others.

The idea that we evolved to argue, sometimes at the expense of truth, may seem to offer a gloomy view of human reasoning, but the flip side is that argumentative minds are good at puncturing other people’s faulty reasoning. And bias set against bias produces surprisingly smart results: lying acts as a stimulant, prompting thinking to be tested through debate.

The Wason Selection Task is one of the most famous tasks in the psychology of reasoning – a logic puzzle that most people get wrong. In one version, you are shown a set of four cards placed on a table, each of which has a number on one side and a coloured back. The visible faces of the cards show three, eight, red and brown.

The challenge is this: Which card(s) must you turn over in order to test the truth of the proposition that if a card shows an even number on one face, its opposite face is red?

The correct response is to turn the cards showing eight and brown, but no other card. In thousands of tests, less than 10 per cent of participants get it right.
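For readers who like to see the logic made explicit, here is a minimal sketch (not from the original article, assuming Python 3) that brute-forces the puzzle: for each visible face it asks whether any possible hidden face could falsify the rule, and therefore whether the card needs turning.

# A brute-force check of the Wason Selection Task.
# Rule under test: "if a card shows an even number on one face, its opposite face is red."
# A card needs turning only if some possible hidden face could falsify the rule.

visible = ["3", "8", "red", "brown"]

# Each number card hides a colour; each colour card hides a number.
hidden_options = {
    "3": ["red", "brown"],
    "8": ["red", "brown"],
    "red": ["3", "8"],
    "brown": ["3", "8"],
}

def violates(number, colour):
    # The rule is broken only when an even number is paired with a non-red back.
    return int(number) % 2 == 0 and colour != "red"

must_turn = []
for face in visible:
    for hidden in hidden_options[face]:
        number, colour = (face, hidden) if face.isdigit() else (hidden, face)
        if violates(number, colour):
            must_turn.append(face)
            break

print(must_turn)  # ['8', 'brown'] - the only cards that could falsify the rule

Run as written, the sketch prints eight and brown: the brown card matters because an even number could be hiding behind it, while the red card can never disprove the rule, whatever its hidden number.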

When Mercier and Sperber put people into groups of five or six, 75 per cent of the groups got the answer right by proposing ideas and revising them in light of criticism – the results had no correlation to IQ. Success was dependent on how effectively everyone in the group argued, not how smart they were.

Judgment errors

Imagine you’re holding a job interview. Sitting before you is a seemingly wonderful candidate for an important job in your company. Let’s call her Jill. She’s been through three rounds of interviews with your senior team and completed a batch of tests to confirm she’s as smart as her CV says and that she’s not harbouring some toxic personality flaw. The team really likes Jill and the recommendation is to hire her.

As CEO, you’re the final hurdle, sensing how she’ll fit into your culture and trying to look at her through clients’ eyes. As a lot of rational analysis has already been done, you’re unashamedly relying on gut instinct to guide you. Fifty minutes into the meeting, it’s all looking and feeling good. But… just as you’re wrapping up, Jill says something alarming in response to a fairly innocuous question. It jars with what you’ve heard and seen, and you’re left feeling uneasy about her judgment. You’re due in another meeting in ten minutes, so you can feel yourself closing down even though you know it’s not the right thing to do. You should extend the session by another 30 minutes, gently tease this out into the open and be clear about what you’ve just heard – but you don’t, and you give in to the immediate pressures of the day.

Later that afternoon, that little moment of deviance looms all-important in your analysis. The team is surprised and frustrated by your reaction, but together you work it through and hire the candidate. Fast-forward several years and Jill is a star performer. So, it’s a happy ending. But it could have so easily gone the other way.

‘Recency’ or ‘primacy’ bias may have been at work on your judgment. This is where our decisions are overly affected by the sequencing or timing of information. If a manager is presented with information about two sets of candidates with identical skills and experience, but where information is ordered such that one candidate has all the positive attributes sequenced first and the other the negative qualities first, the second candidate typically never makes it through to the next round of interviews.

This plays out in different ways, depending on the nature of the decision-making process. In longer ones, with several rounds of recruitment interviews, recency bias takes hold. We know we’re going to hear a lot of information throughout the process, so we withhold judgment; but what we hear last can eclipse everything else we discover.

Think about how many candidates have blown their chances by saying something out of line in the last five minutes, leaving you in no doubt they are the wrong person. If it had happened 30 minutes into the first interview, its effect would probably have been diluted or contextualised among the other hours of dialogue.

In a session where the decision-making process is much shorter, say a single interview, primacy bias takes hold. When time is short, you’re unconsciously looking for short-cuts to making the decision, so you grab hold of information rapidly – in effect you tend toward making the decision on the first or second piece of solid data – and filter information with a bias towards it confirming the decision you’ve already made. Here, first impressions can make or break a decision.

So, what’s the solution? The first step is to accept that there’s no such thing as totally rational, unbiased judgment – we are inherently biased. It’s a deep feature of life, enabling us to function. The second is to start paying more attention to what we’re doing. One simple lesson now being taught to judges to avoid being hijacked by their biases is to take better notes, paying particular attention to separating facts from emotions.

Bias plays a crucial role in the hiring and development of employees, so it’s unsurprising that teaching managers how to work with this subtle and ambiguous body of understanding is fast becoming more popular.

And one last thing: if you’ve read this and disagreed with it, you might be experiencing ‘reactance’ – the bias that creates an urge to do the opposite of what someone wants you to do, in order to resist a perceived attempt to constrain your freedom of choice. Or, if you’re thinking you’re above such things, the ‘bias blind spot’ might be at work in you – the tendency to see oneself as less biased than other people, or to be able to identify more biases in others than in oneself.

As leaders face ever more difficult choices, understanding unconscious bias provides one more powerful means of developing more resilient judgment skills.

Reference

1 Mercier, H and Sperber, D, ‘Why do humans reason? Arguments for an argumentative theory’, Behavioral and Brain Sciences: http://www.dan.sperber.fr/wp-content/uploads/Mercier-SperberBBS.pdf
