Unconscious Bias: what is it and what can we do?
What is bias?
Our brains are constantly bombarded with information. To manage it all, the brain looks for patterns and takes shortcuts. Biases are the result of these shortcuts.
A bias is a tendency, inclination, or prejudice toward or against an idea, object, group, or individual. Biases can be positive or negative and are often based on stereotypes, rather than actual knowledge of an individual or circumstance. Whether positive or negative, such cognitive shortcuts can result in prejudgments that lead to rash decisions or discriminatory practices.
People are naturally biased — they like certain things and dislike others, often without being fully conscious of their prejudice. Bias is acquired at a young age and it is often a result of one’s upbringing.
Implicit biases are pervasive and robust. Everyone possesses them, even people with avowed commitments to impartiality such as judges!
Implicit and explicit (conscious) biases are generally regarded as related but distinct mental constructs. They are not mutually exclusive and may even reinforce each other. Some research, however, suggests that implicit attitudes may be better than self-reported explicit attitudes at predicting and influencing behaviour, particularly when that behaviour is automatic and spontaneous.
The implicit associations we hold arise outside of conscious awareness; therefore, they do not necessarily align with our declared beliefs or even reflect stances we would explicitly endorse.
We generally tend to hold implicit biases that favour our own ingroup. However, research has shown that in certain cases we can hold implicit biases against our ingroup (e.g., black individuals can have biases against black people, women can have a gender bias). This categorization (ingroup vs. outgroup) is often automatic and unconscious.
Implicit biases have real-world effects on our behaviour. For example, they can affect hiring decisions.
Education does not protect against biases! No relationship has been found between education and unconscious bias, suggesting that even highly educated individuals are susceptible to bias.
…and some good news? Implicit biases are malleable; therefore, the implicit associations that we have formed can be gradually unlearned and replaced with new mental associations.
“Implicit biases come from the culture. I think of them as the thumbprint of the culture on our minds. Human beings have the ability to learn to associate two things together very quickly — that is innate. What we teach ourselves, what we choose to associate is up to us.”
- Dr. Mahzarin R. Banaji, quoted in Hill, Corbett, & Rose, 2010, p. 78
Bias Blind Spot
Everyone has some form of unconscious bias. It’s part of what makes us human. But when you examine our understanding of this — things get a little peculiar.
For example: what percentage of the population do you think is more biased than the average person?
Research has shown that:
Most people have no idea how biased they actually are…
Most people believe the people around them are more biased than they themselves are.
“The most telling finding was that everyone is affected by blind spot bias — only one adult out of 661 said that he/she is more biased than the average person.”
This is the bias blind spot. Our ability to perceive bias in others is actually pretty good. Our ability to perceive bias in ourselves is generally dreadful… This leads to bias that can damage the way we work and an inability to tackle it — because we can’t see it.
Why is this a problem?
The bias blind spot can be incredibly problematic. When we operate from within our blind spot, research shows we are more likely to reject the input of peers and experts on a subject, and we are least likely to benefit from education and training concerning our particular bias.
This can be explained rationally too. Society teaches us that bias is a bad thing — even the word bias has negative connotations. Most of us would prefer not to think of ourselves as the kind of people who do bad things. This leads us to believe that we must be rational and that our actions and judgments are accurate and free from biases. However, this blind spot doesn’t stop us from seeing the flaws (biases) in others.
Awareness of our vulnerability to bias is the first step towards reducing our unconscious bias…
Examples of unconscious bias
YouTube's video upload app
When YouTube launched their video upload app for iOS, between 5 and 10 percent of videos uploaded by users were upside-down. This baffled the team. Were people shooting videos incorrectly? No. The early design was the problem. It was designed for right-handed users, but phones are usually rotated 180 degrees when held in left hands, which results in upside-down videos… Without realising it, they'd created an app that worked best for the almost exclusively right-handed developer team!
(Some) soap dispensers
You have probably seen this video showcasing a soap dispenser failing to recognise darker skin tones. Most likely, the soap dispenser uses an optic sensor to detect when a hand is placed underneath. Lighter skin reflects more light, so it activates the sensor; darker skin reflects less and can fail to trigger it, which results in discrimination. Similar issues have been reported with popular wearables.
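The suspected failure mode can be sketched as a simple reflectance threshold. This is an illustrative guess at the logic, not the dispenser's actual firmware, and all the numbers are made-up assumptions:

```python
# Minimal sketch of a threshold-based optical proximity sensor, the kind of
# logic suspected in the soap-dispenser example. The threshold and readings
# are illustrative assumptions, not real hardware values.

REFLECTANCE_THRESHOLD = 0.4  # assumption: calibrated against lighter skin only


def hand_detected(reflected_light: float) -> bool:
    """Return True when the reflected-light reading exceeds the threshold."""
    return reflected_light > REFLECTANCE_THRESHOLD


# Lighter skin reflects more of the sensor's emitted light than darker skin,
# so a threshold tuned only on the team's own hands excludes other users.
print(hand_detected(0.7))  # lighter skin tone: True, soap dispenses
print(hand_detected(0.3))  # darker skin tone: False, dispenser fails
```

Testing only on people who resemble the development team bakes the calibration bias directly into the product, which is exactly the argument for diverse user testing made below.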
How do we measure our unconscious bias?
Some researchers have utilised techniques such as facial electromyography (EMG) and cardiovascular and hemodynamic measures as physiological approaches to measuring implicit prejudices. One avenue of exploration focuses on physiological instruments that assess bodily and neurological reactions to stimuli, such as functional Magnetic Resonance Imaging (fMRI). These studies often focus on the amygdala, a part of the brain that reacts to fear and threat and also has a known role in race-related mental processes. Findings from these studies indicate that amygdala activity can provide insights into unconscious racial associations. However, unless we have access to unlimited resources and an MRI machine, we need an easier way to identify our biases…
The Implicit Association Test (IAT) measures attitudes and beliefs that people may be unwilling or unable to report. The IAT measures the strength of associations between concepts (e.g., black people, gay people) and evaluations (e.g., good, bad) or stereotypes (e.g., athletic, clumsy). The main idea is that making a response is easier when closely related items share the same response key. You can measure your implicit bias by taking the IAT here.
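The core idea behind IAT scoring can be sketched as a reaction-time comparison: responses are faster when paired concepts match an existing association. The snippet below is a simplified illustration only; the actual scoring algorithm (the improved D-score of Greenwald and colleagues) also filters trials and penalises errors, and every reaction time here is a made-up number:

```python
# Simplified sketch of IAT scoring. The D-score compares response times in
# "compatible" blocks (pairings that match an implicit association) against
# "incompatible" blocks, scaled by the pooled standard deviation. This omits
# the trial filtering and error penalties of the real scoring algorithm.
from statistics import mean, stdev


def d_score(compatible_ms: list[float], incompatible_ms: list[float]) -> float:
    """Difference in mean reaction time, scaled by the pooled standard deviation."""
    pooled_sd = stdev(compatible_ms + incompatible_ms)
    return (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd


# Hypothetical reaction times in milliseconds: slower responses in the
# incompatible block suggest a stronger implicit association.
compatible = [620, 650, 600, 640, 610]
incompatible = [780, 820, 760, 800, 790]
print(round(d_score(compatible, incompatible), 2))  # -> 1.85
```

A score near zero suggests no measurable preference; larger positive values suggest a stronger implicit association with the "compatible" pairing.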
What can we do to reduce it?
“The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.”
- Daniel Kahneman, “Thinking, Fast and Slow”
Experience and exposure to diversity: being exposed to individuals from diverse backgrounds can reduce implicit bias. In the workplace this would translate to blended teams. For example, using blended teams in panel interviewing can lead to an increase in the number of diverse hires.
Measure to change: Become aware of our biases. The Implicit Association Test measures associations between concepts, stereotypes, and evaluations. It measures attitudes and beliefs that people may be unwilling or unable to report. The IAT may be especially interesting if it shows that you have an implicit attitude that you did not know about.
UX teams should try testing their products with users from various backgrounds and groups.
Self-check techniques: Kristen Pressner discovered a method to self-check (a diagnostic!) for biases that she calls “Flip It to Test It.” Mentally flip whoever or whatever you’re talking about to test yourself. If the “flipped” result feels weird, you may have uncovered a bias.
Watch our triggers: The theories of ego depletion and decision fatigue show how different forms of mental tiredness can lead to increased automatic, and thereby bias-prone, decision making. Our cognitive resources are limited, and as the day goes on, they decline. So when we’re tired or hungry, our brains rely more on that unconscious, fast processing, which studies have shown is more prone to bias! A good practice, from my perspective, is not to make important decisions at the end of the day but to leave them until the following morning.
Use technology (but be careful!): a number of tools have been developed to check job adverts and webpages for hidden biases (example). We should keep in mind, however, that technology is subject to biases too.
“Blinding” oneself from knowing people’s demographics when making decisions: for example, hiding an applicant’s demographic information.
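Blinding can be as simple as stripping demographic fields from a record before reviewers see it. The field names below are hypothetical examples, and a real system would also need to scrub free text (names in cover letters, graduation years) that can leak demographic signals:

```python
# Minimal sketch of "blinding" an applicant record before review.
# The field names are hypothetical; real applicant data would need broader
# scrubbing, since free-text fields can also leak demographic information.

DEMOGRAPHIC_FIELDS = {"name", "gender", "age", "photo_url", "nationality"}


def blind_application(application: dict) -> dict:
    """Return a copy of the application with demographic fields removed."""
    return {k: v for k, v in application.items() if k not in DEMOGRAPHIC_FIELDS}


applicant = {
    "name": "A. Example",
    "gender": "f",
    "age": 34,
    "skills": ["python", "ux research"],
    "years_experience": 8,
}
print(blind_application(applicant))  # only skills and years_experience remain
```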
However, we need to be careful as certain interventions can increase bias…
In one study, brochures or primes telling people to “stop” prejudice and racism led to more prejudiced and racist responses, while asking people to consider the value of non-prejudice and open-mindedness led to less prejudiced and racist responses.