Briefing Paper: Unconscious Bias


Overview

Human actions and decisions are influenced by a wide range of cognitive biases. This briefing paper introduces the concept of unconscious bias (also known as implicit bias), explains how it can affect our day-to-day decisions and behaviour, and suggests practical ways to identify our biases and keep them in check.

How can biases influence us?

Different biases mean that we might, for example, be inclined to believe a particular view because many others hold it (the "bandwagon effect"); or tend to listen only to information that confirms our preconceptions ("confirmation bias"); or believe strongly that something will have a positive effect simply because that is our ingrained expectation (the "placebo effect").[i] These are just a few examples of the many biases that can influence us.

People sometimes act knowing that they are affected by a certain type of bias, but more often we act without realising which biases are shaping our behaviour in the moment. A prime example is the "Dunning-Kruger effect", whereby individuals who perform poorly at a task frequently rate themselves as high performers because they lack the skills and awareness to recognise their incompetence.[ii] This also explains why, when a person starts to learn more about a particular topic, they begin to realise the extent of their ignorance: the more you learn, the more you discover there is to be learned.

What is unconscious bias?

The realisation and acknowledgement of our own shortcomings is a helpful place to start when introducing unconscious bias. The human body typically sends around 11 million bits of information to the brain per second, yet the conscious mind can process only around 50 bits per second, which means that our unconscious mind is constantly doing far more than we can acknowledge.[iii] Although this default processing mode means that humans are able to deal with complex matters quickly, it can also have less desirable outcomes, such as prejudiced decision-making or snap judgments that discriminate.

"Unconscious bias is when we make judgments or decisions on the basis of our prior experience, our personal deep-seated thought patterns, assumptions or interpretations, and we are not aware that we are doing it." 

Descriptions of unconscious bias often start with a reference to the "fight or flight" instinct in humans.[v] We instinctively react very quickly as a survival mechanism, and this instinct carries over into everyday stressful situations where a quick decision has to be made. In these situations, quick decisions are usually made on the basis of our individual predisposed biases. These biases start forming at a very young age and are conditioned by what we have seen and experienced in life up to that point.

Ingroups and outgroups

In moments where quick thinking is required, we make instinctive assessments of others based on "ingroups" and "outgroups". An ingroup is a group to which a person belongs, or thinks they belong. An outgroup is a group to which a person does not belong, or thinks they do not belong.[vi] By making these simple social categorisations we effectively divide the world into "us" (the ingroup) and "them" (the outgroup). The desire to associate with "people like us" is hardwired, and it is natural to show some level of favouritism towards them. If we are to make decisions that don't unfairly affect the people who are not like us, we need to become conscious of the social categorisations we are automatically making and the embedded stereotypes that frame our thinking. What we experience daily in our environment and culture also feeds into the way we view people: run an image search on the internet for the word "nurse", or the term "CEO" (Chief Executive Officer), and consider what type of person appears in the majority of results.

See how we interact with ingroups and outgroups in this video from PwC

Examples of how unconscious bias affects decisions and attitudes

The Discrimination (Jersey) Law 2013 defines a set of protected characteristics, and unconscious bias can lead to people being treated unfairly on the basis of those characteristics.[vii] The following three examples show how unconscious bias can affect judgments relating to race, gender and sexual orientation:

Race: An analysis of Canadian employment data showed discrimination against applicants with Asian (Chinese, Indian or Pakistani) names. Computer-generated résumés (CVs) were submitted in response to job adverts requesting university-educated applicants. CVs bearing an Asian name were at a notable disadvantage: they were 28 percent less likely to be called for interview than those with an Anglo name, even when all qualifications were the same and of Canadian origin (as opposed to qualifications gained in another country).[viii]

Gender: A frequently cited Harvard study looked at the effect of blind auditions on the selection of musicians for orchestras. After hiring practice changed from musicians being handpicked by the conductor to auditions held behind a screen that concealed the identity and gender of each candidate, the percentage of female musicians hired increased from 6 percent to 21 percent over a 23-year period.[ix]

Sexual orientation: Research from the U.S. in 2018 by the Human Rights Campaign Foundation found that although there is broad social acceptance of the LGBTQ (lesbian, gay, bisexual, transgender, queer) community, biases against LGBTQ workers still exist.[x] Forty-six percent of LGBTQ workers are closeted at work, and the top reasons for not being open about sexual orientation and gender identity at work are:

  • The possibility of being stereotyped: 38%
  • The possibility of making people feel uncomfortable: 36%
  • The possibility of losing connections or relationships with coworkers: 31%
  • "Others might think I will be attracted to them just because I am LGBTQ": 27%

These are just three specific examples of the impact of unconscious bias. It is also important to consider what effect unconscious bias may have on our perception of people affected by other protected characteristics under the Discrimination (Jersey) Law 2013, such as disability or age. Beyond those characteristics, much more could be affecting our judgment: socio-economic background, place of birth, educational background, weight, style of clothing or hair, favourite sports team or type of music. Opinions can be formed on the basis of these categories from a photograph, a social media profile, or a CV, without ever meeting the individual to whom they belong.

Intersectionality

Kimberlé Crenshaw, a lawyer and academic, coined the term "intersectionality" in 1989.[xi] Intersectionality describes the oppression experienced by someone whose identity or lived experience is shaped by an intersection (in very basic terms, a mix) of social categories. For example, a white woman may encounter sexism. A black man may encounter racism. A black woman may encounter the compounded oppressions of both sexism and racism. It is important to be aware of the potential effect of intersectionality when dealing with unconscious bias because people's social categories usually do intersect; humans are multi-faceted beings, and our experience of the world is shaped by how others interpret those facets.

Watch a brief video introduction to Intersectionality here

How can we identify our hidden biases?

Take a test: The Project Implicit team at Harvard University created the Implicit Association Test, which is freely accessible and takes no more than 15 minutes to complete.[xii] The test is designed to measure attitudes and beliefs that we might be unwilling or unable to report. By measuring how people make automatic associations between concepts and evaluations or stereotypes, the test shows how an individual's explicitly stated beliefs may not tally with what they implicitly (or unconsciously) believe.

Take time: When making decisions it is vital to allow time to consider the reasons behind the decision and to reflect on whether it was arrived at fairly. This is not easy to do. The Nobel prize-winning psychologist Daniel Kahneman wrote about "thinking fast" and "thinking slow", naming these modes "System 1" and "System 2" thinking.[xiii]

System 1 thinking is what happens in our subconscious, and it accounts for around 95 percent of our thinking. This is a vital point to acknowledge, and it is particularly important in the context of recruitment decisions. Consider a candidate who appears well suited to the job on paper and interviews well, but about whom there is somehow a gut feeling that something isn't quite right. That gut feeling is created by intuitive, subconscious System 1 thinking. A recruitment scenario like this is where slow, rational System 2 thinking is more appropriate, so that gut feelings which are in fact prejudice in disguise can be examined and set aside through rational thought.

Watch Daniel Kahneman talk about thinking fast versus thinking slow.

Listen to peers and colleagues: People are good at recognising others' biases, but few can acknowledge their own so easily. This is known as the "bias blind spot".[xiv] The blind spot means that we are more biased than we either realise or are prepared to admit, and that our biases are more easily identified by our peers. In a working environment, the bias blind spot can be mitigated by encouraging openness to constructive criticism and by asking challenging questions, for example:

  •  "What was the selection criteria for this panel?"
  •  "What can we do to ensure a diverse point of view?"
  •  "Let's invite some different people next time?"

Making a lasting improvement with holistic change

If we take proactive steps to identify our hidden biases, we can begin to work out how best to mitigate their potential negative consequences by "outsmarting" our default reactions and decisions.[xv] It is widely acknowledged, however, that awareness of unconscious bias in isolation is not enough; it is only the first step towards positive change. Awareness must be supported by identifying and addressing any systemic and structural issues within an organisation that serve to perpetuate biases, such as unfair policies, differences in opportunity, or inequitable treatment of people.[xvi]

Conclusion

Although humans are hardwired to favour "people like us", we are also capable of recognising and challenging that behaviour. We need the self-awareness to accept that our default mode is to process information quickly in our unconscious minds, and we need to make a conscious effort to pause and consider the possible consequences of that default way of thinking. It might seem relatively easy to acknowledge a bias held against one particular social category, but it is critical to remember that people are multi-faceted, with a variety of identity attributes and lived experiences that intersect to form the nuanced individuals we are. Developing the ability to consider the potential repercussions of our decision-making through the prism of people's intersectionality is an important step forward. We can build on this by actively supporting colleagues to work together to identify, question, acknowledge, and continuously improve not just ourselves but also the systems and structures in which we exist.

Bibliography

[i] https://www.mentalfloss.com/article/68705/20-cognitive-biases-affect-your-decisions

[ii] https://link.springer.com/article/10.3758/s13423-017-1242-7

[iii] https://www.britannica.com/science/information-theory/Physiology

[iv] https://www.st-andrews.ac.uk/media/human-resources/equalitydiversity/training/RS%20unconscious-bias-briefing-2015.pdf

[v] https://www.health.harvard.edu/staying-healthy/understanding-the-stress-response

[vi] http://www.holah.karoo.net/tajfestudy.htm

[vii] https://www.jerseylaw.je/laws/revised/Pages/15.260.aspx#_Toc2753715

[viii] http://www.hireimmigrants.ca/wp-content/uploads/Final-Report-Which-employers-discriminate-Banerjee-Reitz-Oreopoulos-January-25-2017.pdf

[ix] https://gap.hks.harvard.edu/orchestrating-impartiality-impact-%E2%80%9Cblind%E2%80%9D-auditions-female-musicians

[x] https://assets2.hrc.org/files/assets/resources/AWorkplaceDivided-2018.pdf?_ga=2.181520935.164194852.1539610864-1464585859.1505932243

[xi] https://chicagounbound.uchicago.edu/cgi/viewcontent.cgi?article=1052&context=uclf

[xii] https://implicit.harvard.edu/implicit/takeatest.html

[xiii] https://hbr.org/2015/05/outsmart-your-own-biases

[xiv] https://pdfs.semanticscholar.org/7ec4/3fd940dcc9fc7fef2a3d6c4eeaba2fff0455.pdf

[xv] https://hbr.org/2015/05/outsmart-your-own-biases

[xvi] https://www.forbes.com/sites/janicegassam/2020/12/29/your-unconscious-bias-trainings-keep-failing-because-youre-not-addressing-systemic-bias/