Implicit bias in healthcare: clinical practice, research and decision making

Authors: Dipesh P Gopal,A Ula Chetty,B Patrick O'Donnell,C Camille GajriaD and Jodie Blackadder-WeinsteinE

Author affiliations: A general practitioner and in-practice fellow, Barts and The London School of Medicine and Dentistry, London, UK; B general practitioner and university tutor, University of Glasgow, Glasgow, UK; C general practitioner and clinical fellow in social inclusion, University of Limerick, Limerick, Ireland; D general practitioner and clinical teaching fellow, Imperial College London, London, UK; E Royal Air Force general practitioner and researcher, Royal Centre of Defence Medicine, Edgbaston, UK

ABSTRACT

Bias is the evaluation of something or someone that can be positive or negative, and implicit or unconscious bias is when the person is unaware of their evaluation. This is particularly relevant to policymaking during the coronavirus pandemic and to the racial inequality highlighted by support for the Black Lives Matter movement. A literature review was performed to define bias and to identify the impact of bias on clinical practice and research, as well as on clinical decision making (cognitive bias). Bias training could bridge the gap from a lack of awareness of bias to the ability to recognise bias in others and within ourselves. However, there are currently no debiasing strategies of proven effectiveness. Awareness of implicit bias must not deflect from wider socio-economic, political and structural barriers, nor ignore explicit bias such as prejudice.

KEYWORDS: implicit bias, unconscious bias, diagnostic error, cognitive bias

DOI: 10.7861/fhj.2020-0233

Introduction

Bias is the evaluation of something or someone that can be positive or negative, and implicit or unconscious bias is when the person is unaware of their evaluation.1,2 It is negative implicit bias that is of particular concern within healthcare. Explicit bias, on the other hand, implies that there is awareness that an evaluation is taking place. Bias can have a major impact on the way that clinicians conduct consultations and make decisions for patients, but, outside clinical reasoning, it is little covered in the medical field. By contrast, it is commonly highlighted in the world of business.3,4 The lack of awareness of implicit bias may perpetuate systemic inequalities, resulting, for example, in lower pay for clinicians from ethnic minorities and a lack of female surgeons in senior positions.5,6

Cognitive bias may explain political decisions during the coronavirus pandemic: framing ventilators as 'lifesaving', and the subsequent investment in them over public health non-pharmaceutical measures, reflects framing bias.7 Clinicians during the pandemic may have been tempted to prescribe medication, despite a lack of clear evidence, for fear of inaction: action bias.8 Action bias may also have been exhibited by stressed members of the public when panic buying groceries despite reassurance of stable supply.9 Cognitive bias may affect the way clinicians make decisions about healthcare given the novelty of the disease and the evolving evidence base.
Politicians may prioritise resources towards goals that provide short-term benefit over long-term benefit; this might include increasing critical care capacity over public health investment: present bias.7 Given the amount of poorly reported, non-peer-reviewed preprint research during the pandemic, many clinicians may implement easily available research amplified by the media rather than taking a critical look at the data: availability bias.8,10 This may be compounded by physical and emotional stress. Media reporting of the coronavirus in the USA as the 'Chinese virus' was linked with increasing bias against Asian Americans.11

This article aims to identify the potential impact of bias on clinical practice and research, as well as on clinical decision making (cognitive bias), and how biases may be mitigated.

Methods

A non-systematic literature review approach was used given the heterogeneous, mixed-methods literature on bias in healthcare; such a topic would be unamenable to systematic review methodology. Inclusion criteria were English-language articles identified by searching PubMed and the Cochrane database from January 1957 to December 2020 using the following search terms: 'implicit bias', 'unconscious bias', 'cognitive bias', and 'diagnostic error and bias'. The highest level of evidence was prioritised for inclusion (such as recent systematic reviews, meta-analyses and literature reviews). Opinion articles were included to set context in the introduction and, in the discussion section, to identify possible future directions. Articles mentioning bias modification in clinical psychiatry were excluded, as these focused on specific examples of clinical care rather than contributing to a broad overview of the potential impact of bias in medicine.

How does bias work and where does it come from?

Decision making can be understood to involve type 1 and type 2 processes (see Fig 1).12,13 Type 1 processes are fast, unconscious, 'intuitive' and require limited cognitive resources.13,14 They are often known as mental shortcuts or heuristics, which allow rapid decision making. In contrast, type 2 processes are slower, conscious, 'analytic' and require more cognitive resources.13 The above is known as dual process theory (DPT). It is type 1 processing that makes up the majority of decision making and is vulnerable to error. If this occurs in consecutive decisions, it can lead to systematic errors, such as when a car crash occurs after errors in some of the hundreds of tiny decisions that are made when driving a car.13

Fig 1. Decision-making processes. a) The interaction between type 1 and type 2 processes allows diagnoses to be made from patient presentations: recognised presentations pass from the pattern processor to type 1 processes and unrecognised presentations to type 2 processes; repetition and pattern recognition, executive override and dysrationalia override allow switching between the two routes, which feed calibration and diagnosis. T = 'toggle function'; the ability to switch between type 1 and type 2 processes. b) The type 1 processes that control calibration of decision making to make a diagnosis: emotional, over-learned, implicitly learned and hard-wired processes. Adapted with permission from Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013;22(Suppl 2):ii58–64.

Despite the critique of implicit bias, such automatic decisions are necessary for human function, and such pattern recognition may have developed in early humans to identify threats (such as predators) to secure survival.3 It is thought that our biases are formed in early life from reinforcement of social stereotypes, from our own learned experience and from the experience of those around us.15

The Implicit Association Test (IAT) is the commonest measure of bias within the research literature. It was developed from review work which identified that much of social behaviour is unconscious or implicit and may contribute to unintended discrimination.16,17 The test involves users sorting words into groups as quickly and accurately as possible and comes in different categories, from disability to age and even presidential popularity.
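As an illustration of how such sorting tasks are typically scored, the sketch below computes a simplified version of the widely used D-score: the difference in mean response latency between the 'incongruent' and 'congruent' sorting blocks, divided by the standard deviation of all latencies. The function name and the reaction times are illustrative assumptions rather than data from any IAT study, and the published scoring algorithm includes further steps (error penalties, practice blocks and latency trimming) that are omitted here.

```python
# Minimal sketch of an IAT-style D-score (a simplified form of the improved
# scoring algorithm): mean latency in the "incongruent" block minus mean latency
# in the "congruent" block, divided by the standard deviation of all latencies.
# The reaction times below are invented for illustration, not real data.
from statistics import mean, stdev

def iat_d_score(congruent_ms: list[float], incongruent_ms: list[float]) -> float:
    """Positive values indicate slower responses in the incongruent block,
    conventionally read as an implicit association with the congruent pairing."""
    pooled_sd = stdev(congruent_ms + incongruent_ms)
    return (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd

# Hypothetical latencies (milliseconds) for one participant
congruent = [612, 655, 701, 640, 598, 689, 650, 623]
incongruent = [748, 802, 765, 831, 790, 744, 810, 778]

print(f"D-score: {iat_d_score(congruent, incongruent):.2f}")
```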
For the gender-career IAT, one task might include sorting gendered names (eg Ben or Julia) into family or career categories. This has been well summarised in meta-analyses comparing the ability of the IAT to predict social behaviour.18,19 Furthermore, Oswald and colleagues found that the IAT was not a predictor of markers of discrimination when looking at race and ethnicity.19 While the IAT is used widely in the research literature, opponents of the IAT highlight that it is unclear what the test actually measures, and comment that the test cannot differentiate between association and automatically activated responses.20 Furthermore, it is difficult to identify what such associations represent, bringing further confusion to the question of how to measure the activity of the unconscious mind. Given these conflicting views, while IAT testing is commonly used, it cannot be universally recommended.21 There are ethical concerns that the IAT could be used as a 'predictive' tool for crimes that have not yet occurred, or a 'diagnostic' tool for prejudice such as racism.22 The IAT should be used as a tool for self-reflection and learning, rather than as a punitive measure of one's biases or stereotypes.23 The test also highlights individual deficiencies rather than looking at system faults.

A systematic review focusing on the medical profession showed that most studies found healthcare professionals to have negative bias towards non-White people, as graded by the IAT, which was significantly associated with treatment adherence and decisions, and with poorer patient outcomes (n=4,179; 15 studies).24 A further systematic review showed that healthcare professionals have negative bias in multiple categories, from race to disability, as graded by the IAT (n=17,185; 42 studies), but it did not link this to outcomes.25 The reviews bring into question healthcare providers' impartiality, which may conflict with their ethical and moral obligations.25,26
Bias in clinical medicine

Using the IAT, US medical students (n=4,732) and doctors (n=2,284) were demonstrated to have weight bias (ie prejudice against those who are overweight or obese), which may stem from a lack of undergraduate education in the causes of obesity and how to consult sensitively.27–29 Many healthcare professionals believe that obesity is due to a lack of willpower and personal responsibility, but it may be due to other factors such as poverty and worsening generational insomnia.30–32 Similarly, the obesity IAT evaluated across 71 countries (n=338,121) between 2006 and 2010 identified that overweight individuals had lower bias against overweight people, while countries with high levels of obesity had greater bias against people with obesity.33

There is evidence to corroborate anecdotal reports of female doctors being mistaken for nurses while at work, and of male members of staff and male students being mistaken for doctors despite the presence of a clear female leader.34,35 Boge and colleagues found that patients (n=150) were significantly less likely (by 17.1%) to recognise female consultants as leaders compared with their male counterparts, and significantly more likely (by 14%) to recognise female nurses as nurses compared with male nurses.34 In addition, female residents (registrars) receive significantly more negative evaluations from nursing staff than their male colleagues, despite similar objective clinical evaluations between male and female colleagues.36,37

One alarming disparity that deserves mention is gender-specific differences in myocardial infarction presentation and survival. While members of both genders present with chest pain, women often present with what are known as 'atypical' symptoms such as nausea, vomiting and palpitations.38,39 The mention of 'atypical' in the literature is misleading given that women make up half of an average population. Large cohort studies (n=23,809; n=82,196) have found in-hospital mortality 15–20% higher (adjusted odds ratios) for female patients compared with male patients, which contrasts with smaller cohorts (n=4,918; n=17,021), which have found no differences.40–43 Interviews with patients under the age of 55 (n=2,985) who had suffered myocardial infarctions revealed that women were 7.4 percentage points (absolute risk) more likely to seek medical attention, and 16.7 percentage points less likely to be told their symptoms were cardiac in origin.44 These data indicate a need for education of the public and healthcare professionals alike about the symptoms of a myocardial infarction in women.

In 2019, the MBRRACE-UK report revealed that maternal mortality was five times higher in Black women compared with White women, and this finding has been replicated in US data with a similar order of magnitude of three to four times.45,46 While official reports have not offered clear explanations as to the causes of such differences, it has been suggested that a combination of stigma, systemic racism and socio-economic inequality are relevant causative factors rather than biological factors alone.47,48 Lokugamage calls for healthcare professionals to challenge their own biases and assumptions when providing care using a 'cultural safety' model.49,50 Such a model could help identify power imbalances in the healthcare provider–patient relationship and the resultant inequalities.
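The disparities above are reported using a mixture of measures: adjusted odds ratios, absolute risk differences and mortality ratios. As a rough reminder of how these differ, the formulas below use a hypothetical 2×2 table (a and b are deaths and survivors among women, c and d among men); the numbers are invented for illustration and are not taken from the cited cohorts.

```latex
% Risk measures for a hypothetical 2x2 table (illustrative numbers only)
\[
\text{OR} = \frac{a/b}{c/d}, \qquad
\text{RR} = \frac{a/(a+b)}{c/(c+d)}, \qquad
\text{ARD} = \frac{a}{a+b} - \frac{c}{c+d}
\]
% Example: in-hospital death in 60 of 1{,}000 women (a=60, b=940) versus
% 50 of 1{,}000 men (c=50, d=950) gives
\[
\text{OR} = \frac{60/940}{50/950} \approx 1.21, \quad
\text{RR} = \frac{0.060}{0.050} = 1.20, \quad
\text{ARD} = 0.060 - 0.050 = 0.010 \ (1\ \text{percentage point}).
\]
```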
Cultural competence training has been evaluated in a Cochrane systematic review; a number of the included randomised controlled trials (RCTs) showed that training courses (of varying lengths) provided some improvement in cultural competency and perceived care quality at 6–12 months' follow-up (five studies; 337 professionals; 8,400 patients).51 However, there was limited effect on objective clinical markers, such as blood pressure control in patients from ethnic minorities.

Bias in research, evidence synthesis and policy

While scientific and medical research is thought to be free from outside influence, 'science is always shaped by the time and the place in which it is carried out'.52 The research questions that are developed and answered depend on the culture and institutions in our societies, including public–private industry partnerships. During research conduct, minimisation of bias (specifically selection and measurement bias) is an important factor when attempting to produce generalisable and robust data. Canadian life science researchers note a consistent trend of small research institutions having a 42% lower chance of a research grant application being successful compared with large research institutions.53,54 Similarly, gender bias within the wider realm of research may discriminate against women in the selection of grant funding, as well as in the hierarchical structure of promotion in academic institutions.55,56 At academic conferences and grand rounds, men were 21–46% more likely to be introduced by their professional titles when introduced by women than women were when introduced by men.57–59 Women were 8–25% more likely to introduce a fellow woman by her title compared with men introducing men. However, these differences were not always observed.60

Taking an international perspective, when an IAT was used to assess healthcare professionals' and researchers' (n=321) views on the quality of research emanating from 'rich' and 'poor' countries (assessed by gross domestic product), the majority associated 'good' (eg trustworthy and valuable) research with 'rich' countries.61 This alone does not mean much, but in a randomised, blinded crossover experiment (n=347), swapping the stated source of a research abstract from a low-income to a high-income country improved English healthcare professionals' assessment of the research.62 A systematic review (three randomised controlled trials; n=2,568) found geographic bias favouring research from high-income countries or more prestigious journals over that from low-income countries or less prestigious journals.63 This highlights how publication bias in favour of research from high-income countries could neglect a wealth of valid data from low-income countries that is not published, or only published in lower-impact journals. These data highlight a need for more objective assessment of research, including multiple layers of blinding, with journal review boards and peer reviewers from low-income countries.63 Blinding may also be beneficial when recruiting to jobs, given that application photographs may influence the selection process at resident (registrar) level.64 It may, however, be difficult to anonymise citations or publication data during academic selection processes.
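The source-swapping experiment described above depends on counterbalancing, so that every abstract is rated under both source labels while each rater sees it only once. The sketch below is a minimal illustration of that idea, not the published protocol; the function, identifiers and the two-group split are assumptions made for the example.

```python
# Illustrative counterbalancing for a source-swapping crossover design: each
# rater sees every abstract once, half labelled with a high-income source and
# half with a low-income source, and the labelling is flipped between the two
# randomised rater groups so every abstract is rated under both labels.
import random

def assign_labels(abstract_ids: list[str], rater_ids: list[str], seed: int = 0):
    rng = random.Random(seed)
    abstracts = abstract_ids[:]
    rng.shuffle(abstracts)
    half = len(abstracts) // 2
    set_a, set_b = abstracts[:half], abstracts[half:]
    assignments = {}
    for i, rater in enumerate(rater_ids):
        if i % 2 == 0:   # group 1: set A shown as high-income, set B as low-income
            assignments[rater] = {**{a: "high-income" for a in set_a},
                                  **{b: "low-income" for b in set_b}}
        else:            # group 2: labels flipped
            assignments[rater] = {**{a: "low-income" for a in set_a},
                                  **{b: "high-income" for b in set_b}}
    return assignments

print(assign_labels(["abs1", "abs2", "abs3", "abs4"], ["rater1", "rater2"]))
```

A similar counterbalanced, blinded scheme could in principle be applied to the assessment of job applications or grant proposals mentioned above.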
Bias comes into play during evidence generation and the application of evidence-based policy (EBP), where science-based, single-faceted solutions can seldom be applied to multi-faceted or 'wicked' problems.65 These problems are poorly defined, complex, dynamic issues where solutions may have unpredictable consequences (such as climate change or obesity).66 Parkhurst identifies two forms of evidentiary bias in policymaking that can occur in the creation, selection and interpretation of evidence: technical bias and issue bias.67 Technical bias is where use of the evidence does not follow scientific best practice, such as 'cherry-picking' rather than systematically reviewing the evidence to support a certain position. In contrast, issue bias occurs when the use of the evidence shifts political debate in a certain direction, such as presenting a policy with evidence reflecting only one side of the debate.

Cognitive biases and diagnostic errors

Errors are inevitable in all forms of healthcare.68 The prevalence of diagnostic errors varies between different healthcare settings and may be partly due to cognitive factors as well as system-related factors.69,70 Systematic reviews (76 studies; 19,123 autopsies) of studies in which autopsy detected clinically important or 'major' errors involving the principal underlying disease or primary cause of death found error rates of 23.5–28% in adult and child inpatient settings.71,72 A systematic review conducted in primary care identified a median error rate of 2.5 per 100 consultations or records reviewed (107 studies (nine systematic reviews and 98 primary studies); 128.8 million consultations/records).73 Existing research on human factors, such as the use of checklists to decrease hospital-associated infections and perioperative mortality, supports emerging research that links bias to diagnostic errors.74–76 A systematic review assessing associations between cognitive biases and medical decisions found cognitive biases were associated with diagnostic inaccuracies in 36.5–77% of case scenarios (seven studies; n=726), drawn mostly from clinician survey-based data.77 An association was found between cognitive bias and management errors in five studies (n=2,301). There were insufficient data to link physician biases and patient outcomes. The review was limited by a lack of definitions of the different types of cognitive bias in 40% of all studies (n=20) and a lack of systematic assessment of cognitive bias. Cognitive biases are one of several interwoven individual-related factors linked to errors, including inadequate communication, an inadequate knowledge–experience skill set and not seeking help.78 There are many different types of cognitive bias, which can be illustrated in the healthcare diagnostic context (see Table 1).
Evidence-based bias training

Making diagnoses is thought to depend on the previously mentioned type 1 and type 2 processes which make up DPT.87 Despite this, there is a growing body of evidence suggesting that type 2 processing ('thinking slow') is not necessarily better than type 1 processing ('thinking fast') in clinicians.12,88–90 Furthermore, it has been suggested that proposed solutions to identify and minimise biases, such as reflection, cognitive forcing (strategies that force reconsideration of diagnoses) and debiasing checklists, have limited effect on bias and error reduction.91–94 Small-scale survey-based data (n=37) suggested the presence of hindsight bias, where clinicians disagree on the exact cognitive biases at play depending on the outcome of a diagnostic error (see Table 1).95 A systematic review (28 studies; n=2,665) of cognitive interventions targeting DPT in medical students and qualified doctors found that several interventions had mixed or no significant results in decreasing diagnostic error rates.70 The vast majority of studies included small samples (n<200) and effects often did not extend beyond 4 weeks. Interventions included integration into educational curricula, checklists when making diagnoses, cognitive forcing, reflection and direct instruction. These interventions often come under the umbrella term of 'meta-cognition'. A more recent systematic review and meta-analysis determined that diagnostic reflection improved diagnostic accuracy by 38% in medical students and doctors (n=1,336; 13 studies) with short-term follow-up.96 This implies that such debiasing can only occur after an initial diagnosis, and potentially a diagnostic error, has already been made. The limited evidence base for decreasing bias may be due to methodological differences or to intrinsic differences in study subjects in the clinical studies and reviews. Some clinicians may find a practical checklist useful for minimising their own biases when making decisions (Box 1).97–99 Decreasing bias through a single-faceted intervention may be very difficult, as bias is a 'wicked', multi-faceted problem.65

Unconventional methods of teaching about bias include teaching medical students in a non-clinical setting (such as a museum), a weekly series of case conferences examining health equity and implicit bias, and transformative learning theory.100–102 Transformative learning theory resembles what many consider to be key components of Balint groups and combines multiple single interventions (such as experience, reflection, discussion and simulation).102,103 Hagiwara and colleagues outlined three translational gaps from social psychology to medical training which may hinder the effectiveness of bias training in improving health outcomes.104 The first is a lack of evaluation of a person's motivation to make change alongside bias awareness. The second is that bias training does not come with clear strategies to mitigate bias and may result in avoidance or overfriendliness, which may come across as contrived in specific situations (such as clinics with marginalised groups). The third is a lack of verbal and non-verbal communication training alongside bias training, given that communication is the mediator between bias and patient outcomes.
Verbal communication training may cover micro-aggressions.105

Discussion

There are limited data to suggest reflective practice as a clear evidence-based strategy to decrease our biases at the clinician–patient level, but options such as cultural safety checklists and the previously outlined strategies (Box 1) could provide support to coalface clinicians.97–99 Better appreciation of biases in clinical reasoning could help clinicians reduce clinical errors, improve patient safety and provide better care for marginalised communities, who have the worst healthcare outcomes.106,107 It is hoped that training would help bridge the gap from unawareness of bias to the ability to recognise bias in others and in ourselves, in order to mitigate personal biases and identify how discrimination may occur.108 Awareness of implicit bias allows individuals to examine their own reasoning in the workplace and wider environment. It asks for personal accountability and a single question: 'If this person were different in terms of race, age, gender, etc, would we treat them the same?' However, there is a conflict between those suggesting bias training, which may increase awareness of bias, and the limited evidence identifying any effective debiasing strategy to follow the identification of biases.109 Advocates of bias training suggest that it should not be taught as an isolated topic but integrated into clinical specialty training.110 Others argue that bias training would be more effective with measures of personal motivation and communication training, along with evidence-based strategies to decrease implicit bias.101 Similarly, IAT testing should be administered with a caveat about its limitations.

Table 1. Selected cognitive biases in a healthcare context with definitions, illustrated with an example of a patient presenting with chest pain.79–86

Affective or visceral bias
Definition: Countertransference, or a professional's feelings towards the patient, results in misdiagnosis.
Example: The patient presenting with chest pain reminds you of a relative that you know well, so you do not perform a full history or examination.

Anchoring bias
Definition: Focusing on initial information in a patient's presentation results in an early diagnosis being made despite pertinent information becoming available later during information gathering.
Example: You perceive the patient presenting with central chest pain to have gastro-oesophageal reflux and do not change your provisional diagnosis despite history-taking revealing chest pain radiating to the back.

Premature closure
Definition: Making a diagnosis before a full assessment is performed.
Example: You make a diagnosis of pneumonia for a patient presenting with right-sided chest pain and breathlessness with marked hypoxia but do not consider a pulmonary embolus as an additional contributory cause.

Availability bias
Definition: Recent encounters with a specific disease keep that disease in mind (more available) and increase the chance of making that diagnosis. Alternatively, less frequent encounters with a disease (less available) decrease the chance of making that diagnosis.
Example: You perceive patients with pleuritic chest pain to have a pulmonary embolism despite low overall risk and send them for computed tomography pulmonary angiography as a result of a recently missed pulmonary embolism.
Confirmation bias
Definition: Seeking and accepting only information that confirms a diagnosis, rather than information that refutes it.
Example: You perceive the patient with left-sided chest pain and a raised troponin to have a myocardial infarction but do not consider other causes of a raised troponin.

Commission (action) bias
Definition: The belief that action rather than inaction prevents patient harm, driven by beneficence; ie believing that more is better.
Example: You prescribed two antibiotics, against local guidance, to the patient who presented with right-sided chest pain diagnosed with pneumonia, 'just in case'. You perceive the patient's recovery as a result of your action rather than of a less virulent disease.

Omission (inaction) bias
Definition: The belief that inaction rather than action prevents patient harm, driven by non-maleficence; ie believing that less is better. Omission bias is thought to be more prevalent than commission bias.
Example: You prescribed no antibiotics for the patient who presented with pleuritic chest pain diagnosed with a lower respiratory tract infection. The patient does not recover, which you attribute to virulent disease progression rather than to inaction.

Diagnostic momentum
Definition: A diagnosis that was once only a possibility, suggested by different stakeholders related to the patient (including professionals), is reinforced until it becomes a certainty despite evidence to the contrary. This may involve continuing with a previous clinician's management plan despite new information suggesting that this is unnecessary.
Example: You and your fellow team members agree with your consultant (attending physician), who makes a provisional diagnosis of pneumothorax for a patient presenting with pleuritic chest pain, even though this is contradicted by the patient's fevers and cough.

Gambler's fallacy
Definition: Believing that a condition cannot be the diagnosis having made that diagnosis repeatedly on several recent occasions; ie the pre-test probability is affected by previous independent events. The name refers to a gambler's false belief that flipping a coin five times, each resulting in heads, increases the chance of tails on the sixth flip.
Example: Having diagnosed a myocardial infarction in all five preceding patients presenting with chest pain, you believe there is less chance that the next patient will have the same diagnosis.

Overconfidence bias
Definition: Overestimation of one's own ability and knowledge (also known as the Dunning–Kruger effect), placing more emphasis on judgement than on objective markers.
Example: You diagnose a patient presenting with left-sided pleuritic chest pain after blunt trauma as having a soft tissue injury because they have a normal respiratory examination, rather than making a provisional diagnosis of pneumothorax and sending the patient for a chest X-ray.

Sutton's slip or law
Definition: Making the most obvious diagnosis without considering other possibilities; named after the bank robber Willie Sutton.
Example: You diagnose a young patient presenting with breathlessness and chest pain on exertion as having late-onset asthma without considering less likely but possible diagnoses such as stable angina.

Hindsight bias
Definition: Believing a diagnosis is more likely after it becomes known than before it was known. There are three types, known as memory distortion, inevitability and foreseeability.
Example: You are criticised for missing a diagnosis of pulmonary embolism in a middle-aged man who presented with chest pain and collapse; the computed tomography pulmonary angiogram was initially reported as normal and the patient self-discharged home. The report was amended the next day to show a pulmonary embolism, but the patient had unfortunately died.

Box 1. Suggested checklist for making good clinical decisions.97–99
– Consider whether data are truly relevant, rather than just salient.
– Did I consider causes besides the obvious ones?
– How did I reach my diagnosis? Did a patient or colleague suggest the diagnosis?
– Did I ask questions that would disprove, rather than confirm, my current hypothesis?
– Have I been interrupted or distracted while caring for this patient?
– Is this a patient I do not like, or like too much, for any reason?
– Am I stereotyping the patient or presentation?
– Remember that you are wrong more often than you think!

To our knowledge at the time of writing, only the Royal College of Surgeons of England has identified the importance of unconscious bias through an information booklet.111 The booklet is entitled Avoiding unconscious bias, a goal that seems unlikely given that unconscious (type 1) processing is integral to human thinking. There is a need for better-powered research into the effectiveness of strategies that can decrease implicit and cognitive bias, especially in the long term. Furthermore, organisations should consider whether bias training should be integrated into undergraduate and postgraduate curricula, given that there are, as yet, no debiasing strategies of proven effectiveness.

As we move into data-driven societies, the impact of bias becomes ever more important.112 A simple example is a step-counting mobile application that undercounted steps, probably because the application was built to count steps in an 'average person', ignoring differences in gender, body mass index and ethnic origin.113 Within artificial intelligence, testing algorithms in different groups of people can help make them more applicable to diverse populations and, ideally, diversely created algorithms should limit bias and increase applicability.114,115

Since the Black Lives Matter movement, many institutions may consider implementing bias training to mitigate racism. However, awareness of implicit bias or tokenistic bias training must not deflect from the wider socio-economic, political and structural barriers that individuals face.116,117 Similarly, implicit bias should not be used to absolve responsibility, nor to ignore explicit bias that may perpetuate prejudice and stereotypes.117 Action to correct the lack of representation of non-White skin in the research literature and medical textbooks is welcome.118–120 Furthermore, there has been much work to challenge the role of biological race in clinical algorithms and guidance (such as estimated glomerular filtration rate and blood pressure).121,122 Most pertinent to the pandemic, Sjoding and colleagues compared almost 11,000 pairs of oxygen saturation measurements taken by pulse oximetry and arterial blood gas in Black and White patients.123 Black patients were 8–11 percentage points (roughly three times) more likely than White patients to have occult hypoxaemia, that is, a low arterial oxygen saturation despite a reassuring pulse oximetry reading.123 This has implications for the coronavirus pandemic and for respiratory conditions more broadly, and is a call to tackle racial bias in medical devices.
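Whether the device is a pulse oximeter or a step counter, the practical safeguard suggested above is the same: evaluate performance separately in each group rather than only in aggregate. The sketch below illustrates such a disaggregated audit for paired oximetry readings, using thresholds along the lines of those reported by Sjoding and colleagues (arterial saturation below 88% despite a pulse oximetry reading of 92–96%); the field names and records are invented for illustration and this is not the authors' analysis code.

```python
# Minimal sketch of a disaggregated device audit: given paired pulse oximetry
# (SpO2) and arterial blood gas (SaO2) readings with a group label, report the
# rate of occult hypoxaemia (SaO2 < 88% despite SpO2 of 92-96%) per group.
from collections import defaultdict

def occult_hypoxaemia_rates(readings: list[dict]) -> dict[str, float]:
    eligible = defaultdict(int)   # readings with SpO2 in the 92-96% band, per group
    occult = defaultdict(int)     # of those, readings with SaO2 < 88%, per group
    for r in readings:
        if 92 <= r["spo2"] <= 96:
            eligible[r["group"]] += 1
            if r["sao2"] < 88:
                occult[r["group"]] += 1
    return {g: occult[g] / n for g, n in eligible.items() if n}

# Toy records for illustration only
sample = [
    {"group": "A", "spo2": 94, "sao2": 86},
    {"group": "A", "spo2": 95, "sao2": 91},
    {"group": "B", "spo2": 93, "sao2": 90},
    {"group": "B", "spo2": 96, "sao2": 92},
]
print(occult_hypoxaemia_rates(sample))  # eg {'A': 0.5, 'B': 0.0}
```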
With regard to the structure of our healthcare systems, an understanding of personal bias can help identify the judgements made during recruitment processes and help build leadership and a workforce that are representative of the population the healthcare system serves.124 This is likely to help deliver better patient outcomes. Other strategies to decrease the impact of bias include using objective criteria in recruitment, blind evaluations and salary disclosures.125 Additional measures include providing a system for reporting discrimination, measuring outcomes such as employee pay and hiring, and routinely measuring employee perceptions of inclusion and fairness. Such measures are fundamental to help mitigate inequality and associated adversity.

Acknowledgements

We thank Prof Damien Ridge for his suggestions on this manuscript.

Conflicts of interest

Dipesh Gopal is an in-practice fellow supported by the Department of Health and Social Care and the National Institute for Health Research.

Disclaimer

The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health. No honoraria were paid to promote the media cited, including books, podcasts and websites.

References

1 Blair IV, Steiner JF, Havranek EP. Unconscious (implicit) bias and health disparities: where do we go from here? Perm J 2011;15:71–8.
2 Santry HP, Wren SM. The role of unconscious bias in surgical safety and outcomes. Surg Clin North Am 2012;92:137–51.
3 Re:Work. Watch unconscious bias @ work. Google, 2014. https://rework.withgoogle.com/guides/unbiasing-raise-awareness/steps/introduction [Accessed 26 February 2021].
4 Lovallo D, Sibony O. The case for behavioral strategy. McKinsey & Company, 2010. www.mckinsey.com/business-functions/strategy-and-corporate-finance/our-insights/the-case-for-behavioral-strategy [Accessed 26 February 2021].
5 Appleby J. Ethnic pay gap among NHS doctors. BMJ 2018;362:k3586.
6 Moberly T. A fifth of surgeons in England are female. BMJ 2018;363:k4530.
7 Halpern SD, Truog RD, Miller FG. Cognitive bias and public health policy during the COVID-19 pandemic. JAMA 2020;324:337–8.
8 Ramnath VR, McSharry DG, Malhotra A. Do no harm: reaffirming the value of evidence and equipoise while minimizing cognitive bias in the COVID-19 era. Chest 2020;158:873–6.
9 Landucci F, Lamperti M. A pandemic of cognitive bias. Intensive Care Med 2020 [Epub ahead of print].
10 Glasziou PP, Sanders S, Hoffman T. Waste in covid-19 research. BMJ 2020;369:m1847.
11 Darling-Hammond S, Michaels EK, Allen AM et al. After 'the China virus' went viral: racially charged coronavirus coverage and trends in bias against Asian Americans. Health Educ Behav 2020;47:870–9.
12 Kahneman D. Thinking, fast and slow. London: Penguin, 2012.
13 Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013;22(Suppl 2):ii58–64.
14 Croskerry P. From mindless to mindful practice – cognitive bias and clinical decision making. N Engl J Med 2013;368:2445–8.
15 Chapman EN, Kaatz A, Carnes M. Physicians and implicit bias: how doctors may unwittingly perpetuate health care disparities. J Gen Intern Med 2013;28:1504–10.
16 Project Implicit. Project Implicit, 2011. https://implicit.harvard.edu/implicit/selectatest.html [Accessed 26 February 2021].
17 Greenwald AG, Banaji MR. Implicit social cognition: attitudes, self-esteem, and stereotypes.
Psychol Rev 1995;102:4–27.
18 Greenwald AG, Poehlman TA, Uhlmann EL, Banaji MR. Understanding and using the Implicit Association Test: III: meta-analysis of predictive validity. J Pers Soc Psychol 2009;97:17–41.
19 Oswald FL, Mitchell G, Blanton H, Jaccard J, Tetlock PE. Predicting ethnic and racial discrimination: a meta-analysis of IAT criterion studies. J Pers Soc Psychol 2013;105:171–92.
20 Fazio RH, Olson MA. Implicit measures in social cognition research: their meaning and use. Annu Rev Psychol 2003;54:297–327.
21 Goldhill O. The world is relying on a flawed psychological test to fight racism. Quartz 2017. https://qz.com/1144504/the-world-is-relying-on-a-flawed-psychological-test-to-fight-racism [Accessed 26 February 2021].
22 Jost JT. The IAT is dead, long live the IAT: Context-sensitive measures of implicit attitudes are indispensable to social and political psychology. Current Directions in Psychological Science 2019;28:10–9.
23 Sukhera J, Wodzinski M, Milne A et al. Implicit bias and the feedback paradox: exploring how health professionals engage with feedback while questioning its credibility. Acad Med 2019;94:1204–10.
24 Hall WJ, Chapman MV, Lee KM et al. Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: a systematic review. Am J Public Health 2015;105:e60–76.
25 FitzGerald C, Hurst S. Implicit bias in healthcare professionals: a systematic review. BMC Med Ethics 2017;18:19.
26 Parsa-Parsi RW. The revised declaration of Geneva: a modern-day physician's pledge. JAMA 2017;318:1971–2.
27 Phelan SM, Dovidio JF, Puhl RM et al. Implicit and explicit weight bias in a national sample of 4,732 medical students: the medical student CHANGES study. Obesity (Silver Spring) 2014;22:1201–8.
28 Sabin JA, Marini M, Nosek BA. Implicit and explicit anti-fat bias among a large sample of medical doctors by BMI, race/ethnicity and gender. PLoS One 2012;7:e48448.
29 Rubin R. Addressing medicine's bias against patients who are overweight. JAMA 2019;321:925–7.
30 Puhl RM, Latner JD, O'Brien K et al. A multinational examination of weight bias: predictors of anti-fat attitudes across four countries. Int J Obes (Lond) 2015;39:1166–73.
31 Kim TJ, von dem Knesebeck O. Income and obesity: what is the direction of the relationship? A systematic review and meta-analysis. BMJ Open 2018;8:e019862.
32 Gohil A, Hannon TS. Poor sleep and obesity: concurrent epidemics in adolescent youth. Front Endocrinol 2018;9:364.
33 Marini M, Sriram N, Schnabel K et al. Overweight people have low levels of implicit weight bias, but overweight nations have high levels of implicit weight bias. PLoS One 2013;8:e83543.
34 Boge LA, Dos Santos C, Moreno-Walton LA, Cubeddu LX, Farcy DA. The relationship between physician/nurse gender and patients' correct identification of health care professional roles in the emergency department. J Womens Health 2019;28:961–4.
35 Cooke M. Implicit bias in academic medicine: #WhatADoctorLooksLike. JAMA Intern Med 2017;177:657–8.
36 Galvin SL, Parlier AB, Martino E, Scott KR, Buys E. Gender bias in nurse evaluations of residents in obstetrics and gynecology. Obstet Gynecol 2015;126(Suppl 4):7S–12S.
37 Brucker K, Whitaker N, Morgan ZS et al. Exploring gender bias in nursing evaluations of emergency medicine residents. Acad Emerg Med 2019;26:1266–72.
38 Kawamoto KR, Davis MB, Duvernoy CS. Acute coronary syndromes: differences in men and women. Curr Atheroscler Rep 2016;18:73.
39 Mehta LS, Beckie TM, DeVon HA et al.
Acute myocardial infarction in women: a scientific statement from the American Heart Association. Circulation 2016;133:916–47.
40 Hannan EL, Wu Y, Tamis-Holland J et al. Sex differences in the treatment and outcomes of patients hospitalized with ST-elevation myocardial infarction. Catheter Cardiovasc Interv 2020;95:196–204.
41 Hao Y, Liu J, Liu J et al. Sex differences in in-hospital management and outcomes of patients with acute coronary syndrome. Circulation 2019;139:1776–85.
42 Wei J, Mehta PK, Grey E et al. Sex-based differences in quality of care and outcomes in a health system using a standardized STEMI protocol. Am Heart J 2017;191:30–6.
43 Her AY, Shin ES, Kim YH, Garg S, Jeong MH. The contribution of gender and age on early and late mortality following ST-segment elevation myocardial infarction: results from the Korean Acute Myocardial Infarction National Registry with Registries. J Geriatr Cardiol 2018;15:205–14.
44 Lichtman JH, Leifheit EC, Safdar B et al. Sex differences in the presentation and perception of symptoms among young patients with myocardial infarction: evidence from the VIRGO Study (Variation in Recovery: Role of Gender on Outcomes of Young AMI Patients). Circulation 2018;137:781–90.
45 Mothers and Babies: Reducing Risk through Audits and Confidential Enquiries across the UK. Saving lives, improving mothers' care: Lessons learned to inform maternity care from the UK and Ireland Confidential Enquiries into Maternal Deaths and Morbidity 2014–16. Oxford: National Perinatal Epidemiology Unit, University of Oxford, 2018. www.npeu.ox.ac.uk/mbrrace-uk/reports/confidential-enquiry-into-maternal-deaths [Accessed 26 February 2021].
46 Review to Action. Report from nine maternal mortality review committees: Building US capacity to review and prevent maternal deaths. Review to Action, 2018. www.cdcfoundation.org/building-us-capacity-review-and-prevent-maternal-deaths [Accessed 26 February 2021].
47 Martin N, Montagne R. Black mothers keep dying after giving birth: Shalon Irving's story explains why. NPR, 2017. www.npr.org/2017/12/07/568948782/black-mothers-keep-dying-after-giving-birth-shalon-irvings-story-explains-why?t=1565984653465 [Accessed 26 February 2021].
48 Hamilton D. Post-racial rhetoric, racial health disparities, and health disparity consequences of stigma, stress, and racism. Washington Center for Equitable Growth, 2017. https://equitablegrowth.org/working-papers/racial-health-disparities [Accessed 26 February 2021].
49 Lokugamage A. Maternal mortality – undoing systemic biases and privileges. BMJ Opinion 2019. https://blogs.BMJ.com/BMJ/2019/04/08/amali-lokugamage-maternal-mortality-undoing-systemic-biases-and-privileges [Accessed 26 February 2021].
50 Richardson S, Williams T. Why is cultural safety essential in health care? Med Law 2007;26:699–707.
51 Horvat L, Horey D, Romios P, Kis-Rigo J. Cultural competence education for health professionals. Cochrane Database Syst Rev 2014;(5):CD009405.
52 Saini A. Superior: The return of race science. London: Fourth Estate, 2019.
53 Smith J, Noble H.
Bias in research. Evid Based Nurs 2014;17:100–1.
54 Murray DL, Morris D, Lavoie C et al. Bias in research grant evaluation has dire consequences for small universities. PLoS One 2016;11:e0155876.
55 Witteman HO, Hendricks M, Straus S, Tannenbaum C. Are gender gaps due to evaluations of the applicant or the science? A natural experiment at a national funding agency. Lancet 2019;393:531–40.
56 Conrad P, Carr P, Knight S et al. Hierarchy as a barrier to advancement for women in academic medicine. J Womens Health 2010;19:799–805.
57 Files JA, Mayer AP, Ko MG et al. Speaker introductions at internal medicine grand rounds: forms of address reveal gender bias. J Womens Health 2017;26:413–9.
58 Duma N, Durani U, Woods CB et al. Evaluating unconscious bias: speaker introductions at an international oncology conference. J Clin Oncol 2019;37:3538–45.
59 Paradiso MM, Rismiller KP, Kaffenberger JA. Unconscious gender bias: A look at speaker introductions at the American Academy of Dermatology. J Am Acad Dermatol 2021 [Epub ahead of print].
60 Davuluri M, Barry E, Loeb S, Watts K. Gender bias in medicine: does it exist at AUA plenary sessions? Urology 2020 [Epub ahead of print].
61 Harris M, Macinko J, Jimenez G, Mullachery P. Measuring the bias against low-income country research: an Implicit Association Test. Global Health 2017;13:80.
62 Harris M, Marti J, Watt H et al. Explicit bias toward high-income-country research: a randomized, blinded, crossover experiment of English clinicians. Health Aff (Millwood) 2017;36:1997–2004.
63 Skopec M, Issa H, Reed J, Harris M. The role of geographic bias in knowledge diffusion: a systematic review and narrative synthesis. Res Integr Peer Rev 2020;5:2.
64 Kassam A-F, Cortez AR, Winer LK et al. Swipe right for surgical residency: Exploring the unconscious bias in resident selection. Surgery 2020;168:724–9.
65 Parkhurst JO. Appeals to evidence for the resolution of wicked problems: the origins and mechanisms of evidentiary bias. Policy Sciences 2016;49:373–93.
66 Australian Public Service Commission. Tackling wicked problems: A public policy perspective. Australian Public Service Commission, 2007. www.apsc.gov.au/tackling-wicked-problems-public-policy-perspective [Accessed 26 February 2021].
67 Parkhurst J. The politics of evidence: from evidence-based policy to the good governance of evidence. Abingdon: Routledge, 2017.
68 Shepherd L, LaDonna KA, Cristancho SM, Chahine S. How medical error shapes physicians' perceptions of learning: an exploratory study. Acad Med 2019;94:1157–63.
69 Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493–9.
70 Lambe KA, O'Reilly G, Kelly BD, Curristan S. Dual-process cognitive interventions to enhance diagnostic reasoning: a systematic review. BMJ Qual Saf 2016;25:808–20.
71 Shojania KG, Burton EC, McDonald KM, Goldman L. Changes in rates of autopsy-detected diagnostic errors over time: a systematic review. JAMA 2003;289:2849–56.
72 Winters B, Custer J, Galvagno SM Jr et al. Diagnostic errors in the intensive care unit: a systematic review of autopsy studies. BMJ Qual Saf 2012;21:894–902.
73 Panesar SS, deSilva D, Carson-Stevens A et al. How safe is primary care? A systematic review. BMJ Qual Saf 2016;25:544–53.
74 Merali HS, Lipsitz SR, Hevelone N et al. Audit-identified avoidable factors in maternal and perinatal deaths in low resource settings: a systematic review. BMC Pregnancy Childbirth 2014;14:280.
75 Abbett SK, Yokoe DS, Lipsitz SR et al.
Proposed checklist of hospital interventions to decrease the incidence of healthcare-associated Clostridium difficile infection. Infect Control Hosp Epidemiol 2009;30:1062–9.
76 Haynes AB, Wieser TG, Berry WR et al. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med 2009;360:491–9.
77 Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak 2016;16:138.
78 Seshia SS, Bryan Young G, Makhinson M et al. Gating the holes in the Swiss cheese (part I): Expanding professor Reason's model for patient safety. J Eval Clin Pract 2018;24:187–97.
79 Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med 2002;9:1184–204.
80 Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003;78:775–80.
81 Surry LT, Torre D, Trowbridge RL, Durning SJ. A mixed-methods exploration of cognitive dispositions to respond and clinical reasoning errors with multiple choice questions. BMC Med Educ 2018;18:277.
82 O'Sullivan ED, Schofield SJ. Cognitive bias in clinical medicine. J R Coll Physicians Edinb 2018;48:225–32.
83 Cohen JM, Burgin S. Cognitive biases in clinical decision making: a primer for the practicing dermatologist. JAMA Dermatol 2016;152:253–4.
84 Roese NJ, Vohs KD. Hindsight bias. Perspect Psychol Sci 2012;7:411–26.
85 Banham-Hall E, Stevens S. Hindsight bias critically impacts on clinicians' assessment of care quality in retrospective case note review. Clin Med 2019;19:16–21.
86 Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments. J Pers Soc Psychol 1999;77:1121–34.
87 Norman GR, Monteiro SD, Sherbino J et al. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Acad Med 2017;92:23–30.
88 Monteiro SD, Sherbino JD, Ilgen JS et al. Disrupting diagnostic reasoning: do interruptions, instructions, and experience affect the diagnostic accuracy and response time of residents and emergency physicians? Acad Med 2015;90:511–7.
89 Ilgen JS, Bowen JL, McIntyre LA et al. Comparing diagnostic performance and the utility of clinical vignette-based assessment under testing conditions designed to encourage either automatic or analytic thought. Acad Med 2013;88:1545–51.
90 Sherbino J, Dore KL, Wood TJ et al. The relationship between response time and diagnostic accuracy. Acad Med 2012;87:785–91.
91 Monteiro SD, Sherbino J, Patel A. Reflecting on diagnostic errors: taking a second look is not enough. J Gen Intern Med 2015;30:1270–4.
92 Schwartz BD, Horst A, Fisher JA, Michels N, Van Winkle LJ. Fostering empathy, implicit bias mitigation, and compassionate behavior in a medical humanities course. Int J Environ Res Public Health 2020;17:2169.
93 Sherbino J, Kulasegaram K, Howey E, Norman G. Ineffectiveness of cognitive forcing strategies to reduce biases in diagnostic reasoning: a controlled trial. CJEM 2015;16:34–40.
94 Sibbald M, Sherbino J, Ilgen JS et al. Debiasing versus knowledge retrieval checklists to reduce diagnostic error in ECG interpretation. Adv Health Sci Educ Theory Pract 2019;24:427–40.
95 Zwaan L, Monteiro S, Sherbino J et al. Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups. BMJ Qual Saf 2017;26:104–10.
96 Prakash S, Sladek RM, Schuwirth L. Interventions to improve diagnostic decision making: A systematic review and meta-analysis on reflective strategies. Med Teach 2019;41:517–24.
97 Klein JG. Five pitfalls in decisions about diagnosis and prescribing. BMJ 2005;330:781–3.
98 Stiegler M, Goldhaber-Fiebert S. Understanding and preventing cognitive errors in healthcare. MedEdPORTAL 2015. www.mededportal.org/publication/10000 [Accessed 26 February 2021].
99 Browne AM, Deutsch ES, Corwin K et al. An IDEA: safety training to improve critical thinking by individuals and teams. Am J Med Qual 2019;34:569–76.
100 Zeidan A, Tiballi A, Woodward M, Di Bartolo IM. Targeting implicit bias in medicine: lessons from art and archaeology. West J Emerg Med 2019;21:1–3.
101 Perdomo J, Tolliver D, Hsu H et al. Health equity rounds: an interdisciplinary case conference to address implicit bias and structural racism for faculty and trainees. MedEdPORTAL 2019;15:10858.
102 Sukhera J, Watling CJ, Gonzalez CM. Implicit bias in health professions: from recognition to transformation. Acad Med 2020;95:717–23.
103 Roberts M. Balint groups: A tool for personal and professional resilience. Can Fam Physician 2012;58:245–7.
104 Hagiwara N, Kron FW, Scerbo MW, Watson GS. A call for grounding implicit bias training in clinical and translational frameworks. Lancet 2020;395:1457–60.
105 Acholonu RG, Cook TE, Roswell RO, Greene RE. Interrupting microaggressions in health care settings: a guide for teaching medical students. MedEdPORTAL 2020;16:10969.
106 Royce CS, Hayes MM, Schwartzstein RM. Teaching critical thinking: a case for instruction in cognitive biases to reduce diagnostic errors and improve patient safety. Acad Med 2019;94:187–94.
107 Napier AD, Ancarno C, Butler B et al. Culture and health. Lancet 2014;384:1607–39.
108 Teal CR, Gill AC, Green AR, Crandall S. Helping medical learners recognise and manage unconscious bias toward certain patient groups. Med Educ 2012;46:80–8.
109 Atewologun D, Cornish T, Tresh F. Unconscious bias training. Equality and Human Rights Commission, 2018. www.equalityhumanrights.com/en/publication-download/unconscious-bias-training-assessment-evidence-effectiveness [Accessed 26 February 2021].
110 Schmidt HG, Mamede S. How to improve the teaching of clinical reasoning: a narrative review and a proposal. Med Educ 2015;49:961–73.
111 Royal College of Surgeons of England. Avoiding unconscious bias: A guide for surgeons. RCSEng, 2015. www.rcseng.ac.uk/library-and-publications/rcs-publications/docs/avoiding-unconscious-bias [Accessed 26 February 2021].
112 Genevieve LD, Martani A, Shaw D, Elger BS, Wangmo T. Structural racism in precision medicine: leaving no one behind. BMC Med Ethics 2020;21:17.
113 Brodie MA, Pliner EM, Ho A et al. Big data vs accurate data in health research: Large-scale physical activity monitoring, smartphones, wearable devices and risk of unconscious bias. Med Hypotheses 2018;119:32–6.
114 Courtland R.
Bias detectives: the researchers striving to make algorithms fair. Nature 2018;558:357–60. www.nature.com/articles/d41586-018-05469-3 [Accessed 26 February 2021].
115 Why Aren't You A Doctor Yet? Episode 27: The internet is a repository of evil (ft. Alex Fefegha). iTunes, 2019. https://podcasts.apple.com/gb/podcast/episode-27-internet-is-repository-evil-ft-alex-fefegha/id1304737490?i=1000432446587 [Accessed 26 February 2021].
116 Ezaydi S. The unconscious bias training you've been told to do won't work. Wired 2020. www.wired.co.uk/article/unconscious-bias-training-broken [Accessed 26 February 2021].
117 Pritlove C, Juando-Prats C, Ala-leppilampi K, Parsons JA. The good, the bad, and the ugly of implicit bias. Lancet 2019;393:502–4.
118 Mukwende M, Tamony P, Turner M. Mind the gap: A handbook of clinical signs in Black and Brown skin. Black and Brown Skin, 2020. www.blackandbrownskin.co.uk [Accessed 10 September 2019].
119 Massie JP, Cho DY, Kneib CJ et al. Patient representation in medical literature: are we appropriately depicting diversity? Plast Reconstr Surg Glob Open 2019;7:e2563.
120 Louie P, Wilkes R. Representations of race and skin tone in medical textbook imagery. Soc Sci Med 2018;202:38–42.
121 Vyas DA, Eisenstein LG, Jones DS. Hidden in plain sight: Reconsidering the use of race correction in clinical algorithms. N Engl J Med 2020;383:874–82.
122 Gopal DP, Francis R. Does race belong in the hypertension guidelines? J Hum Hypertens 2020 [Epub ahead of print].
123 Sjoding MW, Dickson RP, Iwashyna TJ, Gay SE, Valley TS. Racial bias in pulse oximetry measurement. N Engl J Med 2020;383:2477–8.
124 McKenna H, Coghill Y, Daniel D, Morrin B. Race equality in the NHS workforce. The King's Fund, 2019. www.kingsfund.org.uk/audio-video/podcast/race-equality-nhs-workforce [Accessed 26 February 2021].
125 Arvizo C, Garrison E. Diversity and inclusion: the role of unconscious bias on patient care, health outcomes and the workforce in obstetrics and gynaecology. Curr Opin Obstet Gynecol 2019;31:356–62.

Address for correspondence: Dr Dipesh Gopal, Centre for Primary Care and Public Health, Barts and The London School of Medicine and Dentistry, Yvonne Carter Building, 58 Turner Street, London E1 2AB, UK.
Email: d.gopal@qmul.ac.uk
Twitter: @dipeshgopal