An analysis of over 200 cases in Florida found that judges showed no racial bias in their sentencing decisions. Yet, the same analysis revealed that race was still influencing their decisions.
How can this be?
In 1979, the Florida Department of Corrections (DOC) realized it had a problem: once convicted, black men were more likely than white men to receive prison sentences. The DOC took action, implementing sentencing guidelines for judges and adding an explicit statement of race neutrality to the Florida Statutes to debias decision-making. The intervention seemed to work. When the Florida DOC looked at its data again in 1997, it found that black and white inmates were now receiving equivalent sentences for equivalent crimes. In the department’s opinion, “the goal of racial equity … ha[d] been met” (Bales, 1997, p. 3).
Psychologist Irene Blair and her collaborators, however, understood that the way our minds perceive race is complicated, and they wondered whether the DOC’s findings were truly so black and white. When the researchers analyzed over 200 mugshots of white and black male inmates, they found that, indeed, a subtle bias was still creeping in.
Even though judges were not influenced by whether an inmate was black or white, they were influenced by how black or white an inmate looked. What do we mean by this? Facial features are strongly (but imperfectly) correlated with race. Blair and colleagues discovered that judges who were successfully removing race from their sentencing decisions were still being influenced by these race-related facial features. The result was biased sentencing: having Afrocentric features added up to 8 months to an inmate’s sentence. This was true even after controlling for crime severity, criminal history, and attractiveness. An even more surprising result? White inmates with more Afrocentric features also received harsher sentences. That’s how sneakily bias can shape decision-making.
What can we learn from this result?
First, the good news: once they were aware of it, the judges were able to successfully remove race bias (in the simple sense) from their decisions. By 1997, black and white defendants were receiving the same sentence on average. The Florida DOC’s intervention is evidence that awareness can be powerful in reducing bias.
Next, the real news: biases are complex and nuanced. Even in situations that – on the surface – meet benchmarks for equity, biases can be expressed obliquely, entering into decisions in less visible ways. Most of us would never think to ask, “Do I have an Afrocentric facial feature bias?”, especially if the data suggest we are being race-fair. This is just one of myriad details that can fall into our blind spot. In the courtroom alone, how might we be influenced by a prosecutor’s face, the accent of an eyewitness, the attractiveness of a jury foreman? The list goes on and on.
To outsmart our minds, a first step is to consciously consider all the possible features that can communicate categories like “race”, “gender”, “social class”, and more. Names, addresses, hobbies, and facial features are all examples of details that can wiggle their way into (and change) our evaluations.
But we also want you to consider a second approach. Instead of assuming that you will be able to recognize and correct for all sources of bias, consider how you might automatically filter out the information you don’t need. If you’ve watched our video Strooped!, you’ve heard this message before, but it bears repeating: we can’t ignore information that’s in front of us, even when it’s irrelevant, misleading, or even wrong. It’s tempting to want more information when making a decision, but the smarter thing to do is focus only on the information that makes our decisions better.
What might this look like? Blinding a résumé so that an applicant’s gender can’t influence a hiring decision; switching to an online appointment system to ensure that a caller’s voice doesn’t influence their access to mental healthcare; choosing not to look at photos when selecting a tutor for your child. The temptation will be to dig deeper, but even seemingly harmless details can push us away from objectivity. So take the harder and better step of simply not knowing irrelevant information. It can make all the difference.
Outsmarting Human Minds is a program founded by Mahzarin Banaji, devoted to improving decision-making using insights from psychological science. This article was written by Olivia Kang, Kirsten Morehouse, and Mahzarin Banaji. Support for Outsmarting Human Minds comes from Harvard University, PwC, and Johnson & Johnson.