A father and his son are in a car accident. The father dies at the scene and the son, badly injured, is rushed to the hospital. In the operating room, the surgeon looks at the boy and says, ‘I can’t operate on this boy. He is my son.’ How can this be?
The authors continue:
If your immediate reaction is puzzlement, that’s because automatic mental associations caused you to think ‘male’ on reading ‘surgeon.’ The association surgeon = male is part of a stereotype. In this riddle, that stereotype works as the first piece of a mindbug. The second piece is an error in judgment—in this case a failure (or at least a delay) in figuring out that the surgeon must be the boy’s mother.
Mindbugs, according to the authors, are “ingrained habits of thought that lead to errors in how we perceive, remember, reason, and make decisions.” These mindbugs cause automatic mental reactions, or hidden biases.
Biases are the stories we tell ourselves about a particular social group. At some level, these biases are useful because they allow our brains to save mental energy, but we remain largely oblivious to the power they hold over us.
Biases are created by culture—stories, jokes, propaganda, media—and they lead us to believe certain things: Asians are good at math; Boston drivers are aggressive; New Yorkers are always in a rush; anyone who wears funky glasses and a cool hairstyle is a designer of sorts; African Americans are better athletes, and so on. Biases make us forget that women can be surgeons and men can be nurses, too.
When you repeatedly encounter a similar story because of your environment and culture, those stories become biases, and those biases become anchors to your perception and decision-making processes. Bias is a core catalyst for your behavior and attitude towards a particular group. Most of the time, you aren’t even aware of how you’re behaving or why because the people around you are doing the same thing.
These days, memes on Twitter or Instagram are shaping biases in the form of pictures or GIFs, either confirming something you already believed or introducing you to a new story. When you then encounter that new story elsewhere, relentlessly, it becomes hard to ignore, and it becomes “normal,” for good or for bad, whether it’s accurate or not.
Banaji and Greenwald explain how the word “stereotype” was introduced into our lexicon:
Walter Lippmann gave stereotype its current meaning in 1922, when he used it to refer to ‘pictures in our head’ that portray all members of a group as having the same attributes—often not very attractive attributes. Lippmann suggested that this fixed mental picture of a group would enter our thoughts each time we encountered someone from that group.
The problem is that these stereotypes become automatic: when you encounter a bad driver in Boston, it confirms your bias and reassures you that your perception of Boston drivers was right all along. If these stereotypes go unchallenged, we carry those borrowed impressions and unchecked stories with us, allowing them to calcify into pillars of influence over how we lead our lives.
You can actually measure and see for yourself just how biased you are toward a particular group. Banaji and Greenwald, together with their colleague Brian Nosek, developed the Implicit Association Test (IAT), which was designed to detect the hidden biases in our minds. You can take the test at implicit.harvard.edu. Over fourteen million people have taken it on the Harvard site, and the results comparing associations with Whites and Blacks should give us pause—a moment to reflect on and challenge the comfort of our perceptions [emphasis mine]:
As data from many respondents show, 70 percent or more of the people who take this test have greater difficulty with Sheet B, which pairs Whites with weapons, than with Sheet A, which pairs Blacks with weapons. Analyses of more than eighty thousand race-weapons IATs completed at implicit.harvard.edu yielded three important results:
First, the automatic Black = weapons association is much stronger among all groups who took the test—White, Asian, Hispanic, and even African American—than is suggested by surveys that asked questions about this association.
Second, the size of this automatic stereotype varies noticeably by groups—it is largest in Whites and Asians, next largest in Hispanics, and the smallest in African Americans. But even African Americans show a modest Black = weapons stereotype.
Third, comparing the results of the two kinds of tests—reflective self-report and automatic stereotype—reveals another interesting fact about who carries the stereotype. The higher the education level, the lower the endorsement of the association between Blacks and weapons on the reflective self-report answers. However, on the test of automatic stereotypes, the IAT, education levels matter not a whit. Those with greatest education carry as strong an implicit Black = weapons stereotype as do those with least education.
Our culture’s profound failure to integrate this knowledge early, in education and in parenting, is the root problem of the chaos we’re seeing in the world today. A world without stereotypes is impossible—our minds are structured to categorize for effortless thinking, and naive realism means that each of us believes we see the world as it really is. But a world where we acknowledge these stereotypes, familiarize ourselves with them so we can be self-aware and rational, and have critical conversations that cultivate a growing understanding—one that leads to lasting change—is not impossible.
It’s uncomfortable to realize just how biased we really are, but change requires discomfort—that’s why they’re called growing pains. The stories we tell ourselves about a particular group of people can change if we seek to know and understand that group and if we challenge the assumptions and beliefs that we’ve adopted rather than experienced on our own.
More than an opportunity for change, it’s an obligation.