I recently watched a 2016 TED talk about confirmation bias, delivered by a researcher named Julia Galef. In it, Galef characterizes two approaches to information, calling one “the Soldier” mindset, and the other, “the Scout.”
The Soldier, Galef explains, reacts to everything with adrenaline, fueled by the need to protect himself and defeat the enemy. The Soldier employs motivated reasoning, allowing desires and fears to shape how he interprets information: defending what works for him, and shooting down what doesn't. As in an actual battle, the desire to win strongly influences the Soldier's judgment.
The Scout, on the other hand, is led by curiosity rather than motivated reasoning. In a real-life battle scenario, the Scout needs to collect information accurately, and see what truly exists — bridges, trees, reinforcements, bunkers. Outside of battle, the Scout mindset does the same, evaluating what is in front of him without making false assumptions or dismissing potentially pertinent intel.
Unlike the Scout, the Soldier operates on confirmation bias, which Psychology Today tells us “occurs from the direct influence of desire on beliefs. When people would like a certain idea/concept to be true, they end up believing it to be true… This error leads the individual to stop gathering information when the evidence gathered so far confirms the views (prejudices) one would like to be true.” (For more on confirmation bias, and how it makes us “prisoners of our assumptions,” click here: Psychology Today)
Confirmation bias is an interesting concept, and it affects every aspect of our lives. You only have to look at the political discourse on social media to witness people’s willingness to buy into, or reflexively disavow, virtually any idea or fact. I’ll sum up those 80 million arguments for you, in case you’ve been unplugged for the past twenty years: If it furthers my candidate, I’m all in. If not? It’s a load of crap, and you’re a despicable idiot for believing it.
But even beyond the political minefield, confirmation bias is widespread, and presents distinct challenges to specific undertakings. When pilots train to navigate by instruments rather than by sight, for example, they must learn to trust what the instruments say, rather than the biased information their own brains provide. In a low-visibility situation, the brain may say "Up is this way," while the instruments tell a very different story. Trusting the instruments becomes a matter of life and death: without proper instrument training, the average pilot, in low- or zero-visibility conditions, lasts only 178 seconds before entering what is known as the "graveyard spiral."
Read a first-person "graveyard spiral" account from an experienced pilot here.
Why do our brains do this? Because as powerful as they are, our brains take in only snapshots of information, leaving us to fill in the gaps with ideas based on our experiences, prejudices, biases, and beliefs.
“But I’m not a pilot, so confirmation bias doesn’t apply to me.”
Well, hang on a second. It does apply, because confirmation bias is everywhere — not just in catfights and cockpits.
In the study of literature, confirmation bias stands in the way of students' ability to see nuance, understand context, and get at authorial intention. Sadly, it keeps the work of authors like Flannery O'Connor out of classrooms, because the prevailing bias among young people at the moment is that the code of social ethics they live by now has always existed. It hasn't. But if your bias is that it has, or that there should be no tolerance for the evolution of human society, then you read O'Connor's work out of context, devalue it as the writings of a racist, and ignore the author's efforts to reveal the comeuppances her flawed characters receive. (And by the way, those character flaws, more often than not, include feeling superior because of white skin.)
Confirmation bias is a problem in biblical criticism, as well. In my seminary experience, we were constantly reminded to evaluate the lens through which we were reading Scripture, and assess how it was affecting our understanding. Here’s an example of confirmation bias at work: If your bias requires that you see your own Christianity as superior to your friend’s Judaism, you’ll read much of the New Testament as a repudiation of Jesus’ own Jewish identity, rather than an effort to radically disrupt the social status quo. Perhaps the two ideas are, in fact, supported by the text, and intrinsically linked? You won’t know, if your lens allows you to see only one.
Galef doesn't get into literature or theology, but she does point out, in a video on her website (Julia Galef Videos), a related cognitive trap she calls the Sunk Cost Fallacy. In the brief video, Galef asks us to consider the faulty thinking that allows us to say, for instance, after ten years of laboring unhappily in a particular field, "Well, I've invested so much time into it, I might as well stay." The Sunk Cost Fallacy is what pushes us, she notes, to read an entire book, even when we're disenchanted by the first 100 pages. It keeps us doing what we're doing, even when it's wrong for us, because we don't want to "lose" the investment of time, money, or energy we've already made.
In the novel I’ve been working on for the past couple of years, the protagonist, Beth, is at a crisis point, and goes to a therapist, who reminds her that fear is often the real enemy of happiness, keeping us mired in situations that don’t work. Beth asks, “Well, then how do we move past fear?”
The therapist’s advice is to begin by evaluating your options differently. Rather than assess a particular course of action by its potential downsides, she says, it makes far more sense to weigh the possible upsides.
In Galef's scenario of sunk costs, this would mean contemplating a career change not by clinging to the time already spent, but by looking at the possible benefit: finding work that is fulfilling and meaningful to you. As Galef frames the essential question: "Do you have a good reason to stick with what you're doing? Make the choice that leads to the better outcome for you."

In the Scout mindset, which Galef advocates, self-worth isn't tied to winning, or to being right or wrong.
And this is critical, because “winning” and “being right” rarely lead to health, happiness, or even wisdom. The human condition is to screw up. To fall, and to have to dig deep to find the strength to get back up. That’s the story the Bible tells us, and it’s the plotline that underpins every great work of literature, every remarkable film, every memorable life story.
As the saying goes, “If all you have is a hammer, then everything is a nail.” But the truth, of course, is that only nails are actually nails.
Those other things that we're identifying as nails? They're something else altogether — possibly something much better suited to our lives.
To watch Julia Galef's TED talk, click here: Julia Galef TEDtalk