Are We Too Dumb for Democracy? The Logic Behind Self-Delusion
A recent cognitive study, as reported by the Boston Globe, concluded that:
Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
In light of these findings, the researchers concluded that a defense mechanism, which they labeled “backfire”, was blocking purely rational thought. The result is a self-delusion that appears so regularly in ordinary thinking that we fail to detect it in ourselves, and often in others: when faced with facts that do not fit seamlessly into our individual belief systems, our minds automatically reject, or “backfire” against, the presented facts. We thus become even more entrenched in our beliefs, even when those beliefs are partially or entirely false.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” said Brendan Nyhan, the lead researcher of the Michigan study. The occurrence of backfire, he noted, is “a natural defense mechanism to avoid that cognitive dissonance.”
The conclusion is this: facts often do not determine our beliefs; rather, our beliefs (usually non-rational ones) determine the facts we accept. As the Boston Globe article notes:
In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.
Despite this finding, Nyhan says the underlying cause of backfire remains unclear: “It’s very much up in the air.” And on how our society might counter this phenomenon, Nyhan is even less certain.
Such unanswered questions are to be expected in any field of research, since every field has its limitations. Here, however, psychoanalysis can complete the picture.
Disavowal and Backfire: One and the Same
In an article on the subject, psychoanalyst Rex Butler independently arrives at the same conclusion as the Michigan researchers. With regard to facts and their relationship to belief systems (or ideologies), Butler writes:
there is no necessary relationship between reality and its symbolization … Our descriptions do not naturally and immutably refer to things, but … things in retrospect begin to resemble their description. Thus, in the analysis of ideology, it is not simply a matter of seeing which account of reality best matches the ‘facts’, with the one that is closest being the least biased and therefore the best. As soon as the facts are determined, we have already – whether we know it or not – made our choice; we are already within one ideological system or another. The real dispute has already taken place over what is to count as the facts, which facts are relevant, and so on.
This places psychoanalysis on the same footing as cognitive science on this question. But where the cognitive studies end, with Nyhan’s open question about the cause of backfire, psychoanalysis picks up and offers a possible answer. In fact, psychoanalysts have been publishing work on backfire for decades; psychoanalysis simply refers to it by another name: “disavowal”. Indeed, these two terms refer to one and the same phenomenon.