Gleb Tsipursky

The Delta surge reveals the folly of businesses' rush to return to the office

With vaccine effectiveness against the Delta variant dropping to 39%, it is absurd to pursue a normal office return. Make no mistake about the danger: the Delta surge is forecast to grow much worse in the next few months. Indeed, the CDC is asking vaccinated people to wear masks and moving toward recommending booster shots.

Yet many large companies and mid-size firms, along with the federal government, are forcing employees who successfully worked from home during the height of the pandemic to return to the office. Over a third have already returned, and most of the rest are slated to return by the end of summer or early fall, when schools will reopen and Delta cases will soar.

Why are the federal government and household-name companies — Apple, Xerox, JPMorgan, Goldman Sachs, American International Group, and Abbott Laboratories — putting the health of their employees at serious risk? And why are they practically guaranteeing mass employee flight as part of the Great Resignation prompted by pressure to return to the office?

After all, in-depth surveys of employee preferences on returning to the office — even before the Delta surge — showed that about half were willing to quit if not given their preferred work arrangements. The surveys revealed a quarter to a third of employees wanted full-time remote work, while over half wanted a hybrid schedule of a day or two in the office.

Many already quit due to employer plans to force them back to the office. Fears over the Delta surge will undoubtedly prompt even more to quit rather than risk their health due to breakthrough infections.

The reason that so many large employers fail to listen to the concerns of employees, whether about their work preferences or their health, stems from the wishful thinking of dangerous judgment errors called cognitive biases. These mental blindspots lead to poor strategic and financial decision-making when evaluating options. They cause leaders to go with their gut and follow their personal preferences instead of relying on best practices on returning to the office.

The biggest threat in underestimating the Delta surge comes from the normalcy bias. This dangerous judgment error leads us to underestimate the likelihood and impact of disruptive events.

Consider that we already had clear evidence of US Covid cases caused by Delta beginning to surge in early June. We also had clear evidence already in May of a Delta-caused explosion of cases in countries with higher rates of vaccination than in the US, such as the UK and Israel.

Big employers pride themselves on making data-driven decisions. They had the data: it was right under their noses. And they didn't have the excuse of not knowing about the danger of an explosive growth in COVID cases. Yet despite the clear and present danger of Delta, they insisted on forcing their employees back to the office.

Another major mental blindspot at play, the planning fallacy, causes leaders to make overly optimistic plans and refuse to change them despite new evidence showing their folly. After all, changing your plans implies that you got them wrong in the first place. Weak leaders frequently refuse to admit they are mistaken and acknowledge the need to change their plans. By contrast, strong leaders show the courage of changing their minds when new evidence shows a need to pivot.

Fortunately, a small number of organizations are showing the courage to revise their plans. Yet many of these revisions are band-aids rather than true pivots.

Apple, for instance, delayed the return to the office from September to October. Yet this one-month delay shows that Apple just doesn't get it. Not only is the Delta variant slated to peak in October, but there's a bigger issue at hand. Similarly, the federal government is calling for its employees to be vaccinated or face regular COVID testing.

Apple and the other large employers forcing employees back to the office need to face the reality that vaccine immunity wanes in a few months. At the same time, new variants are emerging, such as Delta Plus, which might be even worse than the Delta variant. Vaccine mandates aren't going to cut it, nor are temporary delays.

Delta is a short-term issue with a long-term tail of multiple similar scenarios. Not facing this patently obvious unpleasant reality stems from a cognitive bias scholars call the ostrich effect, after the mythical notion that ostriches bury their heads in the sand when facing danger. Research suggests that denying negative reality is a top cause of CEOs getting fired, cited by 23% of members of boards of directors that terminated their CEOs.

Overcoming the normalcy bias, the planning fallacy, and the ostrich effect in the return to the office requires relying on research-based best practices. That means a mainly hybrid model of a day or two in the office for most employees, who should be able to move easily to full-time remote work when needed. A substantial minority of employees should work full-time remotely, if they wish to do so and demonstrate effectiveness. This best-practice setup maximizes the benefits of in-office collaboration for the employees who benefit from it most, retains top talent that would leave if not permitted full-time remote work, and creates a company culture, systems, and processes that let all employees shift to full-time remote work when needed.

Dr. Gleb Tsipursky is an internationally-renowned thought leader in future-proofing and cognitive bias risk management, serves as the CEO of the boutique future-proofing consultancy Disaster Avoidance Experts, and is a best-selling author of several books, including Returning to the Office and Leading Hybrid and Remote Teams: A Manual on Benchmarking to Best Practices for Competitive Advantage.

Cognitive neuroscientist explains the 'illusory truth effect' — and how you can defend yourself from election disinformation

This article is excerpted from: Gleb Tsipursky and Tim Ward, Pro Truth: A Pragmatic Plan to Put Truth Back Into Politics (Changemakers Books, 2020).

Whenever you hear something repeated, it feels more true when you hear it repeated. In other words, repetition makes any statement seem more true. So anything you hear will feel more true each time you hear it again.

Do you see what I did there? Each of the three sentences above conveyed the same message. Yet each time you read the next sentence, it felt more and more true. Cognitive neuroscientists like myself call this the "illusory truth effect."

Go back and recall your experience reading the first sentence. It probably felt strange and disconcerting, perhaps with a tone of outrage, as in "I don't believe things more if they're repeated!"

Reading the second sentence did not inspire such a strong reaction. Your reaction to the third sentence was tame by comparison.

Why? Because of a phenomenon called "cognitive fluency," meaning how easily we process information. Much of our vulnerability to deception in all areas of life – including to fake news and misinformation – revolves around cognitive fluency in one way or another.

Unfortunately, such misinformation can swing major elections, such as the 2016 Presidential election. Fortunately, we can take a number of steps to address misinformation and make our public discourse and political system more truthful, in the 2020 election and beyond.

The Lazy Brain

Our brains are lazy. The more effort it takes to process information, the more uncomfortable we feel about it and the more we dislike and distrust it.

By contrast, the more we like certain data and are comfortable with it, the more we feel that it's accurate. This intuitive feeling in our gut is what we use to judge what's true and false.

Yet no matter how often you have heard that you should trust your gut and follow your intuition, that advice is wrong. You should not trust your gut when evaluating information where you don't have expert-level knowledge, at least when you don't want to screw up. Structured information-gathering and decision-making processes help us avoid the numerous errors we make when we follow our intuition. And even experts can make serious errors when they don't rely on such decision aids.

These mistakes happen due to mental errors that scholars call "cognitive biases." The illusory truth effect is one of these mental blindspots; there are over 100 altogether. These mental blindspots impact all areas of our life, from health and politics to relationships and even shopping.

The Danger of Cognitive Fluency and Illusory Truth

We already make plenty of mistakes by ourselves, without outside intervention. It's especially difficult to protect ourselves against those who know how to manipulate us. Unfortunately, the purveyors of misinformation excel at exploiting our cognitive biases to get us to buy into fake news.

Consider the illusory truth effect. Our vulnerability to it stems from how our brain processes novel stimuli. The first time we hear something new to us, it's difficult to process mentally. It has to integrate with our existing knowledge framework, and we have to build new neural pathways to make that happen. Doing so feels uncomfortable for our lazy brain, so the statement that we heard seems difficult to swallow to us.

Next time we hear that same thing, our mind doesn't have to build new pathways. It just has to go down the same ones it built earlier. Granted, those pathways are little more than trails, newly laid down and barely used. It's hard to travel down that newly-established neural path, but much easier than when your brain had to lay down that trail. As a result, the statement is somewhat easier to swallow.

Each repetition widens and deepens the trail. Each time you hear the same thing, it feels more true, comfortable, and intuitive.

Does it work for information that seems very unlikely? Science says yes! Researchers found that the illusory truth effect applies strongly to implausible as well as plausible statements.

What about if you know better? Surely prior knowledge prevents this illusory truth! Unfortunately not: even if you know better, research shows you're still vulnerable to this cognitive bias, though less than those who don't have prior knowledge.

Sadly, people who are predisposed to more elaborate and sophisticated thinking – likely you, if you're reading this article – are more likely to fall for the illusory truth effect. And guess what: more sophisticated thinkers are also likelier than less sophisticated ones to fall for the cognitive bias known as the bias blind spot, where you ignore your own cognitive biases. So if you think that cognitive biases such as the illusory truth effect don't apply to you, you're likely deluding yourself.

That's why the purveyors of misinformation rely on repeating the same thing over and over and over and over again. They know that despite fact-checking, their repetition will sway people, even some of those who think they're invulnerable. In fact, believing that you're invulnerable will make you more likely to fall for this and other cognitive biases, since you won't be taking the steps necessary to address them.

Other Important Cognitive Biases

What are some other cognitive biases you need to beware of? If you've heard of any cognitive biases, you've likely heard of the "confirmation bias." That refers to our tendency to look for and interpret information in ways that conform to our prior beliefs, intuitions, feelings, desires, and preferences, as opposed to the facts.

Again, cognitive fluency deserves blame. It's much easier to build neural pathways to information that we already possess, especially that around which we have strong emotions; it's much more difficult to break well-established neural pathways if we need to change our mind based on new information. Consequently, we instead look for information that's easy to accept, that which fits our prior beliefs. In turn, we ignore and even actively reject information that doesn't fit our beliefs.

Moreover, the more educated we are, the more likely we are to engage in such active rejection. After all, our smarts give us more ways of arguing against new information that counters our beliefs. That's why research demonstrates that the more educated you are, the more polarized your beliefs will be around scientific issues that have religious or political value overtones, such as stem cell research, human evolution, and climate change. Where might you be letting your smarts get in the way of the facts?

Our minds like to interpret the world through stories, meaning explanatory narratives that link cause and effect in a clear and simple manner. Such stories are a balm to our cognitive fluency, as our mind constantly looks for patterns that explain the world around us in an easy-to-process manner. That leads to the "narrative fallacy," where we fall for convincing-sounding narratives regardless of the facts, especially if the story fits our predispositions and our emotions.

You ever wonder why politicians tell so many stories? What about the advertisements you see on TV or video advertisements on websites, which tell very quick visual stories? How about salespeople or fundraisers? Sure, sometimes they cite statistics and scientific reports, but they spend much, much more time telling stories: simple, clear, compelling narratives that seem to make sense and tug at our heartstrings.

Now, here's something that's actually true: the world doesn't make sense. The world is not simple, clear, and compelling. The world is complex, confusing, and contradictory. Beware of simple stories! Look for complex, confusing, and contradictory scientific reports and high-quality statistics: they're much more likely to contain the truth than the easy-to-process stories.

Fixing Our Brains

Unfortunately, knowledge only weakly protects us from cognitive biases; it's important, but far from sufficient, as the study I cited earlier on the illusory truth effect reveals.

What can we do? You can use decision aid strategies to address cognitive biases, not only to defend yourself from misinformation, but also overcome the tide of misinformation destroying our democracy.

One of the most effective strategies is to build up a habit of automatically considering alternative possibilities to any claim you hear, especially claims that feel comfortable to you. Since our lazy brain's default setting is to avoid questioning claims, which requires hard thinking, it really helps to develop a mental practice of going against this default.

Be especially suspicious of repeated claims that favor your side's positions without any additional evidence, which play on the illusory truth effect and the confirmation bias combined. Make sure to fact-check them with reliable fact-checking organizations, rather than accepting them because it feels good and right to do so.

Another effective strategy involves cultivating a mental habit of questioning stories in particular. Whenever you hear a story, the brain goes into a listening and accepting mode. Remember that it's very easy to cherry-pick stories to support whatever position the narrator wants to advance. Instead, look for thorough hard numbers, statistical evidence, and peer-reviewed research to support claims.

More broadly, you can make a personal commitment to the twelve truth-oriented behaviors of the Pro-Truth Pledge by signing the pledge at ProTruthPledge.org. All of these behaviors stem from cognitive neuroscience and behavioral economics research in the field called debiasing, which refers to counterintuitive, uncomfortable, but effective strategies to protect yourself from cognitive biases. Peer-reviewed research has shown that taking the Pro-Truth Pledge is effective for changing people's behavior to be more truthful, both in their own statements and in interactions with others.

These quick mental habits will address the most fundamentally flawed aspects of our mind's tendency to accept misinformation.

Dr. Gleb Tsipursky is a cognitive neuroscientist and behavioral economist passionate about promoting truth, rational thinking, and wise decision-making. A civic activist and philanthropist, he's the volunteer President of the Board of the nonprofit Intentional Insights and co-founded the Pro-Truth Pledge. Professionally, he serves as the CEO of the consulting, coaching, speaking, and training firm Disaster Avoidance Experts. He's a best-selling author of a number of books, most notably Pro Truth: A Pragmatic Plan to Put Truth Back Into Politics (Changemakers Books, 2020), as well as the national bestsellers The Truth Seeker's Handbook: A Science-Based Guide and Resilience: Adapt and Plan for the New Abnormal of the COVID-19 Coronavirus Pandemic. He published over 550 articles and gave over 450 interviews for prominent venues such as Time, Scientific American, Psychology Today, The Conversation, Inc. Magazine, CNBC, CBS News, NPR, Newsweek, and elsewhere. His expertise stems from his research background with over 15 years in academia, including 7 years as a professor at the Ohio State University, where he published dozens of peer-reviewed articles in academic journals such as Behavior and Social Issues and Journal of Social and Political Psychology.

Neuroscience reveals the mental blindspots that can become deadly in a pandemic

As the vast majority of companies rush to reopen and people rush back to public life, they're falling into the trap of "getting back to normal." They don't realize that we're heading into another period of waves of restrictions, because many states reopened too soon.


How to Deprogram Truth-Denying Trump Voters

How did Donald Trump win, when he used so many misleading statements and outright deceptions? Couldn’t people see through them? As an expert in brain science, I want to share why his followers fell for his lies and what can be done to address this situation in the future.


Get Donald Trump Out of My Brain: The Neuroscience That Explains Why He’s Running Away With the GOP

Donald Trump will be the Republican presidential nominee — at least, if all the media attention paid to his candidacy has anything to do with primary and caucus results. A recent analysis found that he has received over 50 percent of the summer media coverage of Campaign 2016.

