Use these research-based strategies to ensure that truth prevails in your organization.
In the spring of 2020, a dangerous threat was making its way around the globe. By March, it was being spread by tens of thousands of hosts per day. Most of its victims, unfortunately, did not realize what they had encountered. Instead of taking precautions, many went on to become vectors themselves, passing it on and putting others at risk.
What was this insidious force? It was misinformation.
While misinformation, "fake news," and the "post-truth" era have been buzzwords for several years, the coronavirus pandemic has revealed just how harmful these sources of falsehood can become. After all, the virus and viral misinformation have a symbiotic relationship. Tedros Adhanom Ghebreyesus, the Director-General of the World Health Organization, put it this way: "We’re not just fighting an epidemic; we’re fighting an infodemic."
A recent study by Notre Dame faculty in the Center for Network and Data Science found that the outbreak of COVID-19 led to a stunning surge in news coverage. In March, when news output on the coronavirus peaked, 123,623 articles about the virus appeared in a single day. The research team discovered that less than a quarter (23.6%) of the articles published on the virus came from relatively unbiased sources. The sources that dominated the media landscape were those more likely to spread pseudoscience or even conspiracy theories.
For leaders, this cluttered media landscape poses a problem. They want to keep their employees safe and ensure that they act on the recommendations flowing from the latest scientific findings. But how can they speak truthfully and convincingly about facts without being drowned out by the noise of misinformation?
Research suggests that one way forward is to take the analogy between misinformation and viruses seriously. There is no simple remedy—either for coronavirus or for viral misinformation. But just as we wear facial coverings, wash our hands, and practice physical distancing to "flatten the curve," we can use science-based strategies to slow or even stop the spread of misinformation.
It all begins with understanding what drives people to create, accept, and share false information in the first place.
The Psychological Forces That Drive Misinformation
As soon as COVID-19 emerged as a major global public health crisis, so did conspiracy theories about its origins. According to some of these theories, for example, the virus was a lab-created biological weapon. Despite thorough debunking, many of these theories persist.
Although no psychologist could have predicted the shape these theories would take, many could have predicted that some conspiracy theory or another would gain traction during the pandemic. One main reason lies in what psychologists call "collective sense-making." Whenever an event shakes our sense of security and alters our lives, we have a need to work with others and "think out loud" in order to understand and give meaning to it. As part of this process, suppositions can easily snowball into pseudoscience and conspiracy theories. Conspiracy theories seem especially believable during this process because we tend to attribute large events to equally large causes.
Conspiracy theories and other forms of misinformation spread unchecked due to our tendency to affiliate and identify with groups. We gravitate toward like-minded people, a tendency that is often amplified by social media. As a result, we form epistemic (i.e., knowledge) bubbles. These cause us to become what researcher Étienne Brown calls "information hobbits." We become comfortable and satisfied with what we know and rarely venture outside our own familiar sphere of knowledge. We simply pay little attention to others' views and seldom revise our own beliefs.
In other cases, however, group affiliation can take a darker turn, especially through the formation of what researchers call echo chambers. The inhabitants of echo chambers become "information hooligans"; they grow hostile toward differing views and they actively seek to silence, discredit, and undermine the opposition.
What About Fact-Checking?
In recent years, many independent fact-checking sites and services have appeared to help combat fake news. However, it may not always be effective to challenge misinformation directly. While fact-checking can help in many cases, research suggests that some people—especially information hooligans—may become even more entrenched in their beliefs when they encounter fact-checking. In addition, fact-checking is a slow process. It is no match for the sheer quantity of falsehoods online or the speed with which they spread.
So if myth-busting is not a guaranteed way to winnow truths from falsehoods, what else might we do?
Protecting Ourselves from Misinformation
As the saying goes, "an ounce of prevention is worth a pound of cure." While it is difficult to counter misinformation directly, there are promising ways to prevent it from taking hold in the first place. For example, a recent study showed that it is possible to practice a kind of "inoculation" against fake news. Just as with medical inoculation, the idea is to administer a small or weakened dose of the harmful substance. This allows the patient—or in this case the reader or viewer—to develop immunity before the true threat appears. You can get out ahead of conspiracy theories and fake news by showing examples of them and training employees or students to spot them by their look and feel. Schools, colleges, and other organizations are already using immersive games like Bad News to help users experience fake news through the process of creating it.
Foster a healthy news diet.
Another step you can take is to curate high quality fact-based information for your organization. By providing reliable information early and often, you can help members of your organization find the "signal" of truth within the "noise" of misinformation. This can also help information hobbits to step outside their comfort zone or even prevent epistemic bubbles from forming.
Nudge others toward a truth-focused mindset.
As strange as it may sound, we often interact with information without focusing on whether it is truthful or not. We may like, share, post, or forward information because we find it entertaining, interesting, or emotionally resonant, even if we have doubts about its truth. Thus, one surprisingly effective strategy for countering fake news is simply to redirect a reader's attention toward the information's accuracy. Studies have shown that a simple "nudge" asking social media users to rate an article's accuracy, or asking users to pause and "explain how you know that the headline is true or false," can cause them to be more circumspect about the articles they are willing to share. And in another lab study, psychologists were able to arm study participants against repeated false claims by asking them to behave like "fact checkers."
Keep ethics in view when sharing information.
Is it immoral to share fake news? Most of us would say so, but one study revealed that we can easily lose sight of this fact. Repeated exposure to a fake news headline caused study participants to rate it as less unethical to publish and share. While major societal changes may be required to restore trust and fix what is broken in our media landscape, we can each play a part in improving it by holding one another accountable and by recognizing that we are performing a moral act each time we participate in the spread of information, even if it involves just a few clicks of our mouse.
Basol, M., Roozenbeek, J., & van der Linden, S. (2020). Good news about bad news: Gamified inoculation boosts confidence and cognitive immunity against fake news. Journal of Cognition, 3(1), 2.
Effron, D. A., & Raj, M. (2020). Misinformation and morality: Encountering fake-news headlines makes them seem less unethical to publish and share. Psychological Science, 31(1), 75-87.
Fazio, L. K. (2020). Pausing to consider why a headline is true or false can help reduce the sharing of false news. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-009
Krieg, S. J., Schnur, J. J., Marshall, J. D., Schoenbauer, M. M., & Chawla, N. V. (2020). Pandemic pulse: Unraveling and modeling social signals during the COVID-19 pandemic. arXiv preprint arXiv:2006.05983.