March 24, 2021
Battling Misinformation Through Health Messaging
By Dana Litt, Ph.D., and Scott Walters, Ph.D., Department of Health Behavior and Health Systems, School of Public Health, University of North Texas Health Science Center
An old Saturday Night Live sketch has Steve Martin portraying a medieval doctor who is considering and discarding modern medical treatment in favor of crude medieval solutions like bloodletting and leeching. “Maybe we … should test those assumptions analytically,” he wonders, “through experimentation and scientific method?”
Recent efforts to reduce rates of COVID-19, including mask-wearing and vaccine adherence, have highlighted the real gaps between science-based recommendations and what people believe and do in their everyday lives. Health-related misinformation is frustratingly abundant. Some people have a clear motive to repeat falsehoods. Others spread misinformation unwittingly, using personal experience as evidence for broad claims.
Social media has provided more opportunities for people to join “closed loop” groups that spread misinformation. At the same time, age-old phenomena are much in evidence. For example, bad news travels faster than good; information that is novel is more likely to be passed on; and people who speak with greater certainty are more likely to be believed. All these factors place evidence-based health information at a disadvantage. Health information often takes time to be gathered, evolves with new evidence, and may be nuanced—the conclusions may depend on where, when, and to whom they apply.
In a world of competing facts, it is increasingly important to adopt evidence-based communication strategies to ensure that people hear, and adopt, public health recommendations. Some best practices include:
- Understand your audience. Present information concisely, with clear attention to what’s relevant for the population, along with opportunities to find out more. Provide examples of people who have adopted that health behavior and their reasons for doing so.
- Communicate uncertainty clearly. It is better to be transparent about what’s known and not known rather than to speculate or make claims that may erode trust later. Likewise, don’t over- or under-reassure. Lay out risk and potential consequences as clearly and calmly as possible.
- Consider the messenger. People are far more likely to adopt messages from people who are seen as likeable and trustworthy. For instance, professionals like nurses and medical doctors are perceived as trustworthy, and thus their words carry extra weight. Likeability is also often a proxy for trustworthiness. Katie Couric’s televised colonoscopy and Angelina Jolie’s New York Times op-ed on genetic testing led to national increases in behaviors that had been recommended by public health authorities for years.
- Understand information spread, particularly on social media. It is important to understand the source of potential questions and knowledge gaps. Knowing the source of misinformation is important when planning strategies to actively counter myths and misinformation.
- Be sensitive to tone. Use non-judgmental language and open-ended questions. Rolling with resistance can keep lines of communication open when countering erroneous beliefs. Strategies that invite people to find their own reasons for change, instead of strong-arming or shaming, tend to be more effective. For example, people may decide to wear masks consistently because they believe it will protect older, more vulnerable persons, rather than because of a broader belief that masks are effective.
Given the volume of misinformation available to the public, it is sometimes necessary to address the myth “head on” in the debunking process (e.g., “Some people think…but in fact…”). Caution must be exercised when doing this, however. There is evidence that repeating falsehoods can sometimes unintentionally reinforce them. For instance, when people believe that information is being suppressed (e.g., banned books), the value of that information increases. People may also become more curious about the falsehood, reasoning that if some people believe it, there must be some truth to it. For this reason, we suggest repeating falsehoods only when it is necessary to frame the truth.
One strategy is to “sandwich” the myth between two facts since people tend to remember the beginning and end of a message more than the middle. Consider framing any misinformation rebuttals as “The facts are X, but some have falsely claimed Y. Be safe and focus on X.” Misinformation can also be framed as early, incomplete thinking, while science-based information can be framed as informed and proactive. Testimonials of people changing their minds—especially people who are viewed as likeable and trustworthy—can help, too.
Finally, one bit of good news: People often follow good health practices despite considerable ambivalence. People may change their behavior when a belief becomes too inconvenient (e.g., “It’s harder and harder to find a place to smoke these days.”) or is shared by only a few people (e.g., “I didn’t want to be the last one in my workplace to get the vaccine.”). Being fully convinced and committed is not always necessary for people to take action.