
    Can people be immunized against disinformation?

    Debunking lies about health and medicine is vital but often occurs too late to change the minds of those who get duped. An expanding strategy, prebunking, exposes people to disinformation tactics so they can recognize and resist false messages.


    Editor’s note: This is the fourth in a series of articles about trust in medical science. Previous articles explored the forces behind the credibility crisis, how communication about science can cause public confusion, and the factors that compel people to believe medical misinformation.

    The first COVID-19 lie that the man posted on social media drew only three likes: “Coronavirus tests are not very reliable, so the official numbers of infections and deaths are based on faulty data.”

    Boring. A guide from a web community devoted to coronavirus disinformation sent the newcomer advice about how to stir things up: “Just sharing scientific-sounding content about coronavirus isn’t going to do the trick.” The man needed to build credibility for his messages, and “credibility is easy to fake.”

    The guide helped the man fabricate an expert in his next post: “Dr. Hyde T. Payne, a renowned health authority at the University of Life who has worked on the government’s COVID taskforce, says that there have been no deaths actually caused by COVID!”

    That drew more than 3,000 likes, and the likes grew exponentially with each new lie. After the man posted footage of a street riot — falsely claiming it was a protest against “Big Pharma” for suppressing evidence that radiation from mobile phone networks causes COVID-19 — he became an online conspiracy star.

    Fortunately, the exchanges were fictional; they occurred within the confines of a web-based game designed to teach people about disinformation techniques so that they can better recognize and resist them. The game, Go Viral!, employs a growing strategy to combat disinformation: Rather than debunk specific false claims after they spread, it seeks to inoculate people against such claims in advance by prebunking them. The idea is that teaching people how information is manipulated might be more effective than correcting misinformation after the fact.

    “Prebunking is about providing people with a weakened dose of disinformation and showing them a simulation of the types of attacks they might be facing, just as vaccines offer snapshots of the types of pathogens that might invade the immune system,” says Sander van der Linden, PhD, a social psychology professor at the University of Cambridge in the United Kingdom and co-creator of Go Viral!. “Once you know what to look for, you [the person receiving disinformation] can neutralize” those attacks.

    It’s a sort of media literacy training that attempts to overcome some drawbacks of traditional debunking, whether of misinformation (which is erroneous) or disinformation (which is intentionally false). Knocking down deceitful tales one by one after they appear is a game of whack-a-mole, too slow on its own to have a broad impact when countless such tales travel almost instantly through electronic communication. For example, a study of misinformation about mpox on TikTok in May 2022 examined 153 videos featuring conspiracy theories about the disease; within an average of 30 hours of being posted, the videos had collectively drawn 1.5 million views.

    In addition, communications experts say that while providing facts to correct misinformation and disinformation works with those who have not firmly bought into an erroneous claim, providing facts alone leaves many people unsure of whom to believe.

    “The knowledge deficit model means there’s something wrong with the public that you’re going to correct. You’re going to give them facts and they’re going to see the light,” says Dominique Brossard, MPS, PhD, chair of the Department of Life Sciences Communication at the University of Wisconsin–Madison. “Two decades of social science research will tell you that this does not work.”

    “Debunking is especially difficult with conspiracy theories, which are often believed at an emotional, rather than rational, level,” wrote Beth Goldberg, research program manager at Jigsaw, a Google unit that confronts emerging threats to open societies. When Jigsaw interviewed dozens of conspiracy theory propagators, “we found that their deeply-held beliefs … were resistant to rational or factual counter-arguments” from experts, family, or friends.

    To be sure, various approaches are needed to combat disinformation, including doctors providing science-based information to patients, health systems posting clear facts on easy-to-find web pages, and social media companies removing blatantly untrue posts.

    But “a purely reactive mode is not appropriate,” Food and Drug Administration Commissioner Robert Califf, MD, wrote early this year in a memo to staff that prioritized finding new ways to counter health misinformation. The U.S. surgeon general, in an advisory last year, called for measures to “equip Americans with the tools to identify misinformation” when it reaches them.  

    Getting beyond just the facts

    Stoking fear. Blaming scapegoats. Exaggerating partisan grievances. Sowing doubts about scientific consensus. 

    Those are among the common tactics used in disinformation campaigns about all sorts of issues, from health to politics to culture, going back decades. In the 1960s and 1970s, for example, tobacco companies funded sham studies and ran ad campaigns to sow public doubt about the scientific consensus that smoking causes cancer. Fast forward to today: attacking scientific consensus is a staple of disinformation about COVID-19 as well. Researchers at American University and the Harvard T.H. Chan School of Public Health have identified the five most common tropes (i.e., narrative themes) in COVID-19 disinformation as “corrupt elites,” “vaccine injury,” “sinister origins,” “freedom under siege,” and “health freedom.”

    Those tropes make for influential messaging, as evidenced by their success at stirring up confusion, distrust, and conflict. But the reliance on a common set of techniques also presents a vulnerability: once people learn to spot the techniques, the messages lose much of their power.

    Exposing people to the common tactics of disinformation messages, regardless of the issue that those messages target, is simpler and more scalable than trying to debunk a never-ending plethora of specific deceitful claims. Researchers have found that educating people about standard disinformation tactics makes them more likely to reject disinformation that they subsequently read or hear about such issues as climate change, agricultural biotechnology, and anti-vaccine conspiracies.

    “You can inoculate people with specific facts against a specific piece of misleading information,” van der Linden notes. “But in order to scale it [the inoculation], you expose people to weakened doses of the techniques used to produce all kinds of misinformation and ways on how to spot them.”

    Inoculation strategies have been used for over a half century to defend against various types of mental manipulation, including brainwashing, according to A Practical Guide to Prebunking Misinformation, published this year by Cambridge, Jigsaw, and BBC Media Action. The guide notes that inoculation involves two basic steps: forewarning people that they might encounter misleading information, and preemptively refuting the misinformation.

    The latest prebunking strategies that teach about disinformation tactics follow those steps through several formats, including text, video, and infographics, and are distributed mostly through social media and websites. For example, a video about scapegoating created by Truth Labs for Education — developed by Cambridge, the University of Bristol in the United Kingdom, and Jigsaw — uses a South Park cartoon clip of a town meeting where furious residents debate whom to blame for an epidemic of cursing among children. They march into the street en masse, chanting the name of their chosen culprit: “Blame Canada!” The point is that disinformation doesn’t need evidence to cite a scapegoat; it just needs a vague, easy target, such as immigrants or big government.

    In recent years, researchers have developed at least three online games to teach people about misinformation and disinformation techniques: Go Viral! (about COVID-19), Harmony Square (about pitting members of a community against each other), and Bad News (in which players build a fake news site covering such issues as climate change). The games’ creators hope the entertainment value will draw in users who would not enroll in something that feels like a class.

    In each game, the player takes the role of someone learning to use the techniques. “Beware: misinformation is designed to trick you,” Go Viral! tells players at its introduction. “So why not walk a mile in the shoes of a manipulator to get to know their tactics from the inside?”

    The game consists of text messages between the player and a guide. The guide helps the player develop increasingly liked and shared social media messages by contriving claims, using emotionally hot words (such as “terrifying”), quoting a nonexistent COVID-19 vaccine expert, and building a conspiracy theory that offers a simple explanation for the crisis plus a disliked scapegoat (“Big Pharma”). By the end, the player has become the administrator of an online community of skeptics.

    More efforts needed

    Even the practice of refuting specific claims has moved more toward prebunking. Abbie Richards watches for disinformation on TikTok and warns people ahead of time about myths coming down the pike, including bogus health remedies that seem likely to gain traction among people who are understandably eager to find simple fixes to frustrating problems.

    “There is so much health misinformation, especially when it comes to weight loss and fitness,” says Richards, a research fellow at The Accelerationism Research Consortium, which studies movements to destabilize democratic societies.

    A study led by vaccine researchers at several institutions in Canada used both approaches — refuting specific information and teaching disinformation tactics — to educate people about COVID-19 vaccines. One group of participants was told that mRNA vaccines cannot change someone’s DNA and that the vaccines were studied over enough time to establish their safety. They were also shown some of the tactics that might be used in disinformation against the vaccines, such as stirring fear and citing fake experts.

    They then received the type of disinformation that had been refuted by the prebunking materials. Another group got only the disinformation. Participants in the first group subsequently reported significantly more intent to get vaccinated than those in the second, according to the study published last spring.

    It remains to be seen if such strategies can gain widespread traction. Because the impact of one-time educational interventions tends to fade over time, learners might need booster interventions, says A Practical Guide to Prebunking Misinformation. And getting the prebunk messages and educational materials to audiences in places where misinformation is often spread (such as through private message apps) remains a challenge, the report says.

    For now, prebunking is an innovative strategy within the range of approaches needed from governments, institutions, and individuals.

    “Health misinformation is a serious threat to public health,” the surgeon general’s advisory said last year. “Limiting the spread of health misinformation is a moral and civic imperative that will require a whole-of-society effort.”