AAMCNews

    Widespread distrust in science: Is the way we communicate to blame?

    Researchers increasingly find their work misunderstood and misused, as the iterative nature of scientific discovery clashes with an explosion in public access to and interpretation of their work.


    Editor’s note: This is the second in a series of articles about trust in science.

    In the summer of 2021, staffers at an animal supply store in Las Vegas put up a sign warning customers that they could not buy a certain medicine to de-worm horses unless they presented a photo of themselves with a horse. The reason: People were buying ivermectin under the mistaken belief that it was proven to protect humans from COVID-19, and in spite of federal warnings about side effects that include diarrhea, hypotension, and seizures.

    “You need to prove to me that you have a horse” to buy the drug, a staffer at the store told a TV station.

    The run on ivermectin — which stemmed from a flawed study and sickened hundreds of people, some to the point of hospitalization — was just one episode of pandemic confusion that spotlighted a challenge to the credibility of medical science: The way that scientists communicate about research differs from the way most people consume information about research.

    “There is a communication mismatch between us and the public,” Vera Donnenberg, PhD, associate professor of cardiothoracic surgery at the University of Pittsburgh School of Medicine, told attendees at the AAMC’s annual Learn, Serve, Lead conference last year.

    The result is that researchers are struggling to control narratives about their work, in large part because an old process is colliding with new forces: the iterative nature of scientific discovery (replete with nuance, uncertainty, and even reversal) now plays out in full view, through online access to research and criticism of it, before a public that craves definitive conclusions.

    “Our sector is overdue for a conversation about whether our model for scientific scholarly communication is fit for today’s environment, or whether it is increasingly leading to an erosion of public trust in science,” Roger C. Schonfeld, vice president of the Libraries, Scholarly Communication, and Museums Program at Ithaka S+R, a consultant to academic and cultural institutions, wrote last year.

    Embracing uncertainty

    Children learn the basic steps of the “scientific method” in school, but for adults who can’t remember them (i.e., most of us), physicist Richard Feynman summed it up this way in a 1964 lecture at Cornell University:

    “First, we guess.”

    When the laughter stopped, Feynman went on to provide a simple, one-minute explanation of how scientists develop a hypothesis based on observation, conduct experiments with no bias toward the results, then see if the experiments support the guess. The results are often mixed, and even robust findings are often contradicted by subsequent findings.

    While accepting this reality teaches scientists (ideally) to proceed with intellectual humility — maintaining a dose of doubt about what they think they know — it can leave nonscientists frustrated over questions about health matters. People want definitive answers: Can someone just tell us whether coffee causes cancer?

    “It’s so confusing,” notes Stephen Joffe, MD, MPH, chair of the Department of Medical Ethics and Health Policy at the University of Pennsylvania Perelman School of Medicine (PSOM) in Philadelphia. “Coffee is good, coffee is bad. A little alcohol is good; no, alcohol is bad. I can understand why people might think, ‘These scientists have no idea what they’re talking about.’”

    (The flip-flop nature of medical research was spoofed in the 1973 comedy film “Sleeper,” when a pair of scientists chuckled over their predecessors’ failure to discover that fat, steak, and cream pies are actually health foods.)

    Scientists are left with two basic communication approaches that are difficult to blend: convey uncertainty about their findings, which can come off as waffling, or declare full confidence, which erodes people’s faith in science if that confidence is later exposed as overblown.

    “There’s a real tension if you’re a scientist trying to communicate with the public: projecting certainty and confidence, while recognizing that you are operating with a considerable degree of uncertainty,” Joffe says.

    “Scientists often caution, ‘Don’t overstate your results,’” notes Anita Desikan, MS, MPH, senior analyst at the Union of Concerned Scientists’ Center for Science and Democracy, based in Cambridge, Massachusetts. Yet she acknowledges the drawback to the cautious approach: “If you’re a lay person and you’re hearing a scientist sounding doubtful about their own research, and you compare that to someone who sounds more certain, then you may be more likely to believe that second person, despite the fact that they don’t have as much evidence to support their claims.”

    One tack not to take: Tell people to just “follow the science,” which some health leaders did when people expressed skepticism about how to avoid catching COVID-19. Intended as assurance, the phrase came off as overconfident and dismissive of doubts.

    “I doubt there’s anybody who would say we shouldn’t follow the science. It’s terrible terminology,” says Sudip Parikh, PhD, CEO of the American Association for the Advancement of Science (AAAS) in Washington, D.C. “Anybody will say they want to follow the science. The question becomes, whose science?”

    Openness sows confusion

    The case of ivermectin illustrates communications challenges that are inherent in science and that have been amplified by changes in how research results are reported.

    Ivermectin comes in different doses and application methods to treat intestinal parasites in animals and in humans. After COVID-19 struck, it was one of countless medications that scientists hypothesized might block or mitigate the disease. That’s the guess-and-test of the scientific method.

    Then researchers from Benha University in Egypt posted positive results from their study for anyone to see for free, without the results going through the standard process of being reviewed by other scientists before publication. That’s where the trouble began.

    Until recently, most research findings and challenges to them were limited to scientists and those who were keyed into their work — namely, certain policymakers, academics, and journalists — primarily through scientific journals and gatherings. The average citizen had little access to these forums, for reasons that include exorbitant prices (institutions pay $20,000 a year for the Journal of Coordination Chemistry), and had little interest in the dense material. People were content with getting major science news filtered through the mainstream media.

    Today, however, anyone with an internet connection can read much of the latest scientific research for little or no cost, as more and more journals provide open access to their articles. On top of that comes the growing practice of disseminating findings through preprint servers like bioRxiv and medRxiv, which post preliminary research results. Preprints look and read like formal journal articles, and are often accurate, but sometimes contain scientific errors that peer review is intended to catch.

    The primary motivation behind these moves, particularly in the scramble to understand COVID-19, has been to get the latest scientific research to more people more quickly.

    “One of the success stories of these last several years has been that scientists have been able to communicate with one another, build upon one another’s theories, and work more or less instantly,” Schonfeld notes.

    But instant and widespread access has a flip side, breeding confusion about and even assaults on medical research. While major media attention used to go primarily to significant, well-vetted studies, now any study, even if it hasn’t been peer-reviewed or turns out to be significantly flawed, can essentially be delivered to anyone.

    As a result, journalists, political activists, social commentators, and well-meaning citizens all write and talk about even the most obscure reports, usually focusing on stunning findings or those that contradict previous reports. Very few within this expanded audience have been trained to assess the studies and explain differences among them.

    The findings in the Benha preprint were startling: It claimed that ivermectin could reduce COVID-19 death rates by 90%. After the study was picked up by media outlets worldwide, people began demanding prescriptions for the human version of ivermectin from their doctors and buying up the animal version.

    The publisher retracted the study after other scientists proved that it was fundamentally flawed. But the damage was done: People continue seeking ivermectin from their doctors to prevent COVID-19, and some provide it, pointing to online studies that conflict with the majority of findings that do not show “any significant effect on outcomes of COVID-19 patients.”

    While scientists have always fact-checked each other and exposed bad research, that practice has also been changed by public access to the process. Critiques of colleagues’ studies were once conducted largely out of public view, such as on the letters pages of scientific journals. Today, scientists air their differences on websites, TV, and social media. One of the most strident disagreements during the pandemic has been whether evidence has supported closing schools and requiring students to wear masks. People can find research online to support whatever view they lean toward, with scientists weighing in on every side.

    Add to that the ill-intentioned practice by some people of cherry-picking and mischaracterizing parts of studies to push social and political agendas, which further confuses people and erodes trust in science. That’s easier to do when the instigators can find and link to the studies online.

    “The way that disinformation travels and how impactful it’s been on our society is related to the way in which these products are designed, which is for openness and scale,” said Joan Donovan, research director of the Shorenstein Center on Media, Politics and Public Policy, in Cambridge, Massachusetts, in a panel discussion about misinformation last year. “If the business model is openness and scale, I will guarantee you everything open will be exploited.”

    The scientific process has been put on public view, with all its uncertainty, disagreement, errors, and misuse.

    “It’s been jarring for people who have been conditioned to think of science as truth and very objective to now see in full display how it changes and how scientists are debating with one another,” says Holly Fernandez Lynch, JD, MBE, assistant professor of medical ethics in the Department of Medical Ethics and Health Policy at PSOM. “It’s difficult for people to make sense of that.”

    Flaws in the process

    Open access and preprints have not caused credibility problems on their own; they’ve exacerbated flaws in how scientific research is encouraged, rewarded, and promoted. 

    The financial and career-advancing incentives for scientific research have long favored the production of novel, significant, and unequivocal results that draw attention in academia and from the public. That stands in contrast to research that, for example, replicates previous studies to confirm or question them, or explores offbeat hypotheses that are less likely to produce eye-popping success stories.

    The incentives have always encouraged some scientists and their institutional communications shops to hype their findings. The explosion of research onto the internet and social media has expanded the opportunities and rewards for self-promotion.

    “Scientists likely feel increased pressure to hype their results because productivity metrics have taken on a greater role in scientific advancement,” Jevin West, an associate professor at the University of Washington Information School, and Carl Bergstrom, an evolutionary biologist at the University of Washington, wrote last year in a research article, “Misinformation in and about science.” “Researchers commonly misstate or overstate the implications of their work … University press offices play a particularly important role in communicating science — but too frequently do so in ways that prioritize web traffic over accuracy.”

    That hurts all of science, says Janet Woodcock, MD, principal deputy commissioner of the Food and Drug Administration. “One of the things that helps undermine public confidence in medical science is overstating epidemiologic studies,” Woodcock says.

    Flawed research also hurts, and scientific publications have always grappled with that danger. Many open access articles are peer-reviewed, but that review process “is far from the guarantee of reliability it is cracked up to be,” science journalist Matt Ridley wrote in a Wall Street Journal article about science lessons from the pandemic.

    In 2020, two prestigious journals, The Lancet and The New England Journal of Medicine, retracted two peer-reviewed papers about a COVID-19 drug because of questions about the integrity of the data.

    Leaders in medical science continue exploring ways to communicate about research more clearly and to dampen the spread of misinformation. The ideas include educating scientists on how to better explain their work to the public, reining in the use of open access and preprints, and reforming how research is funded and reported. (Ways to confront the credibility challenges will be presented in a subsequent article.)

    In an article last year, “Why We Must Rebuild Trust in Science,” Parikh at the AAAS cautioned that “it is not enough to say the public should trust scientists because we know better or because we know more. Trust must be earned.”

    Earning it requires communicating honestly and in ways that are relevant to people’s lives, he wrote: “We must find new and better ways to connect the practice and use of science to inform and shape our communities, our country, and our world.”