Public trust in science is declining in the U.S. As people grow increasingly leery of scientists and scientific institutions, belief in mis- and disinformation is on the rise. Some science ‘skeptics’ are pushing back on pasteurization, eschewing vaccines, and buying into speculative, or even dangerous, supplements and diet trends. You might have friends and family members who’ve fallen prey to pseudoscientific influencers or who’ve begun to parrot common but debunked ideas about health.
Experts in misinformation and psychology say that a complex set of factors explains the worrying trend, but that there are ways to combat the spread of false ‘facts.’ Understanding the root of people’s belief in unfounded ideas and deploying certain communication strategies can help. Here’s what you can do.
Why do people believe scientific misinformation?
In science, simple answers are rare, and information is constantly evolving as new data and research emerge. Because of that, identifying the most accurate guidance amid a sea of potential truths is difficult, says Lucy Butler, a psychology researcher who studies misinformation at Northeastern University. In contrast, mis- and disinformation (inaccurate info and intentionally misleading content, respectively) often offer straightforward explanations that “seem plausible,” she says. “Solutions that appear simple and logical are often quite appealing,” Butler adds.
[ Related: Is raw milk safe? Science has a clear answer. ]
Some common logical flaws fuel the fire. “People often fall for this naturalistic fallacy, and assume that things that are natural are better,” she says. Hence the paleo diet fad and the growing popularity of raw milk, despite no evidence that either carries significant benefits, and well-documented evidence of their risks. Yet lots of natural things are deadly, and many human-made solutions save lives. Decades removed from the conditions that prompted public health measures like widespread pasteurization and vaccination campaigns, cultural memory of those dangers fades. This “cognitive distance” can make it hard to appreciate the value of scientific advances in the present, adds Butler.
Then, there’s our atomized information ecosystem. In recent years, there’s been a migration away from mainstream, centralized media, which, though imperfect, is subject to an editorial process, and toward platforms like TikTok, YouTube, and Facebook. In the decentralized information environment facilitated by the internet and social media, “anybody can produce content about anything, regardless of their expertise, regardless of the factual nature of that information,” Butler notes. Low-quality information has proliferated, much of it peddled by bloggers and influencers who project likable personas online. Followers trust the faces and stories they see, often falling into parasocial relationships. Influencers “build these networks, and they’re completely separated from any sort of fact checking process,” she says.
Sometimes, the people parroting incorrect information aren’t aware they’re spreading falsehoods, but others craft disinformation with the intent of selling a product, says Gale Sinatra, a professor of psychology and education at the University of Southern California and author of the book Science Denial: Why it Happens and What to Do About It. As an example, she points to those pushing ivermectin as a treatment for Covid-19.
[ Related: Why do we have fevers? It’s more complicated than ‘heat kills bugs.’ ]
“In this wellness industry, millions of dollars in supplements and other things are being sold by many of these influencers,” Sinatra says. In 2023, U.S. influencers carried a $21.1 billion market value, according to analysis from Statista. Through social media algorithms and tech companies’ influencer compensation schemes based on follower counts and engagement time, your attention alone can be enough for spreaders of health and science misinformation to profit.
Though money fuels a subset of influencers, others end up sharing mis- and disinformation for free, as they come to believe it and spread it through their immediate social networks. Many find themselves down the rabbit hole of science and health pseudoscience because they’re trying to resolve a problem or question they’re personally facing, and may feel let down by mainstream medicine or experts, says Butler. Sinatra agrees. Frequently, personal health problems or negative experiences with the healthcare system prompt people to seek alternatives and predispose them to taking in misinformation, she says. Unfortunately, amid the information overload, most are ill-equipped to discern fact from fiction, Butler says.
“We don’t teach science in a way that makes how science works accessible,” notes Sinatra. People tend to learn science as a set body of knowledge, or a list of facts. But the truth is that our understanding continually advances and changes. “Science is a process of finding out information and adjudicating evidence,” she adds. “The strength of science is that it shifts with new evidence. But if you understand it as a list of facts, then you’re like ‘hey, wait a minute. They were wrong.’”
“When people [lose trust] in one facet of science, they will generalize that to other areas,” says Butler. And thus, recent events like the rapidly shifting policies and recommendations over masking during the Covid pandemic have spurred distrust of medical professionals and the scientific endeavor writ large.
The mainstream media environment doesn’t always help. Cutting edge science and health findings or emerging research are generally nuanced, and most coverage doesn’t or can’t tell the full story within the format, says Sinatra. “Nuance is really hard to convey to the general public,” she notes. Simplifying research and excluding complicating information to avoid confusion can leave gaps that are easily filled by false facts. Most health and science findings involve statistics and risk assessment which “we do struggle as human beings to understand,” Sinatra adds.
The mathematical probabilities underlying something like a public health decision by the CDC are not intuitive. What is intuitive, in contrast, are stories, she says. People’s decisions and beliefs are often unduly influenced by narratives and individual anecdotes, even when those cases are the exception. You may hear about one person experiencing a very rare vaccine side effect, and erroneously decide that the risk of that shot is greater than the risk of the disease it defends against.
What are the best ways to talk about science misinfo?
It can be frustrating to watch misunderstanding spread. And it’s woefully easy to dismiss or disparage anyone who expresses skepticism or concerns about scientific ideas as unhinged or unintelligent, says Sinatra. But that would be inaccurate and ineffective. It’s important to remember that anyone can fall for a falsehood, and that people generally come to their beliefs with good intentions.
“People are, almost all of the time, trying to do what’s best for themselves and their families,” says Butler. Health ends up at the center of so many conspiracies and disinformation campaigns because people care deeply about their own well-being and that of their loved ones. And in some ways, it can feel like we’re collectively losing control over our health. Rising rates of chronic diseases like hypertension and diabetes, cancer among young people, and even depression: all of these real trends can spur desperation for answers and action that leads, again, to the overly simplistic answers offered by digital wellness brands and grifters. “There’s nothing psychologically weird about wanting to be well and healthy,” says Sinatra.
So if you find yourself in a conversation with a family member who seems misinformed about a hot-button health topic, one of the most critical aspects of bridging the divide is approaching them with empathy, respect, and understanding, and without judgment, say both Sinatra and Butler.
[ Related: What science actually says about seed oils ]
Sharing personal experience and narrative, if applicable, can potentially help build trust and connection, via the same human tendency to favor familiar faces and stories that influencers exploit. But well-sourced, reliable information remains more critical, they each note.
“We’ve got pretty consistent evidence now that corrections work, especially for people who aren’t so staunch in their beliefs,” says Butler. “You might not necessarily convince everybody… but clear, accessible evidence [can] often really shift peoples’ belief,” she adds. There used to be concern within the field of misinformation psychology that corrections backfired, making people defensive and doubly assured in their unfounded beliefs, yet newer assessments conducted by Butler and others have found that concern doesn’t bear out, she explains.
A correction has to be understandable, it has to be well-explained, it should account for nuance, and it has to be delivered with respect, says Sinatra. But under these conditions, accurate information can prevail, particularly when delivered by a close social connection like a friend or family member.
If you don’t know the answer to a question, don’t pretend to. In a heated discussion, it can be tempting to exaggerate risk or fabricate examples, but creating new falsehoods will only lead to more distrust. Instead, make sure you have a good strategy for evaluating information sources, and arm yourself with a few key facts. “We all need to become more digitally literate and learn how to assess information online,” says Sinatra. A critical first step to combating belief in misinformation is for all of us to “become better fact-checkers.”