One of the most viral conspiracy news stories in 2019 accused a coordinated group of physicians, federal organizations, and pharmaceutical companies of hiding the cure for cancer. The article appeared on NaturalNews, a website owned by Mike Adams, who also goes by The Health Ranger.
Adams traffics in conspiracies and supplements, and his sites tout products like colloidal silver mouthwashes (available for dogs, too!) and “China-free” Himalayan salt. Currently the most-viewed stories on NaturalNews include a “bombshell” declaring that Covid-19 is a plan to exterminate humanity and a story claiming world leaders are issuing lockdowns in order to institute forced vaccinations.
“[An article] comes up, and you read it and it looks credible. How does one differentiate the true from the so-called scam article?” says Paula Allen-Meares, PhD, executive director of the Office of Health Literacy at the University of Illinois-Chicago.
This is also known as disinformation — content created to intentionally deceive. And in the past five years, digital platforms have hosted a slew of disinformation, much of it centered on health topics.
“It’s unbelievable the extent to which our world has changed in the past five years. Even some of the words we’re using right now weren’t used five years ago — ‘fake news’ and ‘post-truth.’ Disinformation was something the Russians did during the Cold War,” says Nathan Walter, PhD, assistant professor of communication studies at Northwestern University.
Walter studies mass media, health messages, and correction of misinformation. “This is the big challenge of the future, trying to understand how to stop the spread of misinformation and how to educate people to be more critical consumers.”
Public health threat
During a pandemic, disinformation is an even more serious public health threat. Health disinformation can appear in the form of news stories or sources masquerading as credible experts. The ultimate goal: building an audience that will take actions such as donating money to a cause or making misinformed purchases. Meanwhile, misinformation spreads more innocently from person to person, such as an aunt sharing a false meme that contradicts expert opinion or scientific consensus.
In a 2019 Pew Research Center poll, more Americans identified made-up news as a very big problem for the country than said the same of terrorism, illegal immigration, racism, or sexism. And 68% said disinformation greatly impacts people’s confidence in government institutions.
Disinformation is changing the way people interact with each other, too. Half of the people in the Pew poll said they have avoided talking with someone because they thought that person would talk about made-up news. And half of social media news consumers stopped following someone because they thought the person was posting misinformation.
No matter the form, inaccurate health information is deeply problematic. But disinformation is nothing new — it’s just more easily spread now.
Back in 1921, for example, Listerine began a 50-year ad campaign falsely claiming that people could cure colds by using the company’s mouthwash. The Federal Trade Commission eventually issued corrective advertising.
Now, though, health disinformation goes beyond advertising. It rapidly spreads online thanks in large part to social media, and people are incredibly vulnerable to it due to a mix of psychological and societal factors.
Consider the destructive claims that vaccines cause autism. That link sprouted from a since-retracted 1998 study published in The Lancet. Not only did the larger scientific community prove the theory wrong and the scientists’ process flawed, but the paper’s lead author (who stood to gain notoriety and acceptance) ended up banned from practicing medicine.
Still, two decades later, politicians have run on anti-vaccination platforms. And the U.S. has declared multiple measles outbreak emergencies, as anti-vaxxers have rejected vaccines only to further spread the disease.
What makes us vulnerable?
Part of the problem comes down to the abundance of information available daily, including the 24/7 news cycle and ubiquitous social media. In the span of human history, “This tech and media world is new to us. We’re not used to processing so much information,” Walter says.
Social networks now act as arbiters of truth, and on a social media newsfeed, fact and fiction can look identical. The problem, then, is not only the amount of information but how that information is packaged.
“We as humans are really bad at processing numbers, and health information tends to be difficult. The real information, scientific consensus, is all about relative risk and probabilities,” Walter says. “Misinformation, on the other hand, tends to be easy.”
Packaged as compelling stories, disinformation gives people something to connect with. “It’s much more interesting to read a [false] story about how Covid-19 was developed than to read the same seven tips about washing your hands from the [Centers for Disease Control and Prevention]. It also makes you feel special, like you know something others don’t,” Walter says.
Humans are wired to learn through stories. Yet, sometimes events that happen don’t have a simple cause-and-effect storyline. “We’re now in the midst of this global pandemic that affects the lives of billions of people. We don’t really know how it started,” Walter says. However, many people feel that “such a large, monumental event should have a large, monumental Act I.”
This desire for story gave rise to conspiracies about China developing Covid-19 as a biological weapon. “The alternative is that we live in a reality where not all the dots are connected, and that’s difficult,” Walter says.
Health disinformation typically spreads furthest during moments of crisis and vulnerability.
The Center for American Progress, a nonpartisan policy institute, issued a report in August looking at how social media platforms have handled disinformation in light of Covid-19.
According to the report, “Conspiracy theories have generally thrived in crisis, but the modern social media environment and the sudden forced movement of attention online during stay-at-home orders have been a gift to malicious actors. Leveraging prevailing uncertainty, a demand for information, and an audience stuck online, these groups have effectively deployed disinformation strategies to pervert perceptions of public opinion and warp public discourse for their own gain.”
Lacking health literacy
While psychological factors leave people unguarded against misinformation, people in the U.S. are particularly vulnerable to health misinformation because the literacy rate in the country is low. Roughly 45 million American adults (out of about 200 million) cannot read above a fifth-grade level, according to the Literacy Project Foundation. Yet, health information materials tend to be written and communicated at 10th to 12th grade reading levels, Allen-Meares says.
“If you have healthcare information that exceeds reading levels, there’s a disconnect. There could be incongruences around managing chronic illness, where to go for help,” Allen-Meares says. “Low health literacy can lead to poor health outcomes for patients, and it is financially costly for the individual and the healthcare enterprise.”
People struggle with scientific understanding, too. The lack of understanding creates a vacuum that seemingly anybody with a phone camera and internet connection can fill. This leaves the public questioning who they can trust and what qualifies someone as an expert.
In a daily Covid-19 press briefing in late October, Ngozi Ezike, MD, director of the Illinois Department of Public Health, broke down in tears as she reported the state’s latest statistics. “I’m desperate to find the message that will work,” she said. “I’m looking for someone to tell me what the message is so that we can do what it takes to turn this around.”
Eight months into the pandemic, consistent Covid-19 health messaging from federal officials has basically disappeared. “Right now, it’s just silence. But you won’t hear silence from the people pumping misinformation,” Walter says. “It used to be the case that when charlatans wanted to deceive people, they had to pretend to be experts.”
That’s why public messaging isn’t everything. “We also need more community-based health educators who are knowledgeable about health literacy and can help convey accurate health information,” Allen-Meares says.
Inoculating against disinformation
No matter what steps social media platforms or policymakers take to address health disinformation, the problem will evolve right alongside the solutions. Yet, it’s possible to take action to limit the spread of disinformation. Fully 79% of Americans support restrictions on made-up news, according to the Pew survey.
Most importantly, policies need to hold companies accountable for either creating or enabling the spread of health disinformation, Walter says. “If there was no policy change associated with warning labels on cigarettes and who can sell them to whom, we’d still have 40% of the population smoking regular cigarettes,” Walter says.
Policies can focus on education, too. So far, 18 states have introduced laws aimed at bolstering media literacy in public schools, which seems like a key step after a 2018 Stanford report showed that most high school students couldn’t tell the difference between real and fake news stories.
“Ideally, media literacy and health literacy will just become part of the core components of what we teach kids in school, just like we decided as a society that reading, writing, and arithmetic are part of it,” Walter says.
Tech companies need to do their part, as well. For example, researchers have shown that YouTube’s video autoplay algorithm shows users increasingly extreme content in order to keep them engaged. If policies instead held platforms accountable for playing factual videos on crucial health topics, such as Covid-19, this could slow the spread of disinformation.
Social media platforms could also take steps such as warning followers of specific accounts if that account has repeatedly spread disinformation. They could place warning labels on accounts that falsely present as journalistic outlets. And they could share more data with researchers and regulators.
Because the spread of health disinformation is such a threat to public health, the World Health Organization organized the first infodemiology conference, held in June and July 2020. And the United Nations has launched the anti-misinformation initiative Verified.
You can do your part by fact-checking the health information you see on your social networks and learning to differentiate between misinformation and trusted sources. 2020 is the year of Covid-19 — and there’s too much at stake to let false claims go unchecked.
An award-winning journalist, Katie has written for Chicago Health since 2016 and currently serves as Editor-in-Chief.