Viewpoint: How to constructively engage on social media with those who post anti-GMO and anti-vax conspiracies

The other day, a tantalizing question appeared in my Twitter feed: “If you were writing a book about 2020, what would your first sentence be?” asked Mitch Weiss, a Pulitzer Prize-winning investigative journalist with the Associated Press. I laughed out loud at the response from Carl Bergstrom, an evolutionary biologist at the University of Washington, who has been on the front lines of the Covid-19 epidemiology dramas online and in the media.

The world is awash with bullshit, and we’re drowning in it.

This is the actual first line of Bergstrom’s 2020 book co-authored with Jevin West, Calling Bullshit: The Art of Skepticism in a Data-Driven World. There probably couldn’t be a more apt assessment of 2020 (I think I feel another book review coming on). The book appears to be a field guide to help the public spot the misuse of data and guard against it in the future. This comes after Bergstrom and colleagues developed a popular course on this topic, which tackles the crucial problem of educating people and arming them with the critical thinking skills that seem to be in such short supply right now.

Of course, education is the foundation for better public policy and personal decisions, and we know that teachers around the world are crucial players in the science communication arena. But we can’t wait for the future education pipeline to deal with the existing misinformation tsunami that washes over us.

The current pandemic has exposed the fact that many of our neighbors are long out of school, and unlikely to become skilled critical thinkers in time to address our present problems. Everyone’s Facebook and Twitter feeds are bombarded by waves of anti-GMO or anti-vaccine nonsense, and sometimes a combination of both. So what can be done about this in real time? Research in science communication has good news and bad news on this. The good news is that there are some strategies that might work. The bad news is that pro-science organizations might not be utilizing them to their full potential.

As a biologist who spends countless hours in the science communication trenches, I want to examine what several recent studies on this topic show, and provide some guidance to anyone who wants to help push back against the onslaught of Covid-fueled nonsense the world is experiencing.

First, the bad news 

Despite years of science communication research, it appears that major pro-science groups aren’t succeeding on Facebook. In May, a team of researchers published an analysis of the network interactions among pro-vaccination, neutral, and anti-vaccination groups. In a paper titled The online competition between pro- and anti-vaccination views, they explored the connections among discussants during a measles outbreak. Based on their results, it seems that people with anti-vaccine views are able to integrate themselves into more of the undecided and neutral communities, even though pro-vaccine pages tend to have more members. Ryan Butner, a data scientist with Pacific Northwest National Laboratory, summarized the work this way:

Because I’m not on Facebook, and the researchers are not allowed to reveal exactly who the pro-vaccine pages represent, it’s difficult to understand what their unsuccessful strategies look like. It’s possible that the science communication teams are not permitted to go beyond their narrow remit and venture into groups like the “school parent association” that anti-vaxxers in the study joined, according to the researchers.

It’s likely that some of these pro-science organizations are constrained by factors that are not issues to the conspiracy theorists—such as reliance on facts and evidence. These are not as compelling as some of the fictions produced by anti-science influencers. As neurologist and popular science writer Steven Novella once wrote about institutional systems in The Misinformation Wars, “They are like the British fighting in neat rows with their visible red uniforms, while the rebels fire at them concealed behind trees and stone walls.”

It may also be that pro-science groups are mischaracterizing the opposition. While some individuals are merely misinformed, a powerful and well-funded marketing frigate sits in the harbor launching cannonballs of nonsense into the popular consciousness to monetize donations, memberships, books, and videos. Butner and several colleagues made this point in a recent study: Monetizing disinformation in the attention economy: The case of genetically modified organisms (GMOs):

This means those influencers don’t operate like opponents in a debate club; they operate like businesses vying for market share. They are more willing than institutional voices to experiment and to take risky plays with higher potential payoffs.

Unfortunately, as on other fronts, it seems we shouldn’t expect institutional science voices to save us. And this prompts an important question: what can a pro-science individual do to stem the tide of disinformation, at least in their own sphere?

The good news: “observational correction” as science communication

Just before everything went belly-up due to SARS-CoV-2, I attended possibly the last in-person conference for a while, MisinfoCon at the National Academy of Sciences. It brought together many of the experts who are exploring ways to stem the spread of disinformation and misinformation in academia, the mainstream press and social media arenas.

Of particular interest to us is the research examining what works on the interpersonal scale. Briefly summarized, the evidence suggests that people will sometimes reject misinformation when they see others corrected for trying to spread that same misinformation. Social scientists Leticia Bode and Emily Vraga call this phenomenon “observational correction.”

Bode gave one especially hopeful talk, presenting her work with Vraga and other colleagues titled, Wrong Again: Correction of Health Misinformation in Social Media. The researchers tested corrections on multiple platforms including Facebook, Twitter, Instagram, and YouTube for misinformation on topics such as GMOs, raw milk, vaccines, the origins of a virus, and more. Some of the studies involved college students and may not reflect the broader population, while others were done with more diverse survey respondents. But now several threads of evidence suggest that correction can work on “lurkers,” or people who witness the correction, even if the original poster is unmoved.

In another study, See Something, Say Something: Correction of Global Health Misinformation on Social Media, Bode and Vraga examined the impact of observational correction in relation to the Zika virus crisis. The researchers created simulated posts of misinformation about the origin of the virus and various kinds of replies with corrected content. They again found that observers who saw corrected information from quality sources demonstrated a reduced belief in misinformation compared to the control group.

There has been a great deal of concern about the so-called “backfire effect” in science communication—that is, someone will cling more tightly to misbeliefs when challenged with facts they perceive as a threat to their worldview. Although this effect seems to be less common than originally suggested, observational correction end-runs the problem. People, it seems, feel less threatened when they see someone else corrected, even if they both share the same viewpoint. That might lower barriers to the absorption of new information.

Other benefits of observational correction include immediacy and scalability. Seeing correct and credible information keeps the misinformation from taking root. Further, more people can see the corrected information beyond just the original poster. Social corrections can work on multiple issues, and you can use your personal style to bring that information to your local connections in ways that institutional science communicators cannot.

The new information could be provided either by a Facebook algorithm or an individual responding to the topic. This also creates a useful role for institutional communications folks—who can provide quality resources that algorithms or individuals can use for these corrections, even if they can’t dive into the anti-science discussions themselves.

Bonus feature: if one correction is made, a second correction with related facts can also provide support and affect the reader’s perceptions. For the data, see the study In Related News, That Was Wrong: The Correction of Misinformation Through Related Stories Functionality in Social Media. So when you see a friend or ally correct an item, team up and bring a related point into the discussion to reduce misperceptions on the topic.

Not all social media sites can facilitate algorithmic corrections as Facebook can, and this strategy may face other hurdles and constraints. The same tools may not work at all on Twitter or Reddit, for example. But the fact that algorithms can work when implemented properly may offer some hope for addressing misinformation on a massive scale, and therefore might be worth exploring on other platforms.

Can’t we be (un)civil?

Bode and her colleagues have also found that individuals can combat misinformation even if their preferred style may be deemed “uncivil” by some. With their most recent study, Do the right thing: tone may not affect correction of misinformation on social media, the team found that empathetic, neutral, or uncivil replies about erroneous claims all worked. As Bode summarized on Twitter, “Our biggest takeaway—as long as you provide credible information to rebut misinformation, other people viewing that correction will be convinced, whether you’re extra sympathetic, rude, or somewhere in between. So correct in whatever way feels most comfortable to you.”

In short, spending time on social media corrections is worthwhile. Friends and allies can team up for additional impact. This gives me hope that joining organized efforts of allies, like the new Stronger.Org campaign to challenge vaccine disinfo, could yield dividends at this crucial time. Also: you do you, your way. You can reach into community spaces that larger pro-science groups can’t. This won’t stop the tone police from coming after you, but now you’ll have evidence that tone police can put a cork in it.

The life ring we need to keep science literacy afloat right now may be in our hands. Or at least at our fingertips.

Mary Mangan holds a PhD in cell, molecular, and developmental biology from the University of Rochester. She co-founded OpenHelix, a company that provides awareness and training on open source genomics software tools. Follow her on Twitter @mem_somerville
