If you search “vaccination” on Pinterest this week, you’ll get a blank page. The social media site has temporarily banned related search terms in an attempt to stem the tide of misinformation fueling the anti-vaccine movement, now linked to a spate of recent and ongoing measles outbreaks around the world.
With an increasing number of people seeking health-care advice on the Internet, social media sites like Pinterest have become a destination for vaccine information—not all of it reliable. A network where users, largely women, share images by “pinning” (posting) them to themed boards, Pinterest is not known for health advice, says Jeanine Guidry, a social media and health communication researcher and assistant professor at Virginia Commonwealth University. But as soon as Guidry started to study the network, she noticed rampant misinformation about vaccination, as well as more productive discussions of mental-health issues that others had dismissed. “Most of the time, people were like, ‘Pinterest is for recipes,'” she says. Her 2015 study was one of the first to push back on this notion: Before the company addressed the issue, she found that an estimated 75 percent of vaccine-related pins were negative.
Now, the company has cut off the propaganda at its source: the search bar. Pinterest started by creating what it calls a “blacklist” of “polluted” search terms, the Guardian reports. “It’s better not to serve those results than to lead people down what is like a recommendation rabbit hole,” Ifeoma Ozoma, Pinterest’s public policy and social impact manager, told the Wall Street Journal last week.
Pinterest’s decision comes after California Representative Adam Schiff pressured Facebook and Google to take similar action. These companies have been slow to respond to misinformation on their platforms, but, legally, they can block (or not block) whatever they choose. Compared to Facebook and YouTube, which have focused on removing harmful content, Pinterest’s blanket search ban has been described as a more extreme approach. How far will these strategies go in slowing a public-health crisis?
People have already pointed out some holes in Pinterest’s strategy: The Guardian, for one, found alternate search terms not covered by the ban. On Reddit, the plan has drawn comparisons to Tumblr’s ban on adult content, which did not go smoothly. Others worry about unintended consequences, such as stifling factual information. While there is some evidence that frequent Twitter and Facebook users are more likely to be vaccinated, it’s also true that the majority of vaccine-related posts across platforms are not accurate or science-based.
Social networks can be a force for public health: Some research has found that people with less education who seek health-care advice online experience the greatest benefit. But for a growing number of hesitant parents, misinformation is out there, thanks to a small but vocal minority.
Who are these anti-vaccine users? As on other increasingly polarized social media platforms, they are likely a self-selecting group. It’s well documented that people with an existing mistrust of vaccines experience confirmation bias. They’re also mainly women—not surprising, since mothers have been a vocal pillar of both the anti-vaccine and anti-GMO (genetically modified organism) movements. “We find that present-day discourses [center] around moral outrage and structural oppression by institutional government and the media, suggesting a strong logic of ‘conspiracy-style’ beliefs and thinking,” researchers wrote in a 2017 paper on the anti-vaccine movement.
On Reddit, users concerned about the spread of anti-vaccine information have noted that the naysayers are generally confined to a few niche subreddits (forums), such as r/conspiracy. But on Facebook and Pinterest, these posts reach hundreds of thousands of people. On this point, Facebook’s new approach might help: In a statement last week, the company said it would consider removing anti-vaccine content from search results and recommendations, including “Groups You Should Join.” But these reforms won’t address everything: Facebook has also allowed advertisers to promote this content to 900,000 users. And similar interventions have helped before: one study found that, after Facebook began cracking down on fake news, sharing of fake-news articles on the platform dropped by 75 percent relative to Twitter.
Some critics fear that banning these users will simply amplify their voices elsewhere. For example, one user who said they were banned from Pinterest for posting anti-vaccine content simply shifted to Reddit instead. But in general, research shows that “deplatforming” (kicking harmful users or groups off a platform) works, resulting in less hate speech, according to Motherboard.
Still, it’s not enough to simply remove this content. Without science-based information to take its place, these platforms leave a “void for conspiracy theorists and fraudsters,” according to the Guardian. In response, researchers are calling for more public-health communication.
Guidry says part of the problem is that the negative posts still far outnumber content from public-health organizations: The experts need to pin too. “I’m the first to say Pinterest is not the main platform for health conversations,” she says. “But the problem is, if we don’t have a lot of health communication input, it becomes even more of an echo chamber.”
But under the current approach, which blocks all vaccine content—good or bad—they won’t be able to. This is one downside of the ban or, as Guidry calls it, a “side effect”: Her 2015 study found that pro-vaccine posts actually got more engagement, though engagement doesn’t necessarily translate into a long-term change in outlook. People are looking for the truth. It’s just not as prevalent.