
Facebook is finally banning vaccine misinformation

Nearly a year into the pandemic, Facebook now aims to take down misinformation on vaccines overall — not just Covid-19 vaccines.

A large outdoors sign depicts the thumbs-up of a Facebook “like”.
Facebook is expanding its enforcement against vaccine misinformation.
Josh Edelson/AFP/Getty Images
Rebecca Heilweil
Rebecca Heilweil covered emerging technology, artificial intelligence, and the supply chain.

Almost a year into the Covid-19 pandemic, Facebook is taking its strictest stance yet against vaccine misinformation by banning it entirely. The ban won’t just apply to Covid-19 vaccine misinformation. That means, for instance, posts claiming that vaccines cause autism, or that measles can’t kill people, are no longer allowed on Facebook. At the same time, the platform will also encourage Americans to get inoculated, and will direct people to information about when it’s their turn for a Covid-19 vaccine and how to find an available dose.

These moves, part of a broader push by the company, are significant because with nearly 3 billion users, Facebook is one of the most influential social media networks in the world. And as inoculations have begun to roll out around the world, many are concerned that misinformation — including misinformation on Facebook — could exacerbate some people’s refusal or hesitancy to get vaccinated.

In a blog post published on Monday, Facebook explained that these changes are part of what it’s calling the “largest worldwide campaign” to promote authoritative information about Covid-19 vaccinations. The effort is being developed in consultation with health authorities like the World Health Organization, and will include elevating reputable information from organizations like the United Nations and various health ministries. (A list of banned vaccine claims, compiled with the help of health authorities, is available here.) The overall approach seems similar to Facebook’s US voter registration initiative, which the company claims helped sign up several million people to participate in the November election.


“A year ago, Covid-19 was declared a public health emergency and since then, we’ve helped health authorities reach billions of people with accurate information and supported health and economic relief efforts,” wrote Kang-Xing Jin, Facebook’s head of health, on Monday. “But there’s still a long road ahead, and in 2021 we’re focused on supporting health leaders and public officials in their work to vaccinate billions of people against Covid-19.”

A big caveat of the new policy is that just because Facebook says its guidelines about vaccine misinformation are changing doesn’t mean that vaccine misinformation won’t end up on the site anyway. Changing rules and enforcing rules are two different things. Despite Facebook’s earlier rules banning misinformation specifically about Covid-19 vaccines, images suggesting that coronavirus inoculations came with extreme side effects were still able to go viral on the platform, and some racked up tens of thousands of “Likes” before Facebook took them down.

A Facebook spokesperson told Recode the company will enforce its expanded rules as it becomes aware of content that violates them, regardless of whether it’s already been posted or is posted in the future. The spokesperson did not say whether Facebook is increasing its investment in content moderation given its increased scope for vaccine misinformation, but told Recode that expanding its enforcement will require time to train its content moderators and systems.

Still, Monday’s changes are significant because Facebook CEO Mark Zuckerberg, who has repeatedly defended principles of free expression, now says the company will be paying particular attention to pages, groups, and accounts on both Facebook and Instagram (which Facebook owns) that regularly share vaccine misinformation, and may remove them entirely. It’s also adjusting search algorithms to reduce the prominence of anti-vax content.

Like other enforcement actions Facebook has taken — on everything ranging from the right-wing, anti-Semitic QAnon conspiracy theory to incitements of violence posted by Donald Trump — critics say the company’s move comes too late. “This is a classic case of Facebook acting too little, too late,” Fadi Quran, a campaign director at the nonprofit Avaaz who leads its disinformation team, told Recode. “For over a year Facebook has sat at the epicenter of the misinformation crisis that has been making this pandemic worse, so the damage has already been done.” He said that at this point, much more needs to be done to address users who have already seen vaccine misinformation.

Facebook’s announcement comes as major technology platforms wrestle with their role in the Covid-19 crisis. Back in the fall, experts warned that social media platforms were walking a delicate line when it comes to the global vaccine effort: While social networks should promote accurate information about Covid-19 inoculations, they said, platforms must also leave room for people to express honest questions about these relatively new vaccines.

“We have a new virus coupled with a new vaccine coupled with a new way of life — it’s too much newness to people,” Ysabel Gerrard, a digital sociologist at the University of Sheffield, told Recode at the time. “I think the pushback against a Covid-19 vaccine is going to be on a scale we’ve never seen before.”

How well Facebook will enforce its new rules, and how many people the platform will actually help get vaccinated, remains unclear. The changes it announced on Monday come after experts have repeatedly warned about Facebook’s role in promoting anti-vaccine conspiracy theories. For years, researchers have flagged Facebook as a platform where wrong and misleading information about vaccines — including the idea that vaccines can be linked to autism — has proliferated.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
