

A fake viral video makes Nancy Pelosi look drunk. Facebook won’t take it down.

Turns out on Facebook, just because something isn’t real doesn’t mean it’s against the rules.

Speaker Nancy Pelosi Holds Her Weekly Press Conference
A doctored video of House Speaker Nancy Pelosi has been making its way around the internet, and there’s no way to stop it.
Mark Wilson/Getty Images
Emily Stewart
Emily Stewart covered business and economics for Vox and wrote the newsletter The Big Squeeze, examining the ways ordinary people are being squeezed under capitalism. Before joining Vox, she worked for TheStreet.

A doctored video of House Speaker Nancy Pelosi that makes it seem as though she’s drunkenly slurring her words at a public event has spread across the internet in recent days, but Facebook doesn’t plan to take it down. It’s the latest example of the ease with which misinformation can spread online.

But it’s also about more than that — misinformation propagates so easily because social media platforms such as Facebook and Twitter are still figuring out their stances on what counts as prohibited content. (Facebook, for example, says just because something isn’t true doesn’t mean it’s against the rules.)

On Wednesday, Pelosi, who has been publicly quarreling with President Donald Trump this week after a failed infrastructure meeting at the White House, spoke at an event hosted by the liberal group the Center for American Progress. Soon after, an altered video of her speech, slowed to about 75 percent of its original speed to introduce “significant distortion,” popped up online — and it took off. Drew Harwell at the Washington Post flagged the video, one version of which, posted by the right-leaning page Politics WatchDog, has racked up more than 2 million views and counting.
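For a sense of scale, the effect of that kind of slowdown is easy to sketch. The snippet below is a hypothetical illustration (the durations are made-up example values, not measurements of the actual clip):

```python
# Illustrative sketch: what playing a clip at ~75 percent of normal
# speed does to its running time. Example values are assumptions.

def slowed_duration(duration_s: float, speed: float) -> float:
    """A clip played back at `speed` (e.g. 0.75) takes 1/speed as long."""
    return duration_s / speed

# A 60-second clip stretches to 80 seconds at 75 percent speed.
print(slowed_duration(60.0, 0.75))
```

Naively slowing the audio by the same factor would also lower its pitch by that factor, which is why clips manipulated this way are typically pitch-corrected so the voice still sounds natural.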

The video continues to be widely available online. YouTube took it down, but Facebook has left it up, even though it has acknowledged that the video is fake. Trump lawyer and former New York City Mayor Rudy Giuliani tweeted and then deleted a link to the same video posted by a page called AllNews 24/7 on Thursday. The same day, Trump tweeted out a different video of Pelosi that had also been edited and cut to highlight a halting speech pattern.

Despite deleting his original tweet, Giuliani has continued to attack Pelosi on Twitter.

Facebook’s third-party fact checkers reviewed the video on Thursday evening and rated it as false, and the company is now reducing its distribution and showing additional context alongside it in news feeds, in the form of a related article. But Facebook’s community standards don’t require that information posted on the platform be true. A spokesperson for Facebook declined to comment on the record for this story.

The administrator of the Politics WatchDog page where a version of the video can be found told the Guardian that it’s a “free country” and that the video will stay up.

Social media companies are still struggling to slow the spread of misinformation

Social media platforms including Facebook, Twitter, and YouTube have a track record of mishandling the policing of fake news and misinformation on their platforms.

Misinformation on social media was widespread during the 2016 election, and while platforms have taken steps to improve how they police this content, they’re far from perfect. Facebook, for example, has brought on third-party fact-checkers to evaluate and rate content, but as its stance on this Pelosi video shows, just because something is flagged as false doesn’t mean Facebook will remove it.

After terrorist bombings in Sri Lanka in April, the country’s government took matters into its own hands, temporarily shutting down Facebook and WhatsApp out of concern they could be used to spread misinformation and incite more violence. Sri Lanka’s move came after Facebook failed in 2018 to prevent people in Myanmar from using its platform to incite hatred and genocide — a failure Facebook acknowledged only after harsh scrutiny and pressure.

Facebook has also waffled on how to handle a host of other content on its platform, including anti-vaccination hoaxes and misinformation. Just this month, it finally banned the far-right conspiracy theorist Alex Jones and multiple other extremist figures, long after Twitter and YouTube had done so. And, of course, it fell far short in controlling misinformation and fake news as part of Russia’s political interference campaign in the 2016 election. It continues to take down fake ads tied to Russia.

In this latest case, the right has weaponized social media to spread the president’s message about his latest battle with Pelosi to his supporters.

And research suggests that the types of people who might be more inclined to watch that altered Pelosi video could also be the ones likelier to believe it. A recent study from researchers at Princeton and New York University found that conservatives and people over the age of 65 were disproportionately likely to share articles from fake news domains during the last presidential election. Eighteen percent of Republicans shared fake news, compared to under 4 percent of Democrats. Regardless of ideology, Facebook users 65 and older shared almost seven times as many fake news articles as younger users.

This should make you more nervous about deepfakes

The doctored video of Pelosi was edited in a pretty rudimentary way. But it also signals how dangerous a more sophisticated type of altered video — so-called “deepfakes” — could be.

Deepfakes are videos and images that are altered using artificial intelligence and algorithmic simulations. They can be almost indistinguishable from real life. You might remember last April when director Jordan Peele and BuzzFeed released a deepfake video purporting to be a public service announcement from former President Barack Obama.

As Vox’s Aja Romano has explained, the technology has become common in pornography, where realistic-looking videos swap someone else’s head onto a porn actor’s body. But it’s also showing up elsewhere, including in politics. The doctored Pelosi video isn’t even a good fake, and it’s still tricking people.

“It’s going to get worse, too, because deepfake videos, in which leaders can be made to convincingly appear to say anything you want, are likely to be entering the political fray next year,” Brian Klaas, a professor of global politics at University College London, told Vice News.

Combine the rise of deepfake videos with social media companies’ unwillingness and inability to effectively moderate misleading content, and the Pelosi video is likely just one of many more altered videos we’ll see as we head into the 2020 presidential election.

