
Facebook’s tremendous size was its greatest asset. Now it may be its biggest problem.

The viral video of a shooting in New Zealand offered a grim reminder of tech companies’ vast reach.

A tribute to those killed in Christchurch, New Zealand.
Carl Court / Getty Images

Facebook COO Sheryl Sandberg has a phrase she likes to use on company earnings calls as a way to pitch Facebook’s business: Advertisers, Sandberg says, can reach a Super Bowl-size audience any day of the year.

For years, Facebook’s size and scale have been considered a real positive. Having more than 2 billion monthly users is great for business. Adding tens of millions of people to your app every quarter is a tremendous story to tell investors, advertisers, and media companies.

But when a gunman opened fire at a New Zealand mosque late last week, broadcasting video of the shooting live on Facebook for anyone to see, the platform’s enormous size became a serious liability. Suddenly that Super Bowl-size audience had access to something Facebook didn’t want them to see, and the company couldn’t remove copies of the video as fast as users uploaded them — more than 1 million in the next 24 hours.

The New Zealand shooting, which left at least 50 people dead, served as a horrendous reminder that Facebook’s scale — and YouTube’s and Twitter’s — is a serious problem. Facebook said its technology was able to detect and block 80 percent of the videos people uploaded of the shooting.

The problem is the other 20 percent. In just 24 hours, Facebook and Instagram users tried to upload video of the shooting 1.5 million times — which means some 300,000 copies got through the net.

At YouTube, the situation wasn’t any better. In a statement, a Google spokesperson said the video uploads were “unprecedented both in scale and speed, at times as fast as a new upload every second.” YouTube’s head of product told the Washington Post, “Every time a tragedy like this happens we learn something new, and in this case it was the unprecedented volume [of uploads].”

While the original video reached only around 4,000 people, according to Facebook, it was online long enough for copies to start spreading to other internet forums, like the message board 8chan. From there, people started uploading versions of the shooting video back to Facebook and YouTube, and the tech platforms simply couldn’t keep up with the speed or volume.

Facebook has technology that can match video content to an original for quicker automatic removal. But the technology isn’t foolproof, as we learned this past week. Facebook had trouble removing all videos because people uploaded different versions of the original — for example, videos of the original recorded off a separate screen, or watermarked — that didn’t fully match.
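Matching systems like the one described above typically rely on perceptual hashing: each frame is reduced to a compact fingerprint that stays identical for exact copies but drifts when the pixels change. The toy sketch below — the frames, function names, and numbers are invented for illustration, not Facebook’s actual system — shows why a watermark can defeat a hash match:

```python
# Sketch of perceptual-hash matching, the general technique behind
# "match this upload against a known-bad original" systems.

def average_hash(frame):
    """Hash a tiny grayscale frame: one bit per pixel, set when the
    pixel is brighter than the frame's overall mean brightness."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Number of bits that differ between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A 3x4 "frame" from the known-bad original: dark left half, bright right.
original = [
    [ 10,  20, 200, 210],
    [ 15,  25, 205, 215],
    [ 12,  22, 202, 212],
]

# An exact re-upload: identical pixels, so an identical hash.
exact_copy = [row[:] for row in original]

# A watermarked copy: a bright logo pasted over the dark corner
# flips those pixels' relationship to the frame's mean.
watermarked = [
    [250, 250, 200, 210],
    [250, 250, 205, 215],
    [ 12,  22, 202, 212],
]

h_orig = average_hash(original)
print(hamming(h_orig, average_hash(exact_copy)))    # 0 -> match, removed
print(hamming(h_orig, average_hash(watermarked)))   # 4 -> no longer matches
```

A real system compares that distance against a threshold; a watermarked or re-recorded copy pushes the distance past it, so the upload slips through until someone hashes that new variant and adds it to the blocklist — which is why moderation lagged behind uploaders remixing the video.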

In YouTube’s case, the company was so overwhelmed with uploads it eliminated human review for any videos flagged by its algorithms that had to do with the shooting, knowing it might take down legitimate or unrelated videos by mistake. The rules intended to provide a more thoughtful review process had to be tossed out the window.

For the past year, we’ve been talking about Facebook’s and Google’s tremendous size for other reasons: There are some, like Sen. Elizabeth Warren, who believe these companies are too dominant and should be broken up.

But their scale is not just a business problem — it can be a societal problem, too. Facebook and Google are not responsible for what the New Zealand gunman did last Friday, but it’s also not fair to ignore the service they provide to bad actors: a free distribution mechanism for hatred and terror.

Unfortunately, this feels like a problem without a solution. Facebook and YouTube and Twitter clearly don’t yet have the technology to instantly clear their services of bad or troubling content. Even if they did, there is no way to stop content altogether without a system that vets posts before they go up — an idea that has been floated in India but is not likely to catch on here in the United States.

Instead, Facebook and YouTube and Twitter are necessarily reactive. And yes, they’ll learn to react quicker and better as time goes on and technologies improve, but it will always be a reaction.

The problem may soon get even tougher. Facebook CEO Mark Zuckerberg recently unveiled Facebook’s plan to shift toward private, encrypted messaging. If more content is shared privately instead of in a public, algorithm-fueled feed, it’s possible that videos like the one from New Zealand won’t find the kind of oxygen they need to go viral on Facebook in the future.

But encrypting content also means it will be harder for Facebook to find and remove videos like the one from the New Zealand shooter with any level of success. Someday, videos like these may not appear in your Facebook feed — they may appear in your private messaging inbox instead.

That’s not exactly the Super Bowl any of us had in mind.

This article originally appeared on Recode.net.
