
Mark Zuckerberg wants you — and your government — to help him run Facebook

Remember when Silicon Valley’s giants scoffed at regulation? Now they see it as a protective shield.

Facebook CEO Mark Zuckerberg.
Drew Angerer / Getty Images
Peter Kafka
Peter Kafka covered media and technology, and their intersection, at Vox. Many of his stories can be found in his Kafka on Media newsletter, and he also hosts the Recode Media podcast.

Mark Zuckerberg built one of the most powerful companies in the world. Now he says he needs help running it.

In a Washington Post op-ed, the Facebook CEO is calling on “governments and regulators” around the world to help rein in the internet, and his own company.

“By updating the rules for the Internet, we can preserve what’s best about it — the freedom for people to express themselves and for entrepreneurs to build new things — while also protecting society from broader harms,” Zuckerberg writes.

Zuckerberg goes on to ask for new regulation addressing four topics: “harmful content, election integrity, privacy, and data portability.” But the bigger point is that he’s asking for regulation at all: For years, Silicon Valley’s tech leaders assumed that governments and regulators were anachronistic speed bumps to be avoided.

What’s changed, of course, is that governments and regulators around the world are now intent on creating new rules around the internet (or, at least, saying that they’re intent on doing so).

And Facebook would rather get out in front of it by suggesting the kinds of rules it would like to see implemented.

Facebook isn’t alone in this mindset. Lots of Silicon Valley’s biggest companies assume there are new regulations coming and are working with regulators to get the rules they think will help themselves. They don’t have to love the rules, as long as the rules give them a clear framework that spells out what they’re responsible for — and what they don’t need to do.

An obvious example, reiterated by Zuckerberg in his op-ed: Getting more countries to adopt the European Union’s General Data Protection Regulation. It’s not so much that Facebook et al think GDPR is particularly good at protecting consumer privacy. But they know how to work with GDPR, and they would rather have a consistent set of laws to follow instead of a patchwork of country-by-country laws.

Some of this may happen organically. In the US, for instance, Silicon Valley leaders expect individual states to enact their own regulations around internet privacy and other issues, and assume those rules will prompt the federal government to eventually create its own nationwide rules — which is what Silicon Valley would prefer.

On the other hand, it’s very hard to imagine a global consensus around … anything, let alone rules governing “distribution of harmful content,” as Zuckerberg floats here.

And there are plenty of people in the US government raising eyebrows at Zuckerberg’s ask. The chief of staff of the Federal Communications Commission, for instance, responded skeptically on Saturday.

Regardless of how practical it is, Facebook’s impulse to ask people who don’t run Facebook for help running Facebook looks like the new normal for Facebook.

Facebook — along with all of the other Silicon Valley companies that depend on individual users for content or inventory — has always asked other people to police its platform. If someone uploads a video or song you own onto the site, it’s up to you to tell Facebook to take it down. And if you think that Pulitzer Prize-winning photograph of a nude Vietnamese girl running from a napalm attack shouldn’t be on the site, you should tell Facebook that, too.

In the wake of the 2016 election, Facebook has leaned even harder in this direction: It outsourced the detection of fake news to third-party fact-checkers (who have since complained that Facebook wasn’t serious about the work). And it asked readers to tell it what news sites are trustworthy. Now it wants an independent Facebook Court to rule on controversial content decisions.

Facebook is also spending billions on software and humans to help police its own property. (Casey Newton argues persuasively that Facebook should be spending much more on the humans it employs to look at some of the ghastly things people upload to the site.)

But Facebook’s fundamental positioning of itself as neutral ground, where people happen to show up and do things (as opposed to software that’s specifically designed to entice people to show up and do things) means that it’s always going to ask outsiders — users, copyright owners, regulators — to help keep it in line.

This article originally appeared on Recode.net.
