

Mark Zuckerberg says it’s ‘extremely unlikely’ fake news on Facebook changed the election outcome

So what responsibility does the social network have as a media purveyor?

President Obama Speaks At The Global Entrepreneurship Summit
Is Facebook CEO Mark Zuckerberg editor in chief or not?
Justin Sullivan / Getty Images

In a post on Facebook, founder and CEO Mark Zuckerberg said that fake stories on the powerful social network did not change the recent presidential election results.

Noting that most of Facebook’s content is “authentic” — apparently 99 percent, though he offered no explanation of how he arrived at that number — and that fake news and hoaxes are not limited to one partisan view, Zuckerberg wrote: “Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”

Zuckerberg made a similar point in a recent onstage interview, saying it was a “pretty crazy idea” that Facebook changed the election. Instead, there and in this post, he has been relying on what has become an ever-weaker but go-to excuse for Facebook — that it is just a platform and not a media organization, providing a place where people of differing opinions can coexist, without any responsibility to try to get it right.

But others, including some within the company itself, argue that the numerous fake stories that pop up on the service have real impact and that Facebook needs to fix that. A report in the New York Times noted that there has been an intense debate within the company over the issue.

“Some employees are worried about the spread of racist and so-called alt-right memes across the network, according to interviews with 10 current and former Facebook employees,” said the New York Times. “Others are asking whether they contributed to a ‘filter bubble’ among users who largely interact with people who share the same beliefs.”

There is no question that Zuckerberg’s post is a step — though only a first one, I suspect — in sorting this out, especially given recent studies that show Facebook has become one of the major outlets where Americans get their news.

The question, in other words, is this: When does a platform have an editorial responsibility to stop false information? The answer is very hard to arrive at, as Zuckerberg acknowledged.

“Identifying the ‘truth’ is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted,” he wrote. “An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual.”

Here is the whole post:

I want to share some thoughts on Facebook and the election.

Our goal is to give every person a voice. We believe deeply in people. Assuming that people understand what is important in their lives and that they can express those views has driven not only our community, but democracy overall. Sometimes when people use their voice though, they say things that seem wrong and they support people you disagree with.

After the election, many people are asking whether fake news contributed to the result, and what our responsibility is to prevent fake news from spreading. These are very important questions and I care deeply about getting them right. I want to do my best to explain what we know here.

Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.

That said, we don’t want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further.

This is an area where I believe we must proceed very carefully though. Identifying the “truth” is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.

As we continue our research, we are committed to always updating you on how News Feed evolves. We hope to have more to share soon, although this work often takes longer than we’d like in order to confirm changes we make won’t introduce unintended side effects or bias into the system. If you’re interested in following our updates, I encourage you to follow our News Feed FYI here: http://bit.ly/2frNWo2.

Overall, I am proud of our role giving people a voice in this election. We helped more than 2 million people register to vote, and based on our estimates we got a similar number of people to vote who might have stayed home otherwise. We helped millions of people connect with candidates so they could hear from them directly and be better informed. Most importantly, we gave tens of millions of people tools to share billions of posts and reactions about this election. A lot of that dialog may not have happened without Facebook.

This has been a historic election and it has been very painful for many people. Still, I think it’s important to try to understand the perspective of people on the other side. In my experience, people are good, and even if you may not feel that way today, believing in people leads to better results over the long term.

This article originally appeared on Recode.net.
