
Facebook is going to start ranking news sources — once its users tell it how to rank news sources

Mark Zuckerberg says he wants better news on his social network, but he doesn’t want to figure it out for himself.

Facebook CEO Mark Zuckerberg onstage at Facebook’s F8 Conference. Sullivan / Getty Images
Peter Kafka
Peter Kafka covered media and technology, and their intersection, at Vox. Many of his stories can be found in his Kafka on Media newsletter, and he also hosts the Recode Media podcast.

Facebook is doing a very un-Facebooky thing: It’s going to start declaring that some news sources you see in your Facebook feed are better than others, and act accordingly.

But Facebook being Facebook, it’s going about it in the most Facebooky way possible: It’s going to rely on users — not the super-smart people who work at Facebook — to figure out which of those sources are better.

Mark Zuckerberg says the move is part of an effort to prioritize “news that is trustworthy, informative, and local” within the network, and suggests that there will be more announcements to come.

The change he describes today will influence which news sources pop up in your Facebook News Feed, rewarding ones that Facebook deems “broadly trusted,” based on user polls, so it can “build a sense of common ground.”

It’s a reminder that despite Facebook’s announcement last week that it is going to distribute less news to its users, it is still going to distribute plenty of news, and given its enormous reach, that news will still carry plenty of weight.

Facebook is also using today’s news to refine last week’s rollout: Zuckerberg says the previously announced changes will reduce the share of news stories in people’s feeds to 4 percent, down from 5 percent.

But what’s most telling about today’s announcement is that Facebook remains insistent that it’s not in a position to make judgments about the stuff it shows its two billion users — someone else needs to do it.

“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division,” Zuckerberg writes in a blog post today.

But to be clear, it’s not because the people who work at Facebook, who build their own internet-beaming drones, aren’t smart enough to figure out the difference between the New York Times and the Denver Guardian, which doesn’t actually exist. It’s that they don’t want to do it, Zuckerberg says: “We could try to make that decision ourselves, but that’s not something we’re comfortable with.”

So instead, there’s this: “As part of our ongoing quality surveys, we will now ask people whether they’re familiar with a news source and, if so, whether they trust that source. The idea is that some news organizations are only trusted by their readers or watchers, and others are broadly trusted across society even by those who don’t follow them directly. (We eliminate from the sample those who aren’t familiar with a source, so the output is a ratio of those who trust the source to those who are familiar with it.)”

Sounds ... okay? Particularly if you like reading established news sources.

Not so good if you’d like to expand your news diet beyond that. And not so good if you’re a new publisher trying to make your way in a media universe dominated by two giant digital companies.

Do you have more questions about this and Facebook’s other efforts to clean up its News Feed? So do we. So we’ll be asking Facebook execs Adam Mosseri and Campbell Brown about it next month at our Code Media conference. You can join us there.


This article originally appeared on Recode.net.
