Facebook now plans to let users decide which news sources are the most “trustworthy”

What could possibly go wrong?

Aja Romano
Aja Romano wrote about pop culture, media, and ethics. Before joining Vox in 2016, they were a staff reporter at the Daily Dot. A 2019 fellow of the National Critics Institute, they’re considered an authority on fandom, the internet, and the culture wars.

Facebook continues to roll out changes in the wake of CEO Mark Zuckerberg’s bold promise to “fix” the platform in 2018. Last week, the company announced sweeping changes to its News Feed, vowing to show users more content shared by their family and friends, and to prioritize said content over posts by media outlets and brands.

But based on Facebook’s latest announcement, published January 19 to the company’s official blog with an accompanying explanation on Zuckerberg’s personal Facebook page, the platform’s News Feed overhaul has just gotten more complicated — and, many people are arguing, more fraught.

Facebook’s January 11 announcement about the changes it’s making to the News Feed explained that “showing more posts from friends and family and updates that spark conversation means we’ll show less public content, including videos and other posts from publishers or businesses.” Many people reasonably interpreted that statement to mean they would see less news shared by publishers in their feed.

And that interpretation still appears to be true. But as Zuckerberg has now revealed in his January 19 post, the news that people do see from publishers will also come from a more limited number of sources. More specifically, it will come from “trusted” media sources that have been vetted by Facebook’s community of users.

“There’s too much sensationalism, misinformation and polarization in the world today,” Zuckerberg writes. “That’s why it’s important that News Feed promotes high quality news that helps build a sense of common ground.”

“This update will not change the amount of news you see on Facebook,” he continues. “It will only shift the balance of news you see towards sources that are determined to be trusted by the community.”

The official Facebook announcement specifies, again, that “people will see less public content, including news, video and posts from brands.” But the expansion of this mission “to make sure the news people see, while less overall, is high quality” is where things get tricky.

Because how will Facebook decide what news deserves the label of “high quality”? That’s where you (and all your fellow Facebook users) come in.

Facebook wants its users to decide which media outlets they trust most. That’s a seriously risky move.

In its January 19 announcement, Facebook said it intends to implement widespread surveys to help it filter media sources using the following criteria:

* News from publications that the community rates as trustworthy

* News that people find informative

* News that is relevant to people’s local community

According to the site, it’s already conducted a test survey among “a diverse and representative sample of people using Facebook across the US,” and thus formed an initial ranking of media sources that its users consider to be trustworthy.

Starting Monday, the company will roll out more surveys to more users, asking them to share their thoughts on various media sources. Posts shared by media outlets may appear more or less frequently in people’s News Feeds depending on how trustworthy Facebook’s users perceive those outlets to be. Or, as Facebook put it:

For the first change in the US next week, publications deemed trustworthy by people using Facebook may see an increase in their distribution. Publications that do not score highly as trusted by the community may see a decrease.

Letting Facebook users essentially decide which news outlets are reliable enough to be worthy of “distribution” on the site is a seriously risky endeavor. After all, it’s safe to assume that some of these users are the same folks responsible for routinely making fake news spread more virally across the site than real news.

Or, as Gizmodo’s Bryan Menegus exclaimed in reaction to the news:

HOLY FUCK, MARK. If people cannot tell truth from bullshit, why are those same people being used to rank publications on a scale of trustworthiness?

A change of some sort seems necessary in the wake of the role Facebook played in distributing “fake news” — as well as political ads paid for by Russian troll farms — prior to the 2016 presidential election, not to mention Congress’s subsequent call for the site to answer for its actions. But so far, the reaction to Facebook’s plan has been confusion mixed with levels of uncertainty ranging from vague wariness to outright distrust.

Right now it’s too early to say exactly how the News Feed changes will affect the overall and individual Facebook experience.

But as a reminder, you can filter your News Feed by going to your Facebook home page and selecting “Most Recent” instead of “Top” from the dropdown bar next to the News Feed, in order to view posts chronologically rather than in an order that’s been algorithmically sorted for you.

And you can choose to view your “Pages Feed” to see news updates from any media outlet or brand that you’ve chosen to follow.

These aren’t foolproof remedies for avoiding fake and biased sources on Facebook — but as Facebook continues to make changes to its News Feed, they may become increasingly important. Because as Facebook is now making increasingly clear, we’re all, ultimately, responsible for vetting our own media consumption, even on social media.
