
Facebook is trying yet again to cut clickbait headlines from your News Feed

But this update may not hurt publishers as much as before.

Facebook is tweaking its News Feed algorithm to fight a familiar foe: clickbait, or articles that Facebook says “withhold information intentionally” or “mislead people, forcing them to click to find out the answer.”

If this sounds familiar, it should. This is the third time Facebook has tweaked its algorithm since late 2014 with the sole purpose of fighting clickbait stories.

What’s different this time around? Facebook says it’s getting more specific about how to minimize the reach of a clickbait post without necessarily hurting a publisher’s other content.

In an update from August, Facebook started punishing publisher Pages that routinely used clickbait headlines. Now it says it can detect clickbait on an individual story level, meaning it will suppress a specific article from gaining traction in News Feed, but won’t necessarily punish the publisher’s other posts.

It’s also going to start sorting clickbait headlines into two categories: those that “withhold information” and those that “exaggerate information.” Neither offense is considered worse than the other, but headlines that fall into both categories will have an even tougher time gaining traction in News Feed.

Facebook’s official stance here is that clickbait headlines run counter to its efforts to create an “informed community,” which was a pillar of CEO Mark Zuckerberg’s big manifesto back in February.

But a simpler way to think of this is that Facebook is admitting that clickbait stories do more harm than good among users, even though some publishers obviously benefit from them. It’s the same reason Facebook is penalizing publishers that drive people to sites full of ads, or publishers that post false news.

Facebook is still reeling from the aftereffects of last fall’s presidential election, when false news reports ran rampant in News Feed thanks in part to coordinated efforts meant to hurt specific candidates. Cracking down on false news, but also extreme headlines that encourage partisan sharing without any substance to back them up, has been a major company focus ever since.


This article originally appeared on Recode.net.
