
Facebook says it will use image recognition software to fight revenge porn

The move comes a few months after it was reported that a Facebook group was being used to share revenge porn of women in the military.

Facebook CEO Mark Zuckerberg. | Facebook

Facebook is using its image recognition software to keep users from sharing revenge porn across its services, including Facebook, Instagram and Messenger, according to a post by CEO Mark Zuckerberg Wednesday morning.

“It’s wrong, it’s hurtful, and if you report it to us, we will now use AI and image recognition to prevent it from being shared,” he added.

Revenge porn is an intimate image or video shared online without the subject’s consent, usually by a former spouse or partner, with the intent of harassing and embarrassing that person.

Facebook’s plan here is slightly vague, but it sounds like the company will build a database of reported images that its algorithms can match against and automatically remove from its different apps. Tech companies do something similar to fight the spread of child pornography. We’ve asked Facebook for clarity and will update once we hear back.*

Revenge porn is an issue in lots of corners of the internet, but the move on Facebook’s part comes just a few months after it was reported that hundreds of U.S. Marines were using a Facebook group to share photos of fellow service members. That group was shut down, but it has since moved to Snapchat, according to BuzzFeed.

* Update: It turns out Facebook shared more about these efforts in a blog post. According to the post, users can report an inappropriate image, which will be reviewed by a human on Facebook’s community operations team. If the image violates Facebook’s community standards, the company will use “photo-matching technologies” to block people from sharing that same image in the future.
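The blog post doesn’t say how Facebook’s “photo-matching technologies” work, but systems like this generally reduce each reported image to a compact fingerprint and compare new uploads against the stored fingerprints, so that even a re-compressed or slightly altered copy still matches. Here’s a minimal sketch of that general idea using a simple average hash; the tiny 4×4 “images” and the match threshold are purely illustrative, not Facebook’s actual system:

```python
# Sketch of hash-based photo matching: fingerprint reported images,
# then compare new uploads against the stored fingerprints.
# (Illustrative only; real systems use far more robust perceptual hashes.)

def average_hash(pixels):
    """Reduce a grayscale image (list of pixel rows) to a bit string:
    1 if a pixel is brighter than the image's mean brightness, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count the bit positions where two fingerprints differ."""
    return sum(x != y for x, y in zip(a, b))

# A reported image (hypothetical 4x4 grayscale values).
reported = [[200, 210, 40, 30],
            [190, 205, 50, 35],
            [60, 55, 220, 230],
            [45, 50, 215, 225]]

# A re-upload of the same picture with slight compression noise.
reupload = [[198, 212, 42, 28],
            [188, 203, 52, 37],
            [62, 53, 218, 232],
            [47, 52, 213, 227]]

blocked_db = {average_hash(reported)}

def is_blocked(pixels, threshold=2):
    """Block an upload if its fingerprint is within `threshold`
    bits of any fingerprint in the database."""
    h = average_hash(pixels)
    return any(hamming(h, known) <= threshold for known in blocked_db)

print(is_blocked(reupload))                # the noisy copy still matches
print(is_blocked([[100] * 4] * 4))         # an unrelated image does not
```

The point of hashing rather than storing the images themselves is that the platform only needs to keep fingerprints, and small changes to an image (cropping artifacts, recompression) don’t defeat the match.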


This article originally appeared on Recode.net.
