
Facebook Will Stop Playing With Your Emotions (Sort Of)

The social network changed its research guidelines and established a review board in the wake of a much-maligned study.

Facebook updated its research guidelines Thursday, a policy change that comes three months after news surfaced that Facebook had purposely tried to manipulate user emotions in a 2012 study. The result for Facebook was a slew of pissed-off users, and the company hopes these changes will prevent similar issues in the future.

Facebook is always testing something in its attempt to improve the service, and Thursday’s update confirms the company will continue doing research. Facebook has lots of user data (lots and lots and lots), and it uses that information to change products, like Messenger or News Feed, and experiences, like which posts/ads you might see.

So what will Facebook do differently moving forward?

The company has essentially added a new level of checks and balances. Before, when groups within Facebook conducted research, the research plans were approved by those group leaders. For example, News Feed research was discussed and approved within the News Feed team, according to a spokesperson.

Now, Facebook has added a more expansive internal review panel that includes senior members from teams across Facebook. If a particular group plans to conduct research on sensitive topics — for example, research targeting a specific group of users (women of a particular age) or research relating to “content that may be considered deeply personal (such as emotions)” — this research must be reviewed by the panel.

Facebook isn’t sharing names of individuals on the panel, but heads of different areas — communications, marketing, policy, etc. — will be on it, according to a spokesperson. No outside academics will sit on the panel, but Facebook consulted a number of professional researchers while assembling the group, the spokesperson added.

The idea is that opening the review and approval process to a wider range of employees will keep internal teams from moving forward with research that may not align with Facebook’s goals (at least the goals they want users to understand).

Facebook also says that research practices have been added to its six-week training program for new engineering hires beginning this week, and all employees will learn about company research practices during the annual privacy and security training that Facebook requires.

In regard to the manipulation study discovered in June, Facebook has apologized in the past for failing to communicate better with users about the research, and it reiterated that apology in a blog post on Thursday. “We should have considered other non-experimental ways to do this research,” wrote Mike Schroepfer, Facebook’s CTO. “The research would also have benefited from more extensive review by a wider and more senior group of people.”

That second part — the more extensive review — is what Facebook’s new policy should fix. The company has also added a new website where all published academic research will live in the future.

This article originally appeared on Recode.net.
