Mark Zuckerberg on leaked audio: Trump’s looting and shooting reference “has no history of being read as a dog whistle”

On a tense call with employees, the Facebook CEO defended his decision not to moderate Trump’s posts.

Facebook co-founder and CEO Mark Zuckerberg testified before the House Financial Services Committee in October 2019.
Chip Somodevilla/Getty Images
Shirin Ghaffary
Shirin Ghaffary was a senior Vox correspondent covering the social media industry. Previously, Ghaffary worked at BuzzFeed News, the San Francisco Chronicle, and TechCrunch.

In an internal video call with Facebook employees on Tuesday obtained by Recode, CEO Mark Zuckerberg doubled down on his controversial decision to take no action on a post last week from President Donald Trump. In the post, Trump referred to the ongoing protests in the US against racism and police brutality and said, “when the looting starts, the shooting starts.”

Facebook’s handling of Trump’s post — which included language similar to what segregationists used when referring to black protesters in the civil rights era — has divided employees at Facebook and prompted them to openly criticize Zuckerberg in a way they never have before. Around 400 employees staged a virtual walkout of work on Monday, at least two employees have resigned in protest, others have threatened to resign, and several senior-level managers have publicly disagreed with Zuckerberg’s stance — calling for him to take down or otherwise moderate Trump’s post, as Facebook’s competitor Twitter already has.

This tension spilled over into the Tuesday Q&A meeting, which around 25,000 employees tuned into. Several employees posed questions that were highly critical of the company's actions and policies, and scrutinized whether the company is listening to racially diverse voices in its upper ranks. (Read the transcript of Zuckerberg's meeting with Facebook employees here.)

“I knew that the stakes were very high on this, and knew a lot of people would be upset if we made the decision to leave it up,” Zuckerberg said on the call. He went on to say that after reviewing the implications of Trump’s statement, he decided that “the right action for where we are right now is to leave this up.”

Zuckerberg said that he did a thorough analysis of the history around the apparent reference in Trump’s post, which he called “troubling,” but ultimately did not find it to be an incitement of violence under Facebook’s policies.

“We basically concluded after the research and after everything I’ve read and all the different folks that I’ve talked to that the reference is clearly to aggressive policing — maybe excessive policing — but it has no history of being read as a dog whistle for vigilante supporters to take justice into their own hands,” Zuckerberg said on the call. He also said that, overall, Facebook still reserves the right to moderate Trump.

“This isn’t a case where [Trump] is allowed to say anything he wants, or that we let government officials or policy makers say anything they want.”

Facebook has largely avoided moderating Trump’s posts on its platform. In March, however, after Recode and other outlets reported on deceptive advertisements that made a Trump campaign questionnaire appear to be the official 2020 census, Facebook removed these ads from its platform.

After opening the call, Zuckerberg went on to take questions from a preselected list, with employees asking questions via videoconference. In one of several tense exchanges, an employee asked Zuckerberg to confirm how many black people were involved in Zuckerberg’s final decision not to take down Trump’s post. Zuckerberg’s answer: just one person (Facebook’s global diversity officer, Maxine Williams). Zuckerberg said only a small group of people were involved in the decision-making process, including Facebook COO Sheryl Sandberg and policy VP Joel Kaplan, who has come under scrutiny for reportedly stymieing efforts to reduce polarization on the platform and openly supporting Supreme Court Justice Brett Kavanaugh during his controversial Senate hearings.

The employee pressed why Facebook’s head of integrity, Guy Rosen, who is tasked with overseeing efforts around general user safety on the platform, wasn’t in the final group of decision-makers.

In response, Zuckerberg appeared to stumble with his reasoning, first saying that Rosen was present — but then saying he actually wasn’t sure if Rosen was a part of the final decision. Ultimately, Zuckerberg said Rosen is responsible for building and enforcing policies overall, but not this particular decision.

“I don’t think it’s great that we’re not super clear on whether the VP of integrity was included on a matter of voter suppression and societal violence,” the employee said on the videoconference.

“How can we trust Facebook leadership if you show us a lack of transparency?” asked another employee.

When asked about the criticism Facebook faced in the meeting, a spokesperson for the company sent Recode the following statement: “Open and honest discussion has always been a part of Facebook’s culture. Mark had an open discussion with employees today, as he has regularly over the years. He’s grateful for their feedback.”

In an acknowledgment of employees' anger over the situation, Zuckerberg outlined on the call several areas where he said Facebook could improve, including being more transparent about the decision-making process for moderating contentious posts.

Most notably, Zuckerberg said the company is considering adding labels to posts by world leaders that incite violence, instead of simply leaving them up or taking them down. He also said that since the US may be entering a “prolonged period of civil unrest,” Facebook may change its policy on what kind of announcements government leaders can make about state violence, such as excessive use of police force.

While Zuckerberg was at times conciliatory, he defended Facebook’s stance against making what he views as knee-jerk decisions about content that people could find personally offensive. He said that even if Facebook does change its policies around moderating potentially violent political speech like Trump’s post, the change would not happen overnight.

“These policies have to be developed,” Zuckerberg said. There’s “no way we can do something like that on the fly.”

That raises the question of why Facebook isn’t better prepared to moderate political speech that pushes the boundaries of its rules on inciting and glorifying violence. Since the 2016 US presidential election, Facebook’s content moderation has come under fire, and the company has promised to do better. Its long-awaited independent oversight board, meant to review controversial cases like Trump’s post, still hasn’t officially launched; meanwhile, the next US presidential election is less than six months away.

On the call, Zuckerberg acknowledged that this is only the beginning of employee discussion on the company’s handling of the very real controversies coming its way around race, politics, and police violence.

“I know we’re going to keep talking about this, some of the issues, they’re deep and they’re not going to go away any time soon,” Zuckerberg said.
