
YouTube CEO Susan Wojcicki says vetting videos before they go up isn’t the right answer

“I think we would lose a lot of voices,” Wojcicki said.

Rani Molla
Rani Molla was a senior correspondent at Vox, where she focused her reporting on the future of work. She has covered business and technology for more than a decade — often in charts — including at Bloomberg and the Wall Street Journal.

YouTube CEO Susan Wojcicki is okay with taking content down, but she doesn’t think it’s a good idea to review it before it goes up on the massive video-sharing platform.

That’s one big takeaway from her interview with Recode senior correspondent Peter Kafka at this year’s Code Conference.

“I think we would lose a lot of voices,” Wojcicki said. “I don’t think that’s the right answer.”

She also warned that it could be difficult to come up with criteria as to what could be uploaded in the first place: “What are the factors that you’re [using to] determine that? How are you deciding who is getting to be on the platform and have speech and who’s not?”

When Kafka pointed out the company is already making such decisions — but only after content is online on YouTube’s platform — Wojcicki emphasized the importance of reviewing content after it publishes on the site. “We see all these benefits of openness, but we also see that that needs to be married with responsibility,” she said.

The YouTube CEO admitted that there will likely always be content on YouTube that violates its policies.

“At the scale that we’re at, there are always gonna be people who want to write stories,” she said, suggesting that journalists will always choose to focus on the negative aspects of YouTube in their reporting.

“We have lots of content that’s uploaded and lots of users and lots of really good content. When we look at it, what all the news and the concerns and stories have been about is this fractional 1 percent,” Wojcicki said. “If you talk about what the other 99 point-whatever-that-number is — that’s all really valuable content.”

“Yes, while there may be something that slips through or some issue, we’re really working hard to address this,” she said.

Last week, YouTube updated its hate speech policy and said it will take down “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.” The policy directly mentioned removing videos that promote neo-Nazi content or videos that deny commonly accepted violent events, like the Holocaust or the Sandy Hook school shooting — but lots of other conspiracy theories and “borderline content” are still allowed on the platform.

Instead of approving videos ahead of time, Wojcicki suggested using tiers in which creators get certain privileges over time, like more distribution and monetization of their content.

“I think this idea of like not everything is automatically given to you on day one, that it’s more of a — we have trusted tiers,” she said.

In recent weeks, the company has confronted numerous issues. Last week, the video platform decided that YouTube creator Steven Crowder wasn’t violating its rules when he kept posting videos with homophobic slurs directed at Vox journalist Carlos Maza, though the company eventually demonetized Crowder’s channel.

YouTube has said that by limiting recommendations, comments, and sharing, it has reduced views of white supremacist videos by 80 percent since 2017. Only now has it banned that content altogether. The company is one of several prominent tech companies trying to figure out how to deal with hateful content proliferating on their platforms. Facebook banned white supremacist content on Facebook and Instagram in March. Twitter says it is looking into it. But even when these companies do make rules prohibiting harmful content, the sheer volume of uploads and posts on their platforms makes it difficult to exclude content that breaks those rules. YouTube's army of content creators uploads 500 hours of video to the site every minute of every day.

Wojcicki instead wanted to focus on the improvements the video company has made in the past few years.

“Two years ago there were a lot of articles, a lot of concerns about how we handle violent extremism,” she said. “If you talk to people who are experts in this field, you can see that we’ve made tremendous progress.”

“We have a lot of tools, we work hard to understand what is happening on it and really work hard to enforce the work that we’re doing. I think if you look across the work you can see we’ve made tremendous progress in a number of these areas,” Wojcicki said. “If you were to fast-forward a couple years and say, well, what that would look like in 12 months and then in another 12 months, what are all the different tools that have been built, I think you’ll see there will be a lot of progress.”


Recode and Vox have joined forces to uncover and explain how our digital world is changing — and changing us. Subscribe to Recode podcasts to hear Kara Swisher and Peter Kafka lead the tough conversations the technology industry needs today.
