
On Election Day, the Cambridge Analytica whistleblower is blasting Facebook for still not doing enough

Christopher Wylie knows a bit about voter manipulation.

[Photo: Christopher Wylie onstage at Web Summit 2018. Seb Daly/Web Summit via Getty Images]

The whistleblower who sounded the alarm about Facebook’s data vulnerabilities in the 2016 election cycle isn’t done yelling — even on Election Day 2018.

Christopher Wylie blasted the social network on Tuesday as insufficiently attentive to the problems first revealed in the last election. Calling out Facebook for “making a digital clone of our society,” Wylie described Facebook as similar to the European giant that centuries ago plundered the resources of its colonial subjects.

“This is a story of colonialism. Facebook is our generation’s East India Company,” Wylie said during an onstage interview at the Web Summit in Lisbon, Portugal. “The problem is that our government is not equipped to handle this.”

He was equally unsparing toward the U.S. politicians tasked with keeping that plunderer in check. While he said he did not regret coming forward, Wylie was clearly agitated by the lack of action in the aftermath of his bombshell revelations.

“We can regulate nuclear power,” he said. “Why can’t we regulate some fucking code?”

Wylie, of course, knows a bit about voter manipulation. In the run-up to the 2016 presidential election, Wylie worked at a data analytics firm called Cambridge Analytica — that Cambridge Analytica, the same firm that collected Facebook data from millions of people for profiling, and that some believe helped Donald Trump get elected president.

Wylie’s decision to go public with that information has caused a year of nightmares for Facebook. Cambridge Analytica has since shut down.

Wylie says he saw how Facebook data was collected and used to create psychological profiles of potential voters. That personal information made it easier for those Facebook users to be manipulated or nudged toward a particular political view.

Coupling that profile information with Facebook’s algorithms — the software that determines what you do and don’t see in your News Feed — can be dangerous, he said.

“When you look at what the alt-right is and what the role of Cambridge Analytica was in catalyzing the alt-right — it’s an insurgency. It was built to be an insurgency,” Wylie said. “People who were vulnerable to disinformation were profiled and targeted using the same kinds of techniques and tactics the military would use against ISIS.”

Wylie says that the game plan was to find potential supporters of alt-right causes and encourage them to visit alt-right pages or groups on Facebook. That would signal to Facebook that these people wanted to see more of that type of content, creating a feedback loop.

“Facebook’s algorithms were, at least at the time, very sensitive,” he added. “So if you brought people onto Pages, the News Feed would change. And Facebook would do half of the work for you.”

Facebook has spent the better part of the past two years trying to prevent this from happening again ahead of Tuesday’s 2018 midterm elections. It has not only tweaked its algorithm to favor updates from friends over those of publishers and Pages, but it is also still taking down organized efforts by other countries to spread divisive content.

Late Monday night, Facebook removed a group of 115 Facebook and Instagram accounts that U.S. law enforcement agencies “believe may be linked to foreign entities.” Just 24 hours before the U.S. elections, foreign groups were still trying to manipulate voters on Facebook.

This article originally appeared on Recode.net.
