
Are we automating racism?

Algorithms don’t fail everyone equally.

Joss Fong
Joss Fong is a founding member of the Vox video team and a producer focused on science and tech. She holds a master’s degree in science, health, and environmental reporting from NYU.

Humans are flawed decision-makers. Decades of research show that we’re bad with numbers and easily influenced by cognitive and social biases that operate beneath our awareness. Racism still creeps into human decisions about hiring, housing, and credit.

As a result, many of us are tempted to think we’re better off having machines make inferences directly from data. But that turns out to be a dangerous assumption, as we explore in this episode of Glad You Asked.

The many subjective choices that data scientists make as they select and structure training data can increase or decrease racial bias in machine learning systems. After all, engineers are humans with biases and blind spots like everyone else.

In other cases, historical discrimination and systemic inequality will color the data no matter how diligent the collection process. Data on crime, for instance, ultimately derives from the choices law enforcement officers make about which neighborhoods to patrol and whom to arrest. And AI writing systems are trained on human writing samples, which reflect the perspectives of the groups that produce the most writing.

But even if we assume the data is unbiased, machine learning systems are designed to minimize error rates, and they’ll do that without any regard to fairness (unless programmed otherwise). That means they tend to “care” more about accuracy on majority (literally larger) groups than minorities, to the extent that the groups differ on a predictive attribute.
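A toy simulation can make that concrete. The numbers below are made up, not drawn from any real system: two groups share one feature, but the feature relates to the outcome differently in each. A model that picks the single decision threshold minimizing overall error will fit the larger group and push its mistakes onto the smaller one.

```python
import random

random.seed(0)

def make_group(n, cutoff):
    # True label is 1 when the score exceeds this group's cutoff.
    return [(x, 1 if x > cutoff else 0)
            for x in (random.uniform(0, 10) for _ in range(n))]

majority = make_group(900, cutoff=5.0)  # 90% of the data
minority = make_group(100, cutoff=3.0)  # same feature, different relationship

combined = majority + minority

def error(data, threshold):
    # Fraction of examples misclassified by a simple threshold rule.
    wrong = sum(1 for x, y in data if (1 if x > threshold else 0) != y)
    return wrong / len(data)

# "Training": choose the one threshold that minimizes overall error.
best_t = min((t / 10 for t in range(0, 101)),
             key=lambda t: error(combined, t))

print(f"chosen threshold: {best_t:.1f}")          # lands near the majority's cutoff
print(f"majority error:  {error(majority, best_t):.3f}")
print(f"minority error:  {error(minority, best_t):.3f}")
```

The chosen threshold sits near the majority group's cutoff, so the majority's error rate is close to zero while roughly a fifth of the minority group is misclassified — exactly the trade the overall-error objective rewards.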

You’ve surely heard that correlation isn’t causation. Well, AI systems are correlation machines. They can be accurate for the wrong reasons. So it’s one thing to use them to improve weather forecasts and search results; it’s quite another to deploy them in decisions about the lives of individual human beings.
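A minimal sketch (again with made-up numbers) of being "accurate for the wrong reasons": suppose a system flags patients for extra care based on predicted health-care spending. Even if the cost prediction is perfect, cost is a biased proxy for need when one group faces barriers to care and spends less at the same level of sickness.

```python
import random

random.seed(1)

def patient(group):
    need = random.uniform(0, 1)             # true health need: same distribution in both groups
    access = 1.0 if group == "A" else 0.6   # assumed access gap: group B spends less per unit of need
    cost = need * access                    # observed spending is a biased proxy for need
    return need, cost, group

patients = [patient("A") for _ in range(500)] + [patient("B") for _ in range(500)]

# A "model" that ranks patients by predicted cost (here, cost itself --
# a perfectly accurate cost predictor) and flags the top 25% for extra care.
patients.sort(key=lambda p: p[1], reverse=True)
flagged = patients[:250]

share_b = sum(1 for p in flagged if p[2] == "B") / len(flagged)
avg_need_a = sum(p[0] for p in patients if p[2] == "A") / 500
avg_need_b = sum(p[0] for p in patients if p[2] == "B") / 500

print(f"group B share of flagged patients: {share_b:.2f}")   # far below its 0.50 population share
print(f"average need, A vs B: {avg_need_a:.2f} vs {avg_need_b:.2f}")
```

Both groups are equally sick on average, yet group B gets a small fraction of the flags: the correlation between cost and need is real, but the model is answering the wrong question.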

None of this is to say that these problems can’t be overcome. Some predictive models may be less biased than the status quo decision-maker. But the assumption that these systems are necessarily more objective is clearly wrong. And since AI systems are both opaque and highly scalable, they demand a level of scrutiny that we haven’t even begun to apply.

You can find this video and all of Vox’s videos on YouTube. Subscribe for more.

Further reading:

“Scrutinizing Saliency Based Image Cropping” by Vinay Prabhu

Gradio Saliency Cropping app

“Dissecting Racial Bias in an Algorithm Used to Manage the Health of Populations,” Science

“Model Cards for Model Reporting,” arXiv

Race After Technology by Ruha Benjamin

Weapons of Math Destruction by Cathy O’Neil

The Ethical Algorithm by Michael Kearns and Aaron Roth

“Why Algorithms Can Be Racist and Sexist,” Recode

“How Big Data Is Unfair” by Moritz Hardt

“The Problem With ‘Biased Data’” by Harini Suresh

“Big Data’s Disparate Impact” by Solon Barocas and Andrew D. Selbst

“Fairness in Algorithmic Decision-Making” by Mark MacCarthy
