
The US government says Facebook’s ad business creates housing discrimination

The Department of Housing and Urban Development has issues with Facebook’s targeted ad business.

Houses for sale in Chicago.
Scott Olson/Getty Images

Facebook has a $55 billion annual advertising business in part because it lets advertisers pick and choose, with precise detail, who they want to target their advertisements to.

That’s great if you’re Coca-Cola, and historically it’s been great for Facebook. But the Department of Housing and Urban Development, commonly referred to as HUD, said Thursday that Facebook’s targeting actually creates some serious problems.

Specifically: HUD claims Facebook’s ad platform is “causing housing discrimination,” and can “exclude people” from seeing certain ads based on traits that are defined by HUD as “protected characteristics,” like race, national origin, and religion. Twitter’s and Google’s ad businesses may be creating the same problem.

“Facebook is discriminating against people based upon who they are and where they live,” HUD Secretary Ben Carson said in a press release posted to the HUD website. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”

You can read HUD’s full charge against Facebook here, but the press release outlines the main accusations. The big problem is that Facebook uses its own algorithms to determine who should see which ads, and HUD claims those algorithms can unintentionally exclude groups of people with similar protected characteristics just because Facebook’s systems don’t necessarily deem them a good match for the ad.

Imagine a housing developer wants to promote fancy new condos in San Francisco on Facebook. The developer sets targeting parameters that make one million Facebook users eligible to see the ad, but pays to reach only 100,000 of them. Facebook then decides which 100,000 people see the ad based on who it thinks will find the ad most relevant. That means Facebook's algorithms could prioritize certain groups of people over others, and HUD claims those groupings may be created using data about protected characteristics.

“Facebook combines data it collects about user attributes and behavior with data it obtains about user behavior on other websites and in the non-digital world,” the press release reads. “Facebook then allegedly uses machine learning and other prediction techniques to classify and group users to project each user’s likely response to a given ad, and in doing so, may recreate groupings defined by their protected class.”
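The dynamic HUD describes can be illustrated with a toy simulation. This is a hypothetical sketch, not Facebook's actual model: it assumes a relevance score that happens to correlate with membership in one of two groups (a stand-in for a protected class), then picks the top-scoring slice of eligible users the way a delivery system would.

```python
import random

random.seed(0)

# Hypothetical setup: 1,000 eligible users, half in group "A" and half in
# group "B". A proxy feature nudges group A's predicted relevance slightly
# higher, mimicking a model trained on behavior correlated with the group.
users = [
    {"group": "A" if i < 500 else "B",
     "score": random.random() + (0.1 if i < 500 else 0.0)}
    for i in range(1000)
]

# The advertiser pays to reach only 100 of the 1,000 eligible users;
# the platform fills those slots by predicted relevance.
audience = sorted(users, key=lambda u: u["score"], reverse=True)[:100]

share_a = sum(u["group"] == "A" for u in audience) / len(audience)
print(f"Group A is 50% of eligible users but {share_a:.0%} of the audience")
```

Even a small score bias skews the delivered audience well past the group's share of eligible users, without the advertiser ever targeting the group directly.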

Facebook says it had been working with HUD to solve the issue, but that the two sides hit a roadblock when HUD asked for data about Facebook's users and targeting that the company refused to hand over. Here's Facebook's full statement:

We’re surprised by HUD’s decision, as we’ve been working with them to address their concerns and have taken significant steps to prevent ads discrimination. Last year we eliminated thousands of targeting options that could potentially be misused, and just last week we reached historic agreements with the National Fair Housing Alliance, ACLU, and others that change the way housing, credit, and employment ads can be run on Facebook. While we were eager to find a solution, HUD insisted on access to sensitive information - like user data - without adequate safeguards. We’re disappointed by today’s developments, but we’ll continue working with civil rights experts on these issues.

This is far from the first ad snafu for Facebook in the past 18 months. The company has been called out multiple times for letting advertisers target people based on keywords like “jew haters” and “Joseph Goebbels.” Then, of course, there was the 2016 US presidential election, in which Russia used targeted Facebook ads to try to sway voter opinion.

The bigger concern for Facebook will be if any of its ad practices lead to serious regulatory problems. The company is being investigated by the FTC (and other government agencies) for Cambridge Analytica, a situation in which personal data from millions of Facebook users was collected and later sold by people outside the company without users’ knowledge.

The claims against Facebook by government agencies are piling up, and it’s only a matter of time before a shoe drops — it’s just unclear what shoe it will be, and how much it will hurt.

This article originally appeared on Recode.net.
