
Got the same name as a serial killer? Google might think you’re the same person.

It’s up to users to find out if Google is prioritizing wrong information about them in search results.

A photo illustration shows a pile of dollar bills behind a hand holding a smartphone screen displaying the Google logo.
One man Recode spoke to said Google attached his professional headshot to a search result about a serial killer.
Budrul Chukrut/SOPA Images/LightRocket via Getty Images
Rebecca Heilweil
Rebecca Heilweil covered emerging technology, artificial intelligence, and the supply chain.

Hristo Georgiev recently received a troubling message from a friend: Google says he’s a serial killer. If you Googled his name, the search engine would serve up Georgiev’s professional headshot alongside a Wikipedia article about a Bulgarian serial killer with the same name who died in 1980. This is an unfortunate error, but it’s also not the first time Google’s algorithms have done something like this.

The Wikipedia article that surfaced in Georgiev’s results didn’t include his headshot, and anyone who read it carefully would quickly learn that the serial killer of the same name died by firing squad decades ago. Still, Google’s automated systems had made Georgiev, a software engineer based in Switzerland, appear to be someone he wasn’t. The company’s algorithms had plopped the information into one of its “knowledge panels,” the boxes that appear at the top of the search engine’s results and are supposed to offer a quick, authoritative answer so you don’t have to click through to other sites. But since Google debuted these panels in 2012, they have repeatedly promoted misinformation.

It took Google a few hours to fix the issue after several people reported it and it captured attention on the web forum Hacker News, Georgiev told Recode. Still, he says the wrong result, which may have been up for weeks, made him feel at least a “little uncomfortable.” He recalled that one person who searched for his name had a momentary “mini heart attack.”

The problem that led to Georgiev’s incorrect results can be traced back to Google’s Knowledge Graph, which the search engine calls a giant virtual encyclopedia of facts. “Organizing billions of entities in our Knowledge Graph requires that we use automated systems, which often work well but are not perfect,” a Google spokesperson told Recode. “We’re sorry for any issues this mix-up caused.”

Google has a formal process for flagging and removing inaccurate information in its knowledge panels, but it largely relies on users to catch errors. That leaves people responsible for noticing when Google surfaces incorrect information about them in its top search result and then reporting the mistake back to the search platform. The company has launched a system that lets organizations and individuals verify their identities so they can more easily give Google direct feedback about the accuracy of the knowledge panels that relate to them.

Still, people in the past have complained that getting false information removed from these panels is a burdensome process, and others have said it can take months or even years. The Google spokesperson told Recode the company regularly reviews feedback about its Knowledge Graph results but did not comment on how often the company receives requests for changes.

Ultimately, the issue is part of Google’s broader problem: relying on algorithms to identify and offer the correct information doesn’t always work and can actually risk amplifying misinformation.

Google says its Knowledge Graph works by connecting pieces of information from around the web that are about a particular person, place, or thing — especially for notable people, places, and things. That’s more advanced and specific than surfacing results based only on keywords, as the company explained when it launched the tool in 2012. Google has used information collected from this system, which the company says includes 500 billion pieces of information concerning 5 billion entities, to curate the special sections of its search results the company calls knowledge panels. These boxes encourage internet searchers to stay on Google’s results page rather than clicking on results and visiting other websites.
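Google exposes a public Knowledge Graph Search API that lets anyone see which entity its systems associate with a given name. The sketch below shows one way to query it; `API_KEY` is a placeholder for your own Google Cloud credential, and `top_entity` is a hypothetical helper name, not part of the API itself.

```python
# Minimal sketch: query the Google Knowledge Graph Search API to see
# which entity Google matches to a name. Assumes you have an API key
# from Google Cloud; API_KEY below is a placeholder, not a real key.
import json
import urllib.parse
import urllib.request

KG_ENDPOINT = "https://kgsearch.googleapis.com/v1/entities:search"


def build_query_url(name: str, api_key: str, limit: int = 1) -> str:
    """Assemble the request URL for a Knowledge Graph entity search."""
    params = urllib.parse.urlencode(
        {"query": name, "key": api_key, "limit": limit}
    )
    return f"{KG_ENDPOINT}?{params}"


def top_entity(name: str, api_key: str) -> dict:
    """Return the highest-ranked entity Google matches to `name`.

    The response's `itemListElement` list holds candidate entities,
    each wrapping a `result` object with the entity's name, type,
    and description.
    """
    with urllib.request.urlopen(build_query_url(name, api_key)) as resp:
        data = json.load(resp)
    results = data.get("itemListElement", [])
    return results[0]["result"] if results else {}
```

Searching your own name this way would show only which single entity the graph ranks first, which is exactly the kind of mismatch that caused Georgiev's problem.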


Sometimes these panels make it faster and easier to get answers on Google. But Georgiev is just one of several people who have been frustrated to find information about murderers in the knowledge panels that appear when their names are searched. Other times, these results will wrongly report that a person is married or dead. Even more concerning, the panels have elevated hateful content, as one did two years ago when a panel associated with “self-hating Jew,” an anti-Semitic term, displayed an image of the Jewish comedian Sarah Silverman.

Eni Mustafaraj, a computer science professor at Wellesley, told Recode that these problems often result when a computer system mismatches information from two different sources — in the case of Georgiev, an image and a Wikipedia page.

At the same time, the incident makes clear how reliant Google’s knowledge panels are on the user-edited information on Wikipedia. “This kind of story is just a reminder of how dependent search engines generally are on what’s basically unpaid, uncompensated volunteer labor by a huge group of people from across the world,” Nicholas Vincent, a doctoral student at Northwestern who has studied search engines, told Recode.

Google says mistakes are rare in these panels, but it seems like it’s still up to humans to catch and fix their errors. “You’re working with massive amounts of information, right?” Georgiev told Recode. “There are errors bound to happen. It’s just that when you’re Google, you have to be really careful with this kind of stuff.”

In the meantime, he says, you should start Googling yourself.
