
Why Google’s self-driving cars will be great for cyclists and pedestrians

Google’s self-driving car, during a trial in Washington, DC.
(KAREN BLEIER/AFP/GettyImages)

Last week, Matt McFarland of the Washington Post relayed an amusing anecdote about a cyclist on a fixie and one of Google’s self-driving cars.

As the Austin, Texas, cyclist wrote on a biking forum, he arrived at a four-way stop sign a moment after the Google autonomous car. To let the car go first, he did a track stand: a maneuver riders of fixed-gear bikes use to balance in place without dismounting. It requires turning the front wheel back and forth, which can make the bike rock slightly forward and back.

The robotic car began to go, but as it did, the cyclist moved forward about an inch. The car interpreted this as the cyclist proceeding, and its algorithm forced it to stop abruptly. The cyclist stopped, and after a moment, the car began to move again, but then another subtle movement from the cyclist froze it in its tracks.

“We repeated this little dance for about two full minutes and the car never made it past the middle of the intersection,” the cyclist wrote. “The two guys inside were laughing and punching stuff into a laptop, I guess trying to modify some code to ‘teach’ the car something about how to deal with the situation.”

It’s tempting to interpret all this as a sign of the steep learning curve Google’s cars will encounter as they drive more in the complex conditions of the real world. But I think it shows something quite different.

Engineers will probably be able to teach the cars to distinguish between track stands and real movement fairly easily. But the cars will continue to drive with extreme caution and sensitivity, which is absolutely great news for cyclists and pedestrians.

Humans are utterly terrible drivers

A stock photo, but all too real.

To date, Google’s cars have traveled nearly 2 million miles in California and Texas but have been involved in only 14 minor collisions — all of them other drivers’ fault. Google has detailed each one, and the reports reveal the many alarming tendencies of human drivers.

Most of these crashes involve Google’s cars being rear-ended through no fault of their own. As Google’s Chris Urmson writes:

The most recent collision, during the evening rush hour on July 1, is a perfect example. One of our Lexus vehicles was driving autonomously towards an intersection in Mountain View, CA. The light was green, but traffic was backed up on the far side, so three cars, including ours, braked and came to a stop so as not to get stuck in the middle of the intersection. After we’d stopped, a car slammed into the back of us at 17 mph — and it hadn’t braked at all.

An animation of Google's self-driving car (gray) being rear-ended on July 1, based on data collected by the car.

It’s pretty clear what probably happened here: the driver was distracted, possibly looking down at a cellphone. If Google’s car had been a cyclist, that cyclist might be dead.

Cellphones, of course, are a particularly big problem for safe driving. But the truth is that humans are pretty bad drivers in all sorts of ways.

As another Google post detailed, its cars frequently spot people driving on the wrong side of the road, dangerously turning across several lanes of traffic, and proceeding through intersections when there are still cars or cyclists in them. Human drivers zone out, miss cars in their blind spots, and often fail to spot bikes and pedestrians. This is part of why more than 30,000 people die in traffic crashes in the US each year.

Put simply, if a tech company introduced the human as a product engineered to drive cars, it’d go out of business.

Google’s cars don’t get bored or distracted

Google’s cars aren’t perfect yet. The real world is a very complicated driving environment, and the track stand story shows they still have a long way to go.

But when it comes to safety, they have the potential to outstrip human drivers in every way imaginable. Their algorithms don’t get bored, tired, or angry, and their 360-degree laser sensors mean they don’t have blind spots. Just as importantly, they’re seemingly programmed to always err on the side of excessive caution.

As a Mountain View, California, resident — who frequently interacts with these cars on the roads near Google’s headquarters — wrote in June, “Google cars drive like your grandma — they’re never the first off the line at a stop light, they don’t accelerate quickly, they don’t speed, and they never take any chances with lane changes (cut people off, etc.).” The resident described how the cars wait a few seconds after a pedestrian has completely cleared a crosswalk before beginning to turn through it.

And Google is clearly taking cyclists and pedestrians into account in the design of its cars’ algorithms. It has specifically upgraded its software to navigate chaotic city streets, and earlier this year it patented a way for its cars to interpret cyclists’ hand signals.

Anyone who’s spent much time walking or biking in the US knows the danger human drivers pose when you don’t have the protection of a big metal shell around you. As someone who travels mainly by bike — and has experienced countless uncomfortably close passes and near misses when drivers fail to see me — I find the idea of riding next to self-driving cars far more appealing.
