
Here’s Apple’s best argument against hacking the San Bernardino shooter’s iPhone

Supporters rally at Apple stores against government interference into iPhones.
Bryan Thomas/Getty Images

Apple has filed its official reply to the FBI’s demand that it write software to help the bureau access the iPhone of San Bernardino terrorism suspect Syed Farook. Consistent with its earlier public statements, Apple condemns the FBI’s request as an unprecedented expansion of government power that would endanger the privacy of Apple users. I expressed some skepticism of this argument last week, but I found Apple’s latest response pretty persuasive.

One danger is that an FBI win could set a legal precedent that puts us on a slippery slope to the routine use of smartphones for government surveillance. If Apple can be compelled to update the software on a dead suspect’s iPhone to help the FBI access encrypted data, it’s not obvious why the company couldn’t also be compelled to modify the software on a live suspect’s iPhone to listen in on his conversations or track his location.

But let’s say you’re confident that the courts can draw a reasonable distinction between the FBI’s current request to unlock an encrypted iPhone and more outlandish surveillance scenarios. There’s still a big practical problem that the FBI and its allies haven’t really grappled with.

iPhone hacking software will be a magnet for bad guys

Every new version of the iPhone operating system goes through an extensive testing process before it’s signed by Apple. That’s critical because an iPhone will refuse to install software updates that haven’t been mathematically blessed by Apple. Given how dangerous it would be if someone tricked Apple into signing malware, we can assume that Apple takes extreme precautions to make sure that doesn’t happen.
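The check described above can be sketched in miniature. The toy RSA key below (a textbook example, nothing like the full-size keys and certificate chain Apple actually uses) shows the essential asymmetry: anyone holding the public values can verify an update, but only the holder of the private exponent can produce a signature the phone will accept.

```python
import hashlib

# Textbook toy RSA parameters (p=61, q=53). Real signing keys are
# thousands of bits long, but the math has the same shape.
N, E = 3233, 17        # public modulus and exponent (shipped on every phone)
D = 2753               # private exponent (held only by the "vendor")

def digest(update: bytes) -> int:
    # Hash the update image and reduce it into the toy key's range.
    return int.from_bytes(hashlib.sha256(update).digest(), "big") % N

def sign(update: bytes) -> int:
    # Only the private-key holder can compute this value.
    return pow(digest(update), D, N)

def verify(update: bytes, signature: int) -> bool:
    # The phone recomputes the hash and checks it against the signature.
    return pow(signature, E, N) == digest(update)

firmware = b"iOS 9.3 update image"
sig = sign(firmware)
print(verify(firmware, sig))              # genuine update: accepted
print(verify(b"tampered firmware", sig))  # modified image: rejected
```

Because verification needs only the public values, every phone can check an update without being able to forge one; that is why tricking the vendor into signing malware, rather than breaking the math, is the realistic attack.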

But it will be hard for Apple to maintain high security standards if it is forced to create the hacking software the FBI has demanded. In its legal filing, Apple notes that “there are hundreds of demands to create and utilize the software waiting in the wings. If Apple creates new software to open a back door, other federal and state prosecutors — and other governments and agencies — will repeatedly seek orders compelling Apple to use the software to open the back door for tens of thousands of iPhones.”

Manhattan District Attorney Cyrus Vance alone has said there are at least 175 iPhones he’d like to have unlocked.

Apple argues that if it is forced to create a hacked version of its software, that would “force Apple to take on the task of unfailingly securing against disclosure or misappropriation the development and testing environments, equipment, codebase, documentation, and any other materials relating to the compromised operating system.”

The statement continued: “Given the millions of iPhones in use and the value of the data on them, criminals, terrorists, and hackers will no doubt view the code as a major prize and can be expected to go to considerable lengths to steal it, risking the security, safety, and privacy of customers whose lives are chronicled on their phones.”

We can assume that once Apple created the iPhone hacking software the FBI wants, hackers and foreign intelligence agencies would work to gain access to it. We can also expect that foreign governments will try to hack into the database Apple uses to track these requests — just as China hacked into a Google database that tracked Gmail surveillance requests. They might try to impersonate obscure local law enforcement agencies and submit forged court orders. Apple would need to hire new employees to process and investigate the growing volume of law enforcement requests — which would mean more people who could be subject to bribery or blackmail.

Apple is a sophisticated company with a lot of security expertise. Maybe it will be able to withstand all these attacks and emerge with users’ privacy unscathed. But it’s a pretty big risk to take.


