This Tuesday Apple unveiled a new line of phones to much fanfare, but one feature immediately fell under scrutiny: FaceID, a tool that would use facial recognition to identify individuals and unlock their phones.
Jake Laperruque (@jakelaperruque) is senior counsel for privacy and security issues at The Constitution Project. He previously served as a fellow for New America’s Open Technology Institute and The Center for Democracy and Technology.
Unsurprisingly, this raised major anxiety about consumer privacy, given its profound ramifications: Retailers already crave facial recognition to monitor consumers, and without legally binding terms, Apple could use FaceID to track consumer patterns at its stores, or develop and sell that data to others. It’s also possible that police would be able to unlock phones without consent more easily, by simply holding an individual’s phone up to his or her face.
But FaceID should create fear about another form of government surveillance: mass scans to identify individuals based on face profiles. Law enforcement is rapidly increasing its use of facial recognition; one in two American adults is already enrolled in a law enforcement facial recognition network, and at least one in four police departments have the capacity to run face recognition searches. But until now, co-opting consumer platforms hasn’t been an option. While Facebook has a powerful facial recognition system, it doesn’t maintain the operating systems that control the cameras on phones, tablets, and laptops that stare at us every day. Apple’s new system changes that. For the first time, a company will have a facial recognition system with millions of profiles, and the hardware to scan and identify faces throughout the world.
And this could in theory make Apple an irresistible target for a new type of mass surveillance order. The government could issue an order to Apple with a set of targets and instructions to scan iPhones, iPads, and Macs to search for specific targets based on FaceID, and then provide the government with those targets’ locations based on the GPS data of devices that receive a match. Apple has a good record of fighting for user privacy, but there’s only so much the company could do if its objections to an order are turned down by the courts. (On Wednesday Sen. Al Franken (D-Minnesota) released a letter to Apple CEO Tim Cook, asking how the company will handle the technology’s security and privacy implications.)
Over the last decade the government has increasingly embraced this type of mass scan method. Edward Snowden’s disclosures revealed the existence of Upstream, a program under FISA Section 702 (set to expire in just a few months). With Upstream, the NSA scans all internet communications going into and out of the United States for surveillance targets’ emails, as well as IP addresses and what the agency has called cybersignatures. And last year Reuters revealed that Yahoo, in compliance with a government order, built custom software to scan hundreds of millions of email accounts for content that contained a digital signature used by surveillance targets.
To many, these mass scans are unconstitutional and unlawful, but that has not stopped the government from pursuing them. Nor have those concerns prevented the secretive FISA Court from approving the government’s requests, all too often with the public totally unaware that mass scans continue to sift through millions of Americans’ private communications.
Until now text has been the focus of mass scan surveillance, but Apple and FaceID could change that. By generating millions of face prints while simultaneously controlling the cameras that can scan and identify them, Apple might soon face a government order to turn its new unlocking system into the killer app for mass surveillance.
What should Apple—and the rest of us—do to respond to this risk? First, Apple should take every step possible to insulate itself from an overly broad government order to conduct mass scans for faces. Face prints developed through FaceID should be stored only locally on devices, and should be fully encrypted so that the company cannot access them remotely, even if legally compelled to surreptitiously take control of an iPhone.
However, the unresolved fight between Apple and the FBI over encryption makes this an unreliable remedy. Therefore, Apple should also update its Transparency Reports to include data on whether it receives orders to turn over facial recognition profiles, or to conduct facial recognition scans, leaving a so-called warrant canary to serve as an alarm bell if it receives a troubling order related to FaceID in the future.
Finally, and more broadly, the public should demand that Congress rein in the government’s ever-growing affinity for mass scan surveillance. Limiting or outlawing the controversial Upstream program when the authority it’s based on expires this December would be an excellent start, but facial recognition scans may soon be as big a component of mass surveillance, and the public needs to be ready.
WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints. Read more opinions here.