Facial Recognition Takes Off At Airports. Privacy Experts Want It Grounded.
In the not-too-distant future, you could walk into the international terminal of almost any US airport and board a plane without showing anyone your passport.
At the check-in counter, you’d pose in front of a camera that scans your face and sends your image to a remote system that matches it to a stored copy of your passport photo. You’d have your photo taken again at the security line, and again at the gate. If everything still matched up, you’d be on board, quietly warring with your seat mate over the armrest.
Part of that automated future has already rolled out. The US Customs and Border Protection program, called Biometric Exit, includes a face-matching system and is used at departure gates in 17 airports in the US.
And that’s just the start. The agency plans to have the system scan 97 percent of all outbound international travelers by 2021. Airlines and the Transportation Security Administration are also testing facial recognition cameras throughout airports, meaning you might someday be able to travel without interacting with another human being at all.
“US Customs and Border Protection is changing the face of travel with its cloud-based facial biometric matching service,” the agency says in a pamphlet explaining the technology. “This matching service is envisioned to replace the need to manually check paper travel documents by providing an automated identity verification process everywhere a traveler shows their travel document across every step in the travel continuum.”
There may be no more dramatic example of the tension between convenience and privacy inherent in facial recognition than the prospect of giving up your identity to clear through security faster. That benefit, after all, comes at a cost. Academic research has shown that facial recognition algorithms have error rates that vary depending on a person’s race or gender, meaning some groups could face extra screening more often than others. The technology can be used without your knowledge. And the unalterable data that facial recognition systems collect — an image of your face — raises concerns that your movements can be tracked over the course of your life if the records are kept indefinitely.
CBP says facial recognition technology has the potential to make travel both more convenient and more secure because it creates a digital template that’s unique to you. Machines are getting faster at matching faces and, CBP says, do a better job than humans do.
But critics say CBP has already pushed — and possibly broken — the boundaries of US law. One specific complaint: The technology has debuted in airports without being subject to a public comment period.
The potential for multiple government agencies to track you using facial recognition is real, says Jeramie Scott, senior counsel at the Electronic Privacy Information Center. In recently released documents obtained by his organization, CBP noted that Immigration and Customs Enforcement and the US Coast Guard are among the other government agencies with an interest in the photos of foreign nationals gathered at airports.
“When you create the infrastructure for widespread use of facial recognition, people will find additional ways to use it,” Scott said.
CBP says the program isn’t intended for surveillance and was implemented in accordance with the law. US citizens can opt out, and the agency doesn’t keep their images long term.
How it works
The system is designed to be a snap: The airport or airline you’re flying with takes your picture at the gate, sending the encrypted image to CBP’s Traveler Verification Service system, which runs on a cloud server. There, CBP’s face-matching algorithm confirms that the person in the image is the same one that’s in your passport photo.
The system does this by creating a biometric template based on the passenger’s photograph. The template is a set of measurements of the size and shape of features, like eyes, and the distance between features, like your nose and upper lip. The system compares that template to a preloaded gallery of passenger photos, pulled from passports and other sources.
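CBP hasn’t published the details of its matching algorithm, so the sketch below is only a rough illustration, in Python, of the comparison described above. The gallery, passenger IDs, measurement values and distance threshold are all hypothetical, and a production system would use learned face embeddings rather than a handful of hand-picked measurements.

```python
import numpy as np

# A biometric "template" here is just a vector of facial measurements, as the
# article describes: sizes and shapes of features and the distances between
# them. The real Traveler Verification Service algorithm is not public; this
# is a generic, simplified illustration.

def verify(live_template, gallery, max_distance=2.0):
    """Compare a live gate photo's template against a preloaded gallery.

    Returns the closest gallery entry if it falls within max_distance,
    otherwise None, which would mean falling back to a manual document check.
    """
    live = np.asarray(live_template, dtype=float)
    best_id, best_dist = None, float("inf")
    for passenger_id, stored in gallery.items():
        dist = np.linalg.norm(live - np.asarray(stored, dtype=float))
        if dist < best_dist:
            best_id, best_dist = passenger_id, dist
    return best_id if best_dist <= max_distance else None

# Hypothetical gallery built from passport photos for one departing flight.
gallery = {
    "passenger_001": [34.1, 52.7, 18.9, 61.3],
    "passenger_002": [29.8, 49.2, 21.4, 58.0],
}

print(verify([34.0, 52.5, 19.1, 61.0], gallery))  # -> passenger_001
print(verify([40.0, 45.0, 25.0, 50.0], gallery))  # -> None: manual check
```

The threshold is the key design choice: set it too loose and impostors slip through; set it too tight and legitimate travelers get flagged and sent to manual checks, which is where the error-rate concerns discussed below come in.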
Airlines including JetBlue, British Airways and Delta, along with organizations that run airports in cities like Miami and San Jose, California, have already partnered with CBP to implement the system. The airlines or airports own the cameras and take passenger photos. In Atlanta, Delta Air Lines has introduced facial recognition checks throughout its international terminal in partnership with CBP and TSA.
CBP says it keeps photos of US citizens in the Traveler Verification Service system for 12 hours, and photos of noncitizens for 14 days. It also sends photos of noncitizens to the US Department of Homeland Security’s IDENT database, which stores information on visitors to the US for 75 years.
The airlines and airports aren’t allowed to keep copies of the photos and must immediately purge them from their systems, according to CBP. However, they’re allowed to use other photos they take with the same cameras for commercial purposes. That means they could take a second photo and use it in their own facial recognition system to target ads to you.
Airlines and airports are required to tell CBP if they plan to use photos for commercial purposes. So far, none has. JetBlue and Delta said separately that they don’t have plans to use facial recognition for commercial purposes, and added that their cameras capture images only when a passenger stands in front of the camera and actively triggers the scan. A spokeswoman for Mineta San Jose Airport says it only facilitates the CBP program and doesn’t have access to the photos.
Legal authority
The program has raised questions about its legality. The American Civil Liberties Union, the Electronic Privacy Information Center, the Electronic Frontier Foundation and other organizations say there’s no law that allows CBP to collect biometric information on US citizens, regardless of how long it’s stored. What’s more, they say, the agency’s choice to use facial recognition, rather than biometrics like fingerprints, is unnecessarily invasive.
CBP says a number of laws allow it to gather biometric information. For example, the law that established the Department of Homeland Security, CBP’s parent agency, gives it the authority to use technology to get all the data it needs for the biometric entry and exit program.
Congress first ordered the collection of biometric data from foreign nationals entering and exiting the country in 1996. It ordered the creation of a biometric entry-exit program in 2002 and authorized funds for it in 2016. However, privacy law experts point out that while some of the laws CBP cites apply specifically to noncitizens, none of them explicitly references situations involving citizens.
“US citizens have been conspicuously absent from the statutory text of every law under this program for the last 14 years,” according to a 2017 report by the Georgetown Law Center on Privacy & Technology.
If a US citizen opts out of the Biometric Exit program, the passenger can have his or her travel documents and passport checked by an airline employee, CBP says. If something doesn’t check out, the airline can ask a CBP officer for assistance.
Agencies like CBP typically go through a rule-making process to explain how new programs implement the underlying law and get feedback from the public. That hasn’t happened with the Biometric Exit program.
In a proposed rule, CBP seeks to require US citizens to submit to facial recognition both when boarding and when re-entering the US.
Simply asking people whether they’re citizens before requiring them to submit to facial recognition would lead to people lying, CBP says. The agency says that biometric facial recognition is more accurate than humans checking passport photos against a person’s face, so it’s better at catching people traveling fraudulently with US passports. The system caught three imposters at Washington Dulles Airport in a period of 40 days, according to CBP.
Error rates
Academic studies have shown that some facial recognition algorithms are less accurate for certain groups of people, depending on skin color and gender. One study found that commercial facial recognition products from companies like Amazon and Microsoft had higher error rates for black women. Other algorithms were more likely to return false negatives for white men, according to research from the National Institute of Standards and Technology.
A false negative — the algorithm incorrectly says your face doesn’t match your photo — has the potential to make travel more inconvenient for legitimate passengers. It would be unfair for some groups of people to face additional screening more often than others, advocates say.
“We’re talking about something that discriminates based off of what you look like,” said Mana Azarmi, policy counsel at the Center for Democracy and Technology.
Without oversight, it would be hard to know whether the system has higher error rates for some groups of people. CBP is working with NIST and the DHS Science and Technology Directorate to monitor problems, the agency says. CBP hasn’t revealed its false negative or other error rates, or how they affect different groups. CBP told BuzzFeed that the system confirms 98.6 percent of passengers who go through Biometric Exit.
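A small worked example shows how a single average can hide the disparities advocates worry about. The sketch below anchors the overall rate to the 98.6 percent figure CBP cited, but splits it across two hypothetical groups of travelers; the group labels and counts are invented for illustration, not CBP data, and the calculation treats every unmatched passenger as a genuine traveler the algorithm failed to recognize (a false negative).

```python
# Hypothetical per-group tallies of automated matches at the gate. Only the
# overall rate is anchored to the 98.6 percent figure CBP cited; the per-group
# split is invented to show how an average can mask unequal error rates.
tallies = {
    "group_a": {"matched": 4960, "not_matched": 40},
    "group_b": {"matched": 4900, "not_matched": 100},
}

def false_negative_rate(counts):
    """Share of genuine passengers the system failed to match automatically."""
    total = counts["matched"] + counts["not_matched"]
    return counts["not_matched"] / total

for group, counts in tallies.items():
    print(f"{group}: {false_negative_rate(counts):.1%} sent to manual checks")

overall_missed = sum(c["not_matched"] for c in tallies.values())
overall_total = sum(c["matched"] + c["not_matched"] for c in tallies.values())
print(f"overall: {1 - overall_missed / overall_total:.1%} confirmed")

# group_a: 0.8% sent to manual checks
# group_b: 2.0% sent to manual checks
# overall: 98.6% confirmed
```

In this invented split, travelers in the second group are pulled aside for manual document checks two and a half times as often as those in the first, even though the system still confirms 98.6 percent of passengers overall.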
Face forward
Even if CBP had full legal authority to collect biometric information from all passengers before they leave the country, privacy advocates think facial recognition isn’t the right biometric to collect.
That’s because facial recognition can be conducted without targets realizing it. The technology doesn’t require physical contact, as fingerprinting does, and it’s advancing to the point that low-quality photos taken from the side can be enough to identify someone. Once people realize that facial recognition is unavoidable at airports, they may be discouraged from traveling or from taking part in political activity, like the 2017 protests in US airports against travel bans, advocates say.
In a November report on the program, CBP said facial recognition is preferable because passengers perceive it as less invasive than fingerprinting.
But perception isn’t what matters, said Neema Singh Guliani, a senior legislative counsel at the ACLU.
“These are programs that have such an extreme effect on people’s rights,” she said, “and a process that’s not transparent.”