DISCERNING DATA

A Faegre Drinker Blog Covering the Latest in Privacy, Cybersecurity and Data Strategy


Georgetown Law Center Releases Report on Biometric Face Scans at Airport Departure Gates

The Georgetown Law Center for Privacy & Technology released a report, “Not Ready for Takeoff: Face Scans at Airport Departure Gates,” that takes a harsh look at the Department of Homeland Security (DHS)’s “Biometric Exit” program. The report highlights the myriad privacy and fairness issues associated with the use of biometric data for screening and other purposes. The Biometric Air Exit program, operated by DHS, uses photographs of passengers taken at the gate while boarding to verify travelers’ identities as they leave the country; it has been deployed at Boston’s Logan International Airport and eight other airports. Prior to the departure of an outbound international flight, DHS prepopulates the Traveler Verification Service (TVS) with biometric templates of the travelers expected on the flight. TVS either confirms a traveler’s face or rejects it as a “non-match.” Non-matched travelers’ credentials are then checked manually.
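As a rough illustration of the match/non-match flow described above, the sketch below verifies a gate photo against a prepopulated gallery of templates. The gallery, similarity measure, and threshold are hypothetical stand-ins for illustration only, not DHS’s or TVS’s actual implementation.

```python
# Hypothetical sketch of a gate-side verification flow: a gate photo is
# compared against templates prepopulated for the flight; below a similarity
# threshold, the traveler is flagged as a "non-match" for manual check.
from dataclasses import dataclass

MATCH_THRESHOLD = 0.9  # assumed similarity cutoff, not a real TVS parameter


@dataclass
class Traveler:
    name: str
    template: list[float]  # biometric template prepopulated before departure


def similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two templates (stand-in for a real matcher)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def verify_at_gate(gate_template: list[float], gallery: list[Traveler]) -> str:
    """Return the best-matching traveler's name, or 'non-match' for manual check."""
    best = max(gallery, key=lambda t: similarity(gate_template, t.template))
    if similarity(gate_template, best.template) >= MATCH_THRESHOLD:
        return best.name
    return "non-match"  # credentials would then be checked manually


gallery = [Traveler("A", [1.0, 0.0]), Traveler("B", [0.0, 1.0])]
print(verify_at_gate([0.9, 0.1], gallery))   # close to A's template -> "A"
print(verify_at_gate([0.5, 0.5], gallery))   # ambiguous -> "non-match"
```

Note that the choice of threshold drives the trade-off the report criticizes: a looser threshold admits imposters, while a tighter one falsely rejects legitimate travelers who resemble others.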

The Georgetown Law report takes issue with DHS’s claim that the program is designed to detect “visa overstay travel fraud,” noting that the scope of the problem has not been properly established and therefore does not necessitate such a resource-intensive solution. Visa overstay fraud occurs when a foreign national who wishes to remain in the U.S. past the expiration of his visa arranges for a conspirator to leave the country in his place using the visa holder’s credentials, thereby creating an exit record. In addition, the report questions whether the program complies with federal law, because it has not been specifically authorized by Congress and DHS has not engaged in an appropriate rulemaking proceeding.

The report also highlights the challenges of measuring the effectiveness of facial scanning programs, noting that since February 2017 the National Institute of Standards and Technology (NIST) has tested more than 35 different face recognition algorithms designed to verify identities. That research indicates that face recognition systems have a hard time distinguishing among people who look alike, which could lead to falsely matching individuals who are similar in appearance. This suggests that such programs may not perform well if the goal is to screen for imposters. Finally, the report notes that DHS relies heavily on airlines and technology vendors for the central components of the program, and recommends that airlines and other vendors become aware of the potential privacy and fairness issues associated with biometric screening.

The use of biometric data for a variety of purposes will likely only increase in the near future.  We will continue to monitor this topic and provide updates.

The material contained in this communication is informational, general in nature and does not constitute legal advice. The material contained in this communication should not be relied upon or used without consulting a lawyer to consider your specific circumstances. This communication was published on the date specified and may not include any changes in the topics, laws, rules or regulations covered. Receipt of this communication does not establish an attorney-client relationship. In some jurisdictions, this communication may be considered attorney advertising.


January 9, 2018
Written by: Discerning Data Editorial Board
Category: Cybersecurity, NIST
Tags: Biometrics, department of homeland security, NIST



©2023 Faegre Drinker Biddle & Reath LLP. All Rights Reserved. Lawyer Advertising.
