TSA SCORECARD

The Transportation Security Administration (TSA) has expanded facial recognition technology to at least 25 airports across the U.S. We are collecting information on your experience with facial recognition at TSA checkpoints; this Algorithmic Justice League (AJL) survey will help us better understand those experiences. #InPlaneSight

Know Your Rights

TSA agents must inform passengers of their rights, and there must be clearly visible signage notifying passengers of their ability to proceed without a facial identification scan.

Facial identification scans are not mandatory. Travelers opting out of this program should not face additional consequences, such as extra screenings, pat-downs, interrogations, or even detention, beyond what they would encounter at an airport without facial recognition.

Documented Issues & Concerns

Every day, thousands of people feel forced to choose between traveling and safeguarding the privacy of their faces.


Racial Discrimination

A 2019 study by the National Institute of Standards and Technology tested photos of over 8 million people and found that Asian and African American people were up to 100 times more likely than White men to be misidentified by facial recognition technology.


Privacy Concerns

What happens if my face data is stolen or someone else breaks into my account? In 2019, Department of Homeland Security photos of travelers, which are used in the agency’s facial recognition program, were stolen in a data breach.


Opt-Out Consequences

While TSA claims that facial identification scans are not mandatory, it is unclear how travelers will know that they can “opt out,” or what the consequences are for travelers who choose to do so.

Please Know

In order to fulfill our stated mission and better serve our AJL communities, we want to get to know our AJL ecosystem better. Your responses are optional and will be protected according to our Privacy Policy. Your answers will help AJL create more inclusive content and strengthen our advocacy efforts.

If you are aware of algorithmic/AI harms, biases, or triumphs, please fill out this form. We are committed to holding companies and institutions accountable and to safeguarding the public from AI harms.

Your Privacy

We take your data privacy seriously. We will be happy to tell you what data we have about you and to delete it if you would like. We do not sell data to anyone. If you would like a copy of all the data we have about you, please email contact@ajl.org with “Data Access Request” in the subject line. If you would like us to delete all the information we have about you, please email contact@ajl.org with “Data Deletion Request” in the subject line. You can read more in our Privacy Policy.

AJL cares about you

If you are considering self-harm or suicide, please leave this site and dial 988 immediately.

You believe you have been harmed by AI 

If you believe you’ve been harmed by Artificial Intelligence, please fill out the form. We will get back to you within 48 hours on weekdays and within 72 hours over the weekend.

You are seeking advice

If you are seeking legal advice or representation, consider reaching out to an ACLU office in your state.