
States of Biometric Exception: Gendered-Colonial (In)security Under the New EU Entry-Exit System (EES)
By Christoffer Koch Andersen and Mallika Dharmaraj
Published January 28, 2026
Keywords: Artificial Intelligence, EU Entry-Exit System, algorithmic surveillance, colonial gender insecurity, biometric border technologies
Introduction
Around the world, nation-states are increasingly adopting biometric identification systems at their borders. In a moment of rising global xenophobia and fascist nationalism, technologies such as facial recognition, retina scans, and digital fingerprinting have surged in popularity as means of policing mobility and capturing bodies, now cloaked in a false veneer of ‘AI’s’ technological neutrality. These are not new phenomena: biometrics have long been fundamental to the way nations imagine and reify themselves, as per Keith Breckenridge’s formulation of the ‘biometric state’ (2014), which draws on a longer colonial, racist, and casteist longue durée of surveillance innovation. However, the particularities of biometric capture today are worth situating more precisely within contemporary cycles of capitalist hype and the promises of national security that ‘AI’ affords to ordinary ‘good’ citizens, wherein the making of one’s security and citizenship within the nation-state is always founded on the invisible displacement and unmaking of others. This development becomes especially vital to attend to with the introduction of the new, cross-national biometric European Entry-Exit System (EES), which poses significant risks for gendered and racialised minorities and their mobility within Europe.
The EU Entry-Exit System (EES)
The EU’s new flagship digital Entry/Exit System (EES) for border management and immigration purposes, which launched on October 12, 2025, mandates the submission of non-European travellers’ biometric data and allows for data-sharing between European nation-states (EU, n.d.-a). The EES collects and stores: 1) travel document data; 2) the date and place of each entry and exit; 3) a biometric facial image and fingerprints; and 4) whether entry was refused. From this biometric data, biometric templates are created and stored in a transnational shared Biometric Matching Service (sBMS). While the system is stated to ‘ease travel’ and ‘modernise border management by increasing efficiency and quality of processes at the border’ (EU, n.d.-a), it alters infrastructures of movement, infringes on privacy, and makes the submission of facial image and fingerprint data effectively compulsory for anyone who wants to avoid being refused at the border. Yet critical accounts that unpack the EES are severely underrepresented, the issues arising from it underreported, and its implementation lacking in public awareness. Among its stated goals, the EES intends to 1) ‘identify travellers who are using fake identities or passports’ and 2) ‘help prevent, detect and investigate terrorist offences and other serious crimes’ (EU, n.d.-b). From this point of inception, we wish to draw attention to the surveillant dangers of the EES and to the ways that this facial recognition data-sharing may create structural vulnerabilities and exacerbate systems of discrimination for marginalised populations entering and exiting Europe, since travellers’ biometric data can be shared with Europol, international organisations, and visa and immigration authorities across European countries.
Given that the sBMS allows for transnational biometric data searches, we draw urgent attention to two case studies, each corresponding to a severely at-risk minority population: trans people (whose biometric data often contain inconsistencies that increase the risk of being marked as ‘fraudulent’), and pro-Palestine protesters (who are increasingly criminalised through state-sponsored facial recognition technologies under new counterterrorism legislation). These two cases, we argue, are illuminating in the context of larger global techno-authoritarian attempts at policing gender and banning pro-Palestine protest.
Situated within a framework of technological surveillance, the EES forms what we might come to understand as a biometric state of exception (Agamben 2005), in which the constructed threat of enmity, located in racialised and gendered minorities, necessitates and legitimises an overarching algorithmic surveillance that reinforces European colonial control in datafied form. This speaks to the necropolitical significance underlying the EES and its fabrication of an ‘othered’ gendered and racialised enemy portrayed as an existential outsider threat. Such a portrayal enhances an exceptionalist nation-state logic: the constructed threats of foreign invasion, security risks, and terrorism are to be contained through the protection supposedly leveraged by biometric data systems and their ability to fortify European borders against unwanted outsiders. This threat construction, in turn, justifies the overhaul towards a transnational biometric border system.
(Mis)identifying ‘Fraud’ at the Border: Trans People, Data and Passports
Trans people often have discrepancies in their biometric data that put them at risk of being marked as fraudulent or threatening by identification documents and algorithmic technologies (Beauchamp 2019; Costanza-Chock 2018; Hall and Clapton 2021; Keyes 2018; Shah 2023). This plays into larger algorithmic assemblages of securitised ideas around gender, in which algorithmic body scanners configure trans bodies as ‘deviant threats’ because they do not fit binary bodily templates. When the EES is legitimised through its ability to ‘identify travellers who are using fake identities or passports’ (EU, n.d.-b), it puts trans people at risk of extended violence (Andersen 2025). Historically, trans people have been slandered as deceptive and as ‘faking’ their identities (see e.g. Bettcher 2007), which, taken together with discrepancies in biometric data, identification documents, and algorithmic technologies, has an alarming potential to further endanger trans people at the border amidst a surge of global authoritarianism that targets ‘trans’ as an object of fear, criminalises trans bodies, and removes trans rights.
The EES promises to stop and identify deceptive travellers using fake identities (EU, n.d.-a), but how does this detection work, and who falls under the category of ‘fraudulent traveller’? The vague definition of ‘fake identities or passports’ gives leeway for arbitrarily drawn lines that define anyone deviating from static notions of identity as fraudulent, and it curates hostile grounds for targeting trans people as presenting ‘fake’ identities and carrying ‘fake’ passports. This renders trans people, and the ‘fraud’ they are alleged to embody, a threat to (inter)national security, which in turn re-legitimises the need for biometric surveillance. The EES’s intention to identify attempts at faking identities and to detect fraudulent documents across borders through the sBMS leaves trans people, and gendered minorities historically targeted by frameworks of surveillance, disproportionately vulnerable to biometric tracking now enacted on a transnational scale, which not only restricts their cross-national mobility but also impinges on their safety at the border.
Policing Pro-Palestine Protest: Counter-Terrorism as a Racist Framework
A second key stated goal of the EES is to ‘help prevent, detect and investigate terrorist offences and other serious crimes’ (EU, n.d.-b). Throughout the twenty-first century, criminality and terrorism have been crucial nodes within state lexicons across the West, produced by and productive of racial supremacy, Islamophobia, and Orientalist imaginaries of the War on Terror (Puar 2007). Now, twenty-four years after 9/11, the language of reducing serious crime and eliminating terrorism has become familiar and commonplace to the ordinary citizen-subject within the EU, a means of guaranteeing a certain sense of national belonging and safety. Informed by decades of pop-cultural campaigns that have railed against the threat of doom that the ‘monstrosity of Palestine and Islam’ represents, there has come to be a certain cultural buy-in, where EU denizens are already preconditioned to support national projects of defence and securitisation against an amorphous-but-always-racialised enemy (Puar 2007, 16).
Within the last two years, these discourses have only intensified, as the scope of counter-terrorism legislation has vastly expanded in response to pro-Palestine dissent. Increasingly, state-sponsored facial recognition technologies are deployed to identify, control, and ultimately quash the political activity of pro-Palestine protesters (Borak 2024; Amnesty International 2024). An absurd yet incessant level of media propaganda has vilified protests for Palestinian rights as ‘hate marches’ and falsely conflated anti-Zionism with antisemitism in concerted campaigns of doxxing (Syal et al. 2023). Now, the EES’s intention to ‘prevent’ and ‘detect’ terrorist offences across borders through the sBMS leaves pro-Palestine protesters, and others historically targeted by racist counter-terrorism frameworks, even more vulnerable to biometric monitoring, now at a transnational scale, and, accordingly, to carceral limitations on their mobility.
Conclusion
Ultimately, the EES’s concerningly broad scope illuminates the risks of cross-national, algorithmically augmented surveillance systems. Given the EU-wide reach of the sBMS as well as the fraught intentions of fraud detection and counter-terrorism, trans people and pro-Palestine protesters remain at risk of targeting, prosecution, and forced immobility. While biometric surveillance as a means of border enforcement is far from new, the threat that automated systems now pose at an international scale is particularly draconian. This issue demands that scholars and policymakers become aware of the gruesome lived implications that such biometric systems pose for marginalised communities, in terms of reworked surveillance and the differential treatment of bodies along gendered and racialised lines. Most of all, it means that we must redirect our attention to the frontline movements of marginalised communities, whose wisdom and counter-organising have continued to subvert violent systems of state control, even amidst the vilest repression.
References
- Agamben, G. (2005). State of Exception, translated by Kevin Attell. University of Chicago Press.
- Amnesty International (2024). Netherlands: Mass police surveillance of protests part of ‘growing control culture’ – new report. https://www.amnesty.org.uk/press-releases/netherlands-mass-police-surveillance-protests-part-growing-control-culture-new
- Andersen, C. K. (2025). Beyond Fairness: Trans Unliveability in European Algorithmic Assemblages. In Proceedings of the European Workshop on Algorithmic Fairness, PMLR 294: 295-302.
- Beauchamp, T. (2019). Going Stealth: Transgender Politics and US Surveillance Practices. Duke University Press.
- Bettcher, T. M. (2007). Evil deceivers and make-believers: On transphobic violence and the politics of illusion. Hypatia, 22(3), 43-65.
- Borak, M. (2024). London police deploy facial recognition during Palestine and Israel protests. https://www.biometricupdate.com/202401/london-police-deploy-facial-recognition-during-palestine-and-israel-protests
- Costanza-Chock, S. (2018). Design justice, A.I., and escape from the matrix of domination. Journal of Design and Science, 3(5).
- European Union. (n.d.-a). EES: What is the EES? https://travel-europe.europa.eu/ees/what-is-the-ees
- European Union. (n.d.-b). EES: Data held by EES – Which data are collected by EES? https://travel-europe.europa.eu/ees/data-held-by-ees#which-data-are-collected-by-ees
- Hall, L. B., & Clapton, W. (2021). Programming the machine: gender, race, sexuality, AI, and the construction of credibility and deceit at the border. Internet Policy Review, 10(4), 1-23.
- Keyes, O. (2018). The misgendering machines: Trans/HCI implications of automatic gender recognition. Proceedings of the ACM on human-computer interaction, 2(CSCW), 1-22.
- Puar, J. (2007). Terrorist Assemblages: Homonationalism in Queer Times. Duke University Press.
- Shah, N. (2023). I spy, with my little AI: how queer bodies are made dirty for digital technologies to claim cleanness. In Queer Reflections on AI (pp. 57-72). Routledge.
- Syal, R., Sabbagh, D., & Stacey, K. (2023). Suella Braverman calls pro-Palestine demos ‘hate marches’. The Guardian. https://www.theguardian.com/politics/2023/oct/30/uk-ministers-cobra-meeting-terrorism-threat-israel-hamas-conflict-suella-braverman
- Image credit: Gloria Mendoza / https://betterimagesofai.org / https://creativecommons.org/licenses/by/4.0/
