(GUEST) Abolishing Carceral Embodiments of Facial Recognition
We don’t need technology that expands the police state; we need abolition.
In NAZAR’s first guest post, organizer and researcher Priya Prabhakar unpacks the recent performances of big tech companies pledging to restrict or eliminate distribution of their facial recognition technologies.
📝 Monthly Round-Up
Black Skin, White Masks: Frantz Fanon’s Black Skin, White Masks is referenced in this piece and can be found online.
Interview with Simone Browne: Simone Browne is the author of Dark Matters: On the Surveillance of Blackness, a book that traces the historical lineage of the surveillance of Blackness, noting that it is not a new paradigm. In this interview, Browne talks about biometrics in relation to Blackness, and the tension between racialized surveillance and the potentially liberatory nature of being “unseen” by technologies.
Feminist Data ManifestNo: Drawing on the field of feminist data studies, the Feminist Data ManifestNo is written by feminist scholars influenced by Black, queer, trans, and Indigenous feminist thinkers; it opposes harmful data regimes and commits to radical futures of technological collectivism.
Last month, three of the largest multinational technology corporations in the world pledged to restrict or eliminate distribution of their respective facial recognition technologies. It began with IBM announcing it would stop offering, developing, and researching facial recognition, followed by Amazon declaring a one-year moratorium on selling its facial recognition technology (Rekognition) to police, and Microsoft saying it would stop selling facial recognition technologies to police until legislation is put in place.
Activists were quick to see through the corporate onslaught of performative announcements. As Myaisha Hayes, campaign strategies director for MediaJustice, explained: "We have no doubt that Amazon’s announcement is no more than a political stunt meant to quell widespread momentum and demands for the corporation to stop profiting from our oppression and cut ties with all law enforcement agencies." While these companies will no doubt continue their song and dance, it’s important to contextualize the social role of facial recognition, and of biometric surveillance as a whole, by examining its carceral roots and its perpetuation of racial capitalism.
Facial recognition is one of the most powerful forms of biometric surveillance, and in its current state it is fundamentally defined by neoliberalism and state-sanctioned violence. One of its most dangerous uses is the way it serves the carceral state to extend forms of punishment, especially by the police. The machine learning behind facial recognition has essentially codified a deeply racist physiognomy, propagated under the violence of perceived neutrality and objectivity. It is no surprise that some of the earliest facial recognition research was funded through front companies for the Central Intelligence Agency (CIA) to further its imperialist agenda, and that the technology soon moved on to serve law enforcement agencies and police departments in the expansion of the prison industrial complex.
Mainstream critiques of facial recognition tend to center the gendered and racial biases embedded within these algorithmic systems, based on both first-hand accounts and research. Researcher Joy Buolamwini, founder of the Algorithmic Justice League, found rampant algorithmic bias in all three aforementioned corporations’ facial recognition technologies, which exhibited error rates as high as 34% when identifying dark-skinned women. Terrifyingly, facial recognition software only recognized Buolamwini once she put on a white face mask, quite literally materializing the metaphor of Fanon’s Black Skin, White Masks, in which he traces the formation of a colonized Black psyche subjected to constant violence.
The problem, however, is not solely that biometrics consistently fails to identify people accurately. Biometric surveillance, along with other technosolutionist surveillance mechanisms, assumes a scientific objectivity that distances it from the social processes in which it is embedded. Instead of relying on individuals to identify themselves, as a matter of agency and autonomy, the algorithm has taken on the responsibility of verifying identity. Biometrics relies on essentialized features of one's body to promise a more accurate form of identification. Built on the presumed objectivity of the algorithm, and paired with the dangerous misconception that bodily features are essential and static, facial recognition is materialized through the interacting flows of systems of domination.
To understand how such systems function through racial capitalism, it is helpful to take Stuart Hall’s definition of race as a floating signifier, a discursive category that “gains their meaning not because of what they contain in their essence, but in the shifting relations of difference which they establish with other concepts and ideas in a signifying field. Their meaning, because it is relational and not essential, can never be finally fixed, but is subject to the constant process of redefinition and appropriation.” Facial recognition and other forms of biometric surveillance essentialize race and reinforce its technological basis, even more so when they are in the hands of the carceral state. This is what Simone Browne, extending Fanon, calls the digital epidermalization of identity: the oppressor codes corporealization into data and hides it under the guise of neutrality. The only way to be recognized, both literally and metaphorically, is by wearing a white mask.
Black feminists have paved the way for my, and many others’, abolitionist framework of the world. As Mariame Kaba, Ruth Wilson Gilmore, Angela Davis, and many other abolitionists have laid out: you cannot reform a system that was created to oppress. If facial recognition is rooted in a system of carcerality, we cannot rely on these forms of technology to liberate us, even if they are “reformed.” As abolition becomes a more tangible future for so many people during this crucial time of political struggle, we must also oppose the development of technologies that purport to “hold police accountable.” We don’t need technology that expands the police state; we need abolition. The problem is not only that these facial recognition technologies are less accurate at identifying Black faces; it is that corporations and the police state have any capability and capacity to identify at all, given that they are institutions created to surveil and to perpetuate anti-Black violence.
IBM, Microsoft, and Amazon halting their facial recognition technologies should not invite praise or accolades. These announcements are PR moves from billion-dollar companies with no real commitment to the struggle for liberation, and whatever concessions they contain were won only through the tireless, sustained activism that Black organizers have carried out for centuries. Corporate commitment to liberation would mean the abolition of the company itself and the expropriation and redistribution of its capital. After all, surveillance will persist as long as we exist under the capitalist and imperialist elite and their dependence on carcerality.
Tech companies continue to develop facial recognition technologies that operate on the logic of carcerality and capital accumulation. Regimes of surveillance inherit the systems they exist under, specifically racial capitalism and punitive justice. Liberation can only come through the abolition and destruction of the systems of domination that use technology and surveillance to their ends. Technology should instead be harnessed by the global majority, beginning with anti-surveillance forms of software and hardware that refuse claims of neutrality and proudly embrace a politics of collectivism.
📌 Organizing
Mask On Zone — Some comrades and I launched an anti-surveillance toolkit to protect protesters on the streets resisting white supremacy and the carceral state, and to protect them after returning home and when talking about the uprisings. Our purpose is to inform people about what to consider before, during, and after protesting, to build political education around anti-surveillance and digital safety, and to create a hub for anti-surveillance resources and tips in simple plain text that can be easily digested. We seek to develop a practice of anti-surveillance that finds necessary and creative ways to subvert the surveillance state. As we fight for a new world, we must also make political demands that promote trust, solidarity, mutual aid, and collective forms of living that render surveillance obsolete. Find us at @maskonzone on Twitter and Instagram as well.
Hacking and Hustling: Hacking and Hustling is a collective of sex workers and allies working at the intersection of social justice and technology. Drawing on the knowledge of sex workers, they have been extremely useful in providing an anti-surveillance praxis for protesters currently on the street.
Survived and Punished: Survived and Punished is a survivor-led abolitionist group co-founded by Mariame Kaba, who is referenced in the piece above. Their work focuses specifically on criminalized survivors, and they regularly engage in mutual aid projects that support those who are incarcerated.
Priya Prabhakar is an organizer, designer, researcher, and filmmaker from Chennai, India. She can be contacted by email at priyaprabhakar32@gmail.com or on Twitter: @priyaprabhakar.