Fabio Lattanzi Antinori
An image created by Artificial Intelligence (AI) might convince the last sceptic that we have completed the transition into a post-representational era, in which the abilities of (non-)human cognition and perception are being configured anew. Not only has it become harder to distinguish the real from the constructed in the world that objects, artefacts, climate change, capital or nation-states inhabit on this planet. (A planet that is sometimes, thankfully, also populated by non-hierarchical constructions that question and disturb the powerfully defined (b)orders of western Eurocentric thinking by offering alternative points of departure, such as Angela Davis's thoughts on solidarity and empathy or Donna Haraway's notions of situated knowledges and tentacular thinking.) At the same time, we have entered an era in which we can literally make anyone say anything. Just think of the photorealistic video of Barack Obama spreading fake news, programmed entirely with Adobe After Effects and the AI face-swapping tool FakeApp (see Buzzfeed Video | You Won't Believe What Obama Says In This Video!; other AI, in turn, fails to detect these "deepfakes" as fake, see for example Motherboard | There Is No Tech Solution to Deepfakes: funding technological solutions to algorithmically generated fake videos only puts a bandage on the deeper issues of consent and media literacy). It would be a nightmare to think of this as the arrival at the outermost border of language, a place in which de Saussure's langue and parole would no longer be distinguishable but would have collapsed into each other. But where did the good old-fashioned image disappear to, in a time in which anyone can also make any kind of image?
How did the image come to work in a factory, employed in the operative logistics department of "Informatic Flows Inc.", specialised in capturing, editing, detecting, managing and forecasting (the behaviour of) identities and populations? Have we really entered a time in which the image, just like the Internet (Hito Steyerl | Too much world: Is the Internet dead? | In: The Internet Does Not Exist | e-flux journal | Sternberg Press | 2015), has become undead, constantly dying and being reanimated by selfies, algorithms and metadata? The time this possibly happened was when John Daugman, Professor of Computer Vision and Pattern Recognition at Cambridge University, patented automatic iris recognition in the early 1990s and opened his invention up for commercial global implementation. This helped, for instance, India to biometrically enrol the eyes of almost all the people on its territory (together with their fingerprints) in a national ID and welfare entitlements distribution system. Let us assume this was the time when images (of eyes and faces) started to serve as template entrance gates to a new form of automated control and value extraction that exploits measured body parts: your eyes, face, fingerprint or palm. Should the question not rather be: to which site has this (biometric) image taken our body parts, and how can we see ourselves properly then, without eyes?
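To make concrete what this patent reduced the eye to: Daugman-style iris recognition encodes each eye as a binary "iris code" and declares a match when the fraction of differing bits between two codes (their normalized Hamming distance) falls below a fixed threshold. The sketch below is purely illustrative; the bit length and threshold are typical published values, not any vendor's actual implementation.

```python
# Illustrative sketch of iris-code matching by normalized Hamming distance.
# Assumed parameters: 2048-bit codes and a 0.32 decision threshold are
# typical published figures, not a specific system's configuration.
import random

IRIS_CODE_BITS = 2048
MATCH_THRESHOLD = 0.32

def hamming_distance(code_a, code_b):
    """Fraction of bit positions at which the two iris codes differ."""
    differing = sum(a != b for a, b in zip(code_a, code_b))
    return differing / len(code_a)

def is_same_iris(code_a, code_b, threshold=MATCH_THRESHOLD):
    """Declare a match when the codes are closer than the threshold."""
    return hamming_distance(code_a, code_b) < threshold

random.seed(0)
# A stored template, a noisy rescan of the same eye (about 5% of bits
# flipped by imaging noise), and an unrelated stranger's code (which
# agrees with the template on only about half the bits by chance).
template = [random.randint(0, 1) for _ in range(IRIS_CODE_BITS)]
rescan = [b if random.random() > 0.05 else 1 - b for b in template]
stranger = [random.randint(0, 1) for _ in range(IRIS_CODE_BITS)]

print(is_same_iris(template, rescan))    # prints True
print(is_same_iris(template, stranger))  # prints False
```

The essay's point sits precisely here: the "image" of the eye that circulates through such systems is not a picture at all but a bit string, legible only to the matcher.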
Biometric devices are machines that automatically extract patterns from the iris. Eyeguard is one of them. The device looks rather like a mask you would wear to a masked ball, except that the holes for the eyes are not narrow slots. In fact, they are much bigger, even leaving space for the eyebrows. Along this long hole, tiny orange lamps are mounted, probably intended to indicate diligent functioning, but rather reminiscent of a Star Wars gadget. In the advertising brochure, Eyeguard is presented with a white-skinned woman whose eyes are rouged. Mostly installed on a tripod to minimise shaky photos while your eyes are scanned, "the world most accurate dual-eye iris camera system is designed with a large number of people to be enrolled" (see Iris Guard's Eyehood). This machine is also a mask, one that hides the backside of its violent logistics and the algorithms designed around its screens. Let us imagine that the tidily rouged eyes of this woman are replaced by two images: the pattern of a fingerprint and the pattern of a scanned iris, both behind the scanner, both organised as encoded bodies. These images are operative, programmable in the sense that they function as a deterritorialised factory to generate profit. More precisely, the face and the eye have become nodes in a global network, worksites of precarious informatic labour.
Eyeguard is produced by Iris Guard (Iris Guard Inc. is registered in the Cayman Islands and has recently signed an investor deal with Goldman Sachs). The company cooperates with the UN Refugee Agency, delivering the algorithm, the interface and the hardware to perform iris recognition in refugee camps. Since 2013, all people arriving in the Jordanian Zaatari or Azraq camps must register their irises. With the UN's new digital (and perfectly neoliberal) strategy of implementing biometric registration in its camps worldwide, around 300 registration sites have been installed and more than 2.4 million refugees scanned in a public-private partnership with Iris Guard and other vendors of biometric scanners. The scan is automatically converted into a barcode, which is saved in a database called Eyebank and also uploaded to a cloud server. From that moment, the refugee's identity can be automatically recognised from any location in the world that houses biometric machines, even remotely, at a checkpoint or at the airport for instance, and often without the individual's knowledge. The UNHCR's database can potentially track, tag, monitor and predict not only their consumer behaviour but also their movement. This mode of data mining is compulsory, since food and relief aid is in large part distributed through cash-based assistance: scanned irises now replace cash or bankcards. (The case of the Rohingya enrolment in Bangladesh is but one example of the perils UNHCR's Biometric Identity Management System imposes on vulnerable populations. See: Zara Rahman | Irresponsible data? The risks of registering the Rohingya. ID cards have brought little but pain to the Rohingya, and this time they're biometric | IRIN NEWS | 23 October 2017; Elise Thomas | Tagged, tracked and in danger: how the Rohingya got caught in the UN's risky biometric database | WIRED | 12 March 2018.) The barcode is an operative image, making sense only to machines. The stored information behind its object-recognition taxonomy is inaccessible to the individuals.
The humanitarian rationale of urgency ("something must be done") allows UNHCR to conduct political, medical and policing tests. The way it organises and commercialises its camps illustrates how it manages the undesirables.
Historically, new technologies and experiments have always been tested and carried out on minorities or on groups perceived as inferior. Camps serve as political-juridical grey areas, characterised by extraterritoriality, regimes of exception and marginalisation. Continuities and connections between the politics of biometrics today and colonial pseudo-scientific methods of measurement, such as Galton's fingerprinting or the Fowler family's phrenology, become visible as measurement genealogies embedded in "the racial calculus and the political arithmetic that were entrenched centuries ago", as Saidiya Hartman shows (Saidiya Hartman | Lose Your Mother: A Journey Along the Atlantic Slave Route | Farrar, Straus and Giroux | 2008).
The colonial and imperialist episteme that invented human races to legitimize slavery and exploitation still shapes biopolitical discourses today, albeit under the changed signs of efficiency, innovation and the logic of ever-expanding markets. The bodies of the global peripheries become experimental, precarious populations in huge Labcamps in which statistical, algorithmic and biometric technologies choreograph their performance. The precariousness of living in the camp, the control via algorithmic governance, the exploitation of refugees in the Labcamp as objects of experimentation and as producers of informatic labour turn the camp into a site of production, application and absorption of globalised computational capital (Jonathan Beller | The Message Is Murder: Substrates of Computational Capital | Pluto Press | 2018; Jonathan Beller | Informatic Labor in the Age of Computational Capital | In: Lateral, Journal of the Cultural Studies Association | 5.1 | 2016). On all these levels a new form of work is in progress: the image has become part of a logistics that generates surplus value from scarce resources, and it has hijacked the eyes and bodies of the people. This image is also greedy: it uses the .jpg of the scanned eyes as a springboard, the human iris as a data source. The pixel used to be the smallest unit of the digital image. Now the pixel has become a logistical doorway, a checkpoint (among so many others) in the networked, ever-expanding flows of information.
A checkpoint that is not a pixel, not a point anymore, but a border crossing that decides which bodies and which information are allowed to pass and which are not. It is the pattern (recognition) that makes your eye unique and makes you traceable around the world. The (algorithmic) image has made itself complicit in these logistics of precarious value creation. The algorithmic image is the raw material for the factory of the metadata society, a factory in which unwilling, immaterial labour feeds a political economy in the camp, one of automating humanitarian endeavours and turning refugee aid into a business structure. Gates, checkpoints, pixels are the logistical sites of a new epistemic space that is the eye of the algorithm (Matteo Pasquinelli | The Eye of the Algorithm: Cognitive Anthropocene and the Making of the World Brain). It leaves us to our own devices (Hito Steyerl | Left To Our Own Devices | Monoskop | 2015), if we can afford them, and to our bodies, while the eye of the algorithm continues to mathematize and normalize our moves effortlessly (just as in the case of the Afghan civilians who were acting "abnormal" in the logic of the algorithms: they were falsely categorised as terrorists and subsequently killed by drone strikes, information we have thanks to Edward Snowden; see SKYNET | Courier Detection Via Machine Learning), rendering bodies as mobile checkpoints and sources of capitalist value creation. Now that the image has been reduced to a mere functionary of the automated factory: How can we resurrect the .jpg file as something other than a functionary complicit in algorithmic exploitation? How can we repurpose the image and stop the cybernetic nightmare of networks that normalize, modulate and perform the futures of whole populations through pattern recognition, metadata and bursting databanks?
These populations were beforehand carefully constructed, as, in this essay, a "refugee population" is set up by UNHCR. The figure of the "refugee" identifies humans with their flight. The datafication of refugees turns them into digitized datasets. We have to be aware of what this means: it is a practice that works with statistical modelling techniques. It involves quantification, classification, and the construction of individuals and of populations that can be managed. The constructed categories are never impartial or objective but embedded in specific, local and differing socio-political contexts. Humans are made refugees by external events; it is not a "natural" or a biological fact.
Today, we are also facing the merging of databanks into metadatabanks, a centralized storage of information which enables an even more intrusive rendering of lives open to intervention. Aadhaar, the twelve-digit number assigned to every biometrically registered person in India, is a recent example. This number can basically be used by all kinds of third parties such as banks or tech companies like Amazon, Facebook, Google or Microsoft.
“As Aadhaar identification became integrated into other systems like banking, cell phones and government programs, tech companies can use the program to cross-reference their datasets against other databases and assemble a far more detailed and intrusive picture of Indians’ lives. That would allow them, for example, to better target products or advertising to the vast Indian population. ‘You can take a unique identifying number and use it to find data in different sectors,’ explained Pam Dixon, executive director of the World Privacy Forum, an American public interest research group. ‘That number can be cross-walked across all the different parts of their life’.”
Paul Blumenthal and Gopal Sathe | India’s Biometric Database Is Creating A Perfect Surveillance State—And U.S. Tech Companies Are On Board | Huffington Post | 25 August 2018
One strategy is to re-tie, as I have tried to here, this technology that pretends to have no history with its violent colonial pasts and capitalised present. (Claude Shannon, the famous founding father of information theory, popularised the modern meaning of information as a pure carrier, detached from sense, significance or origin, and thus, not surprisingly, has much in common with technology's apologists of today. Sandy Pentland of the Massachusetts Institute of Technology (MIT), for instance, believes that big data and pattern recognition dig up "social physics" as law: "Social physics uses mathematic models and machine learning to understand and predict crowd behaviors." | MIT News | 19 December 2017. Ironically, all these claims constantly create worlds of buggy beta versions, failed targets, civilian life classed as abnormal, stupid AIs, misrecognized objects and apophenia, which indicates nothing but the self-reflexivity and non-intelligence of technology's tools like smart backpropagation: beyond what is fed to a neural network, it (still) cannot recognise anything. See Matteo Pasquinelli | Anomaly Detection: The Mathematization of the Abnormal in the Metadata Society | 2015; Kate Crawford and Hito Steyerl | Data Streams | The New Inquiry | 23 January 2017.) Colonies served as fantastic projection screens onto which Europe mapped its dreams and fears, as Nikita Dhawan said recently at the Akademie der Künste in Berlin at the opening of the symposium Colonial Repercussions, "Planetary Utopias – Hope, Desire, Imaginaries in a Post-Colonial World": not only were the colonies a huge and fantastic projection screen, they also functioned as laboratories of European hopes and imaginations (see the third symposium of "Colonial Repercussions", "Planetary Utopias – Hope, Desire, Imaginaries in a Post-Colonial World": "Reimagining postcolonial futures requires a move beyond the belief that undoing European colonialism would be sufficient"). A possible question might then be: How can we break the ahistoric informatic screens in the laboratories and camps of this world that take away the eyes and parts of the bodies today, as we are governed by opaque oceans of data?