The Hegemony Of The Vision Machine
“Power is only the parody of the signs of power – just as war is only the parody of signs of war, including technology.”— Jean Baudrillard
ALEXANDER W. SCHINDLER
studies at the Berlin University of the Arts and works for the Vilém Flusser Archive Berlin. Originally a media designer and art director, he now focuses on media theory, media arts and philosophy. His theoretical and artistic fields are the photogrammetric image and imaging in the post-photographic era.
In 1946, the first postwar Macy Conference took place under the guise of advancing scholarship on control and communication, and the conferences that followed helped usher in a new epistemic direction for the discipline of information theory. Claude E. Shannon’s landmark 1948 paper “A Mathematical Theory of Communication” played a decisive role in the methodological direction of the conferences. The conferences thus aimed not only to promote ideas and concepts for the betterment of the human condition by revealing new truths about our neurophysiological systems and the biological genetics of the human brain, but also to determine exactly how human beings behave, how they act, and how they make decisions. This redefined scope pushed the conferences in such a profound direction that they might fundamentally reform the basis of scientific theory.
As the scientists attending the conferences were eminent thinkers in a myriad of fields, it was initially hard for them to converse, so they decided to choose a topic that was nobody’s speciality but still of interest to all present. Their eminence was important for another reason: none of the scientists had anything to prove. They eventually decided to discuss the nature of control, and expanded upon topics such as self-regulating and teleological mechanisms; simulated neural networks emulating the calculus of propositional logic; and anthropology and how computers might learn how to learn. The conferences’ influence on the modern idea of postwar societal systems steadily increased. In his 1948 book “Cybernetics: Or Control and Communication in the Animal and the Machine”, Norbert Wiener, one of the most influential conference members, reactivated the Greek term “kybernētēs”, the “art of steering”, a term originally used by Plato to refer to his concept of government. Kybernētēs became cybernetics, which Wiener defined as the science of observed systems. The universal science of cybernetics was thus born. In fact, the basic systemic mechanism behind cybernetics is remarkably simple: it revolves around the idea of feedback.
Feedback, and the algorithms accruing from it, was essential to Wiener’s approach, as it connected control (actions taken in the hope of achieving goals) with communication (the connection and information flow between the actor and the environment). This simplicity shows that cybernetics can only be understood as applied epistemology, as it tries to reduce structurally complex systems, such as living organisms, abstract intelligent processes, or language, to simple functions. The influence of cybernetics on various political, scientific, economic and technological sectors has since been enormous, and it still plays a role today in fields where the complexity of systems gets out of hand.
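Wiener’s coupling of control and communication can be made concrete in a few lines. The following is a minimal illustrative sketch, not taken from any of the texts discussed here: a steersman observes the gap between goal and current state (communication) and feeds a fraction of that error back into the next action (control), so the system regulates itself.

```python
# A minimal sketch of the feedback principle (illustrative only):
# the observed error between goal and state is fed back into the
# next corrective action, steering the system toward its goal.

def feedback_loop(goal, state, gain=0.5, steps=20):
    """Repeatedly correct `state` toward `goal` by proportional feedback."""
    history = [state]
    for _ in range(steps):
        error = goal - state      # communication: observe the environment
        state += gain * error     # control: act to reduce the error
        history.append(state)
    return history

trajectory = feedback_loop(goal=100.0, state=0.0)
# Each pass shrinks the remaining error, so the state converges on the goal.
```

The point of the sketch is the structural simplicity Wiener prized: nothing in the loop “knows” the system it steers; it only needs the error signal.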
With the beginning of the Cold War and the growing divide between capitalism and communism, the idea of a fully controlled state became increasingly seductive for political leaders. For the first time in history, a subliminal mechanism had been found to simultaneously inform and control a complex system. The principle of feedback seemed to know no limits. The nuclear standoff between the US and the USSR became a conflict based on the cybernetic aim of reaching a desired condition through monitoring and calculation. At the beginning of the age of nuclear warfare, there was no room left for heroism and bravery; conflicts were now based on orientation towards one’s opponent. The antagonist became the only reference point and took centre stage in one’s own actions.
In 1971, the Chilean president Salvador Allende commissioned the theorist Stafford Beer to initiate Project Cybersyn (“cybernetic synergy”), the goal of which was to manage the domestic economy through a distributed decision-support system. This would be achieved by installing a national network of telex machines linked to a single mainframe computer located in the so-called ‘Opsroom’, a command centre designed to resemble the fictional spacecraft of Star Trek. Sadly, Allende died by suicide during the military putsch of September 1973, and Cybersyn was never completed. Although the technological capabilities were still relatively crude at this time, sensory and networked systems continued to spread, even though home computers were still a thing of the future. The technological and political apparatuses continued to coalesce, and the principle of alienation blossomed as cybernetics ingrained itself ever more deeply into various parts of society.
Concerning alienation in the context of cybernetics, we turn to the two major ideas of Jean Baudrillard: “The first one, critical, was that reality has disappeared and was replaced by simulacra; the second one, more agonistic, was to turn this disappearance into a symbolic challenge.” When Baudrillard discovered Silicon Valley for the first time, he called it the “cybernetic disintegration of the ‘tertiary metropolis’.” By registering the California Effect, Baudrillard realized that production was moving into consumption, and that this applied everywhere. When he outlined his thoughts on consumption in a globalized world and its influence on the societal system, he distinguished carefully between domination and hegemony:
‘HEGEMON’ means the one who commands, orders, leads and governs (and not the one who dominates and exploits). This brings us back to the literal meaning of the word ‘cybernetic’ (Kubernetikè, the art of governing). Contrary to domination, an hegemony of world power is no longer a dual, personal or real form of domination, but domination of networks, of calculation and integral exchange.
What Baudrillard observes here is a historical shift in empowerment. In a society where everyone’s core function is consumption instead of production, oppressive forms of dominance cannot be a suitable strategy for ruling over a globalized social fabric. Hegemony in Baudrillard’s sense is not – in contrast to the colloquial understanding of the term – a normative notion. For him, hegemony goes back to its abstract origin: the concept of ‘leadership’. But leadership doesn’t necessarily mean oppression. A leader in a cybernetic society might better be considered a steersman who no longer controls the direction in which the global ship sails, but instead constructs (or allows the construction of) the infrastructure of information and controlling mechanisms, both of which might be used to let the ship sail automatically. If we imagine such a passive political exertion of influence in our society, we abruptly find ourselves facing a new level of alienation – a magnitude of alienation nowhere near what Karl Marx could have imagined in his materialist conception of history. It is a society that is losing all its moorings. That is the realization Baudrillard reached when he turned de Saussure’s discovery of linguistic value (based on signs as pure differences) into the corpus of a ‘structural revolution’ based on the identification of a code independent of all outside reference and detached from the symbolic world: an irreversible political upheaval was inevitable.
With the unavoidable phenomenon of over-consumption in a world of finite resources, it is a logical consequence that something other than tangible goods of consumption might be produced. The influence of Silicon Valley, as Baudrillard recognized, is not really based on innovative technological solutions to the originary problems of humankind. In fact, what I term the ‘Silicon Valley ideal’ defines new kinds of quasi-objects for consumption, based on a technological infrastructure made for abstract communication. The infrastructure thereby attains a new type of materialistic structure. The texture of the materials used for digital gadgets becomes smart, meaning that the materials register feedback at all conceivable and inconceivable levels: location, temperature, light, audio, pressure, motion, etc. Ultimate sensorial feedback machines are thus created: the cybernetic dream of Norbert Wiener is nearly coming true.
However, the leverage effect of the Silicon Valley ideal in terms of consumption is interesting not only because of the materialistic technological infrastructure, but also because of the consequential possibilities for the “homo consumens”. As the devices surrounding us are all turned into gadgets of information storage and transfer, all technological outcomes of the Silicon Valley ideal might invariably be seen as technologies of media – the largest monopoly of media ever created. The consequence of a media monopoly of these ‘digital signal technology’-based quasi-objects is that the distribution of information is no longer subject to any materialistic restrictions. Both of Baudrillard’s theories came true: firstly, the emergence of an alienated society which spends its existence in a hyperreality due to the unleashing of simulations, i.e. simulacra, in society; and secondly, the idea of the ‘homo consumens’ who satisfies its needs – or let us say constraints instead of needs – mainly through the consumption of media, in the belief that materiality has no great influence on it.
To perceive the current societal situation we inhabit, it is important to keep in mind that the cybernetic political and technological apparatuses must always be treated together, as they constitute a quasi-object. When isolated, they might be considered as two coexisting entities, but as Latour indicates, in a condition of translation from one into the other they appear as an interconnected third – they insist on equality, because one entity needs the other to operate. This conflation based on feedback represents the ‘action programme’ (another term for quasi-object) of our societal system. In his theory of the history of media culture, the digital thinker Vilém Flusser uses ‘programme’ as a synonym for ‘faith’. Since the improbable function of human communication within a social fabric forces belief in some such modus, and since the superiority of the natural sciences has replaced theistic belief, Flusser shows that our society now stands at the last step of the ‘game of abstraction’: the zero-dimensionality of numbers and algorithms. The alphanumeric ‘programme’ has replaced faith in the myth.
This broad exemplification should help us to contextualize specific sensory and interpretative technological aspects of cybernetic systems. Currently, the most rapidly emerging and influential technological development might be – in Paul Virilio’s words – the vision machine. The idea of emulating human visual perception, and thereby creating the basis for a machine with the potential to learn by itself to interpret objects and constellations, is no longer just a utopian idea. Virilio’s analysis of computer-aided perception using lens-based devices was written in 1988. Unsurprisingly for the ‘philosopher of speed’ and inventor of dromology, the most important factor in its development was the dimension of ‘time’ – more precisely, the shutter time of the camera, which has the potential to surpass the possibilities of the human eye. For him, this became the crucial factor behind machine vision’s exponential growth in fields like industrial production, the military, surveillance, robotics, etc.
Paul Klee’s assertion that “now objects perceive me” might have had an essential influence on Virilio, whose astute speculation about the future of our computerized visual culture and the unstoppable advance of machine learning also recognized that the evolution of the vision machine has to be understood in the context of the concept of artificial neural networks: the perceptron. Fittingly, the basic idea behind the perceptron – ‘neurons’ as threshold-value elements which, in combination, could express every Boolean function (AND, OR, NOT) – was invented by Warren McCulloch and Walter Pitts, important participants of the Macy Conferences.
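The McCulloch–Pitts idea of the neuron as a threshold-value element can be shown in a few lines. The sketch below is illustrative and not drawn from Virilio or the Macy proceedings: an output fires (1) when the weighted sum of inputs reaches a threshold, and suitable weights and thresholds realize AND, OR and NOT.

```python
# A minimal sketch (illustrative) of the McCulloch-Pitts threshold
# neuron: fire (1) when the weighted input sum reaches the threshold.

def neuron(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def AND(a, b): return neuron([a, b], [1, 1], threshold=2)   # both inputs needed
def OR(a, b):  return neuron([a, b], [1, 1], threshold=1)   # one input suffices
def NOT(a):    return neuron([a],    [-1],   threshold=0)   # inhibitory weight

# Truth tables of the three Boolean functions:
assert [AND(a, b) for a, b in [(0,0), (0,1), (1,0), (1,1)]] == [0, 0, 0, 1]
assert [OR(a, b)  for a, b in [(0,0), (0,1), (1,0), (1,1)]] == [0, 1, 1, 1]
assert [NOT(a)    for a in (0, 1)] == [1, 0]
```

Rosenblatt’s later perceptron kept this threshold element but made the weights learnable from examples, which is the step towards the self-teaching vision machine Virilio describes.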
Although Virilio’s substructures – summarized as the factor of time and the connection to Frank Rosenblatt’s perceptron experiments – seem well elaborated for the future of the vision machine, he might have committed a paralogism. For whatever reason, he did not transfer his thoughts about the organisation of time and space – in the context of cognition, recognition and indication as a gesture constituted by both the prior word and mental images – into the corpus of the vision machine.
Space, or spatial imaging, became the most important factor in the machine vision process. Under the historical term ‘photogrammetry’, whose roots are nearly as old as the invention of photography itself, we step into the so often sloppily proclaimed post-photographic era. Photogrammetry is the science of making measurements from photographs, especially for recovering the exact positions of surface points. This means that the new forms of image capture and video recording not only measure light in terms of colour but also register the metric and spatial data of the captured constellation of objects in space. Through this technology, the machine can interpret objects via their shape through multiscopic vision. In short, software solutions, processing power and new forms of storage capacity turn the current photo apparatus into a three-dimensional scanning device.
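The core photogrammetric measurement can be reduced to one triangulation relation. The following sketch assumes an idealized, rectified stereo pair (two parallel cameras, a simplification not discussed in the text): a surface point seen at horizontal pixel positions xL and xR in the two images has disparity d = xL − xR, and its depth is Z = f·B/d, with focal length f (in pixels) and camera baseline B.

```python
# A minimal sketch (illustrative, idealized rectified stereo pair) of
# the photogrammetric relation: depth Z = f * B / d, where d is the
# disparity between the two views, f the focal length in pixels and
# B the distance (baseline) between the two cameras.

def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    d = x_left - x_right              # disparity in pixels
    if d <= 0:
        raise ValueError("point must be visible in front of both cameras")
    return focal_px * baseline_m / d  # depth in metres

# A nearer point shifts more between the two views (larger disparity):
near = depth_from_disparity(420, 380, focal_px=800, baseline_m=0.1)  # d = 40
far  = depth_from_disparity(410, 400, focal_px=800, baseline_m=0.1)  # d = 10
assert near < far
```

Applied to every matched pixel pair, this single formula is what turns the photo apparatus into the three-dimensional scanning device described above.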
As Virilio recognized, just as the human capacity to think in three dimensions rests on stereoscopic vision and the variation of the point of view, the vision machine likewise gains the possibility of object recognition at all scales. This improvement opens a new chapter for the idea of artificial intelligence. The previous focus of image interpretation, based on analysis via histograms, did not allow recognition of an image’s content based on the form and ratio of objects.
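The limitation of histogram-based interpretation is easy to demonstrate. In this illustrative sketch (not from the text), two tiny binary images contain exactly the same pixel values, so their histograms are identical, yet their shapes differ: a bar and a diagonal. Frequency counts discard the spatial arrangement that shape recognition depends on.

```python
from collections import Counter

# Two 3x3 binary "images" with identical pixel histograms but
# different spatial arrangements: histogram analysis cannot tell
# them apart, while shape-based analysis can.
vertical_bar = [[0, 1, 0],
                [0, 1, 0],
                [0, 1, 0]]
diagonal     = [[1, 0, 0],
                [0, 1, 0],
                [0, 0, 1]]

def histogram(img):
    """Count how often each pixel value occurs, ignoring position."""
    return Counter(px for row in img for px in row)

assert histogram(vertical_bar) == histogram(diagonal)  # same histograms
assert vertical_bar != diagonal                        # different shapes
```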
The fact that visual perception and interpretation is not only highly subjective but also influenced by our cultural background establishes a contradictory basis for the codified input commands given by the human operator to the computerized machine. As Wittgenstein recognized in his “Philosophical Investigations”, language – like programmed code – is always something private, something that must be interpreted by the one who speaks or writes it. Therefore, the growth of ‘intelligence’ always needs to collate already-interpreted information via language (private language) with the pureness of the visual world.
As the development of artificial intelligence has until now mostly begun in the middle of the development of human intelligence (at language), there has been no possibility of letting intelligence grow ab initio, as a human baby does when it learns to conquer the boundaries of language through education and visually based subjective interpretation. Of course, visual interpretation is deeply connected with a language-based semantic interpretation of the world. Nevertheless, to elaborate intelligence, it seems necessary to deal with the consequential contradictions of both categories. This contradiction – the inner fight between learned symbols and pure visual impressions, as well as all the other sensory perceptions – might indeed be a mechanism of intelligence itself.
Circling back to Baudrillard’s understanding of the hegemony of world power as a combination of networks, calculation and integral exchange, the vision machine must also be understood as a divorced entity and not as a quasi-object. With regard to the coming flood of CCTV, drone warfare, biometric detection and image recognition, e.g. in social media channels, we are tempted to see the vision machine as an instrument of dominance. But this is in fact a paralogism. The machine that learns both how to look and how to equate the visual world with computer-based language is just a further step in the cybernetics-based evolution of artificial intelligence. A robot is not genealogically a creature of dominance; it is a result of the human urge for technical advance in order to reduce the complexity of systems through the mechanisms of hegemony.
If anything might be changed in the genealogy of the affairs outlined here, we might think of Baudrillard’s utopian phrase, when he wrote that “power itself must be abolished – and not solely because of a refusal to be dominated, which is at the heart of all traditional struggles – but also, just as violently, in the refusal to dominate.”
Notes

Claude E. Shannon | A Mathematical Theory of Communication | 1948
Stafford Beer | ASC Foundations: History of Cybernetics
Norbert Wiener | Cybernetics: Or Control and Communication in the Animal and the Machine | 1948
The telex network is a switched network of teleprinters, similar to a telephone network, for the purpose of sending text-based messages. Point-to-point teleprinter systems had been in use long before telex exchanges were formed, starting in the 1930s.
Sylvère Lotringer | Introduction to The Agony of Power | 2010
Jean Baudrillard | The Agony of Power | 2010
Jan Mukařovský | On Poetic Language, in The Word and Verbal Art | 1977
With the term “quasi-object” Bruno Latour marked the conjunction of humans and technology. The two entities influence each other equivalently and form a third entity – an action programme. For details: Bruno Latour | We Have Never Been Modern | 1993
Baudrillard doesn’t use the term “homo consumens” to describe his conception of man. The term as used here is borrowed from Erich Fromm in the context of his interpretation of alienation as an effect of market repression. For further details: Erich Fromm | To Have or to Be? | 1976
Virilio can be considered a theorist of dromology, the “science (or logic) of speed”, i.e. “speed as the engine of destruction.” For him, speed is the hidden aspect of power and wealth ruling our society. In his extensive writings on politics and media his ideas circulate mostly around the dimension of time.
Paul Virilio | The Vision Machine | 1994
Following Frank Rosenblatt’s idea of machine learning, the perceptron is an algorithm for supervised learning of binary classifiers (functions that can decide whether an input, represented by a vector of numbers, belongs to some specific class or not).
Paul Virilio | The Vision Machine | 1994
Karl Kraus | Photogrammetrie. Geometrische Informationen aus Photographien und Laserscanaufnahmen | 2004
Jean Baudrillard | The Agony of Power | 2010