There are two features of the contemporary world that we have probably all noticed. The first is that the pace at which technological innovations are introduced seems to accelerate each year, as can be gauged by the proliferation of technologies that have become part and parcel of everyday life. The second is that the technologies we use in one year must be completely replaced or upgraded within a relatively short period of time. New technologies have a life span too: the floppy disks and video players that were so important to our lives fifteen to twenty years ago are no longer manufactured. From these features it is evident that the worlds of the late 20th and 21st centuries are qualitatively different from anything that came before them. Giant strides in technology have become a defining feature of the world, affecting many facets of our lives: social, environmental, educational and medical.
What is the nature of this world? Why is it so different from earlier periods of civilisation? The answer clearly lies in the growth of technology and its permeation of all aspects of life. This claim needs to be qualified, since the role of technology in our lives is not just one of utility – this could be said of all technologies (tools, mechanical clocks, etc.) – but rather that a new and different kind of existence has been created; in other words, as a result of the interaction between humans and machines, we are now living in a new world order.
Not all have been happy with the way technology has infiltrated our lives. Dystopian narratives about technology taking over and enslaving humanity abound in science fiction and popular culture. But even beyond science fiction and popular culture, within the world of theory, a certain worry about technology can be detected in, for example, Martin Heidegger’s essay ‘The Question Concerning Technology’. In this essay, Heidegger distinguishes between a pre-modern view of the relationship between technology, humanity and nature as a ‘collaborative’ and harmonious process, and the modern view of technology as dislocated from both humanity and nature, since technology is now conceptualised instrumentally, with the sole function of serving humanity. While not dismissive of modern technology, he argues that the modern instrumental mindset treats nature only as a resource for human exploitation. The problem with this mindset is that it tends to exclude other ways of relating to nature or, for that matter, to other humans, who themselves become objects of exploitation. He therefore suggests that we re-think our relationship to technology and realise that the modern instrumental approach is only one among several possibilities.
In line with this suggestion to re-think the relationship between humanity and technology, an examination of this relationship should first attempt to understand human existence itself. In 1980, the Rockefeller Commission on the Humanities offered a description of the humanities:
Through the humanities we reflect on the fundamental question: What does it mean to be human? The humanities offer clues but never a complete answer. They reveal how people have tried to make moral, spiritual, and intellectual sense of a world in which irrationality, despair, loneliness, and death are as conspicuous as birth, friendship, hope, and reason. (1980:1)

So the question is this: if the humanities (the arts) have given voice to that which is exclusively human, now that – as some posthumanists claim – the ‘exclusively human’ is no longer a tenable category, is there any place left for the humanities in today’s technocultured world?
By way of answering this question, I am going to focus on the philosophical question raised in the citation above, namely, ‘What does it mean to be human?’ This is clearly a philosophical question because it asks us to define the human, i.e., to find that unique feature that sets humans apart from other things. Philosophers enlist all the available knowledge – from biology to anthropology to neuroscience – to answer it, and the question we ask today is the same question that has been asked since antiquity, since the arrival of new knowledge from these various fields reveals that ‘a complete answer’ can never be provided.
However, there is a difference between the traditional approach to defining the human, as exemplified in the Rockefeller Commission Report, and the approaches characterising the technocultured world. While the traditional search for the human was grounded in nature, the technocultured worldview rejects the idea that there is an essential or defining feature of humanity. In other words, while Aristotle, for example, would define mankind as a rational animal, i.e. as a creature (like other creatures) who by nature was ‘given’ something special, namely the faculty of reason, for Foucault what counts as human is not something biologically given by nature but a construct produced by institutional and political discourses, a construct that is historically situated and that can therefore, by implication, be changed. This is an important point because when people talk about something as natural, they are implying that it is unchangeable: it is simply the way things are, and from this state of nature they go on to deduce a set of norms, i.e., how things should be. A case in point is the situation of women who, for a long time, were considered (by men) as weak and emotional (by nature), so that it seemed only natural for them to be mothers rather than managers. This notion of women is a construct produced within a specific historical milieu, and one that has been reconstructed differently within our own.
It is this concept of construction and its relation to the human that I want to use as a point of entry into cyberculture, since one of the key assumptions of cyberculture theorists is that an alliance between humans and technology has made possible the construction of a new mode of existence, a new way of being human.
The origins of this idea can be traced back to an inconspicuous experiment conducted in the 1960s by the research scientists Manfred Clynes and Nathan Kline on a white lab rat at Rockland State Hospital, New York. The rat had a very small pump implanted into it, allowing the researchers to inject it with chemicals so that they could control and observe certain features of its physiology. It was Clynes and Kline who coined the term ‘cyborg’, derived from ‘cybernetic organism’, a neologism intended to capture a new form of existence: one that has incorporated the biological and the technological into an integrated information circuit.
In their paper ‘Cyborgs and Space’, Clynes and Kline claimed that this laboratory experiment paved the way for a future ‘augmented man’, a physiologically improved human being. With the insertion of technological devices, the new and improved human would possess a more durable body and so be able to undertake space travel. They speculated that future astronauts would have their hearts controlled by injections of amphetamines and their lungs replaced by nuclear-powered cells.
These ideas of improving the human attracted both the medical and the military establishments. By the mid-1960s, the U.S. Air Force was allocating large budgets to research and development into robotic arms, biofeedback devices and expert systems. Likewise, the medical establishment considered it both feasible and realistic to produce better human beings with the aid of a range of technologies, from artificial limbs to pace-makers and lung machines.
It seemed that what had previously been just a dream propagated within the world of science fiction was becoming attainable, with Clynes and Kline themselves turning toward metaphysical speculation on the new human ‘spirit’, on a higher form of evolution. Similar ideas had already been articulated within the world of theology by Teilhard de Chardin, who had envisioned the universe as an interlocking network of communication and consciousness.
But the appearance of cyborgs as a new facet of human existence has been chiefly explored by Donna Haraway, who in ‘Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s’ (2004) describes them as ‘an ontologically new, historically specific entity.’ (2004: 299) Their appearance in the contemporary world is such that it is no longer possible to separate the world of technology from the world of humans. The cyborg is different from other forms of technology that have helped humans because ‘the machine and the organism are each communication systems joined in a symbiosis that transforms both.’ (2004:299) A cyborg is therefore not some special kind of being produced within a laboratory, the sort of being that is venerated or vilified within science fiction and popular culture. Rather, as Haraway points out, the cyborg is us.
Haraway’s figure of the cyborg owes much to the pioneering work of Norbert Wiener who, in Cybernetics, or Control and Communication in the Animal and the Machine, identified similar processes at work in the apparently quite different worlds of animals and machines: both depend upon the concept of information to perform certain functions, from pumping blood through the body to guiding missiles and running a company (or a university). Wiener’s research was clearly influenced by the then-acclaimed theory of information articulated by Claude Shannon and Warren Weaver, whose transmission model of communication introduced many of the central concepts utilised within the discourse of cyberculture. To name but a few, these include: information source, receiver, message, transmitter, signal, channel, noise, and feedback. An interesting feature of contemporary life is that, perhaps without our realising it, the language of technology has permeated everyday speech: when relationships are in trouble, we speak of ‘communication breakdowns’ and consider re-opening the ‘channel of communication’; when we want others to know what is happening at work or even socially, we ‘keep them in the loop.’
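The concepts of the Shannon–Weaver model listed above can be made concrete in a few lines of code. The sketch below is purely illustrative (it is not drawn from Shannon’s own formalism): a message from an information source is encoded by a transmitter into a signal of bits, passes through a channel where noise may flip bits at random, and is decoded by a receiver.

```python
import random

def transmitter(message: str) -> list[int]:
    """Encode the message into a signal: a list of bits (8 per character)."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def channel(signal: list[int], noise_level: float, seed: int = 0) -> list[int]:
    """Carry the signal, flipping each bit with probability noise_level."""
    rng = random.Random(seed)
    return [bit ^ 1 if rng.random() < noise_level else bit for bit in signal]

def receiver(signal: list[int]) -> str:
    """Decode the received signal back into a message."""
    chars = []
    for i in range(0, len(signal), 8):
        byte = signal[i:i + 8]
        chars.append(chr(int("".join(map(str, byte)), 2)))
    return "".join(chars)

message = "in the loop"                     # the information source's message
signal = transmitter(message)               # transmitter -> signal
received = receiver(channel(signal, 0.0))   # noiseless channel -> receiver
```

With a noiseless channel the message arrives intact; raising `noise_level` above zero corrupts it, which is precisely why the model also needs the concept of feedback discussed next.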
However, it is the concept of feedback that is crucial to information systems, since they depend upon feedback to regulate their responses. The word cybernetics comes from the Greek ‘kubernetes’, meaning ‘steersman’, and it vividly captures the sense of an information system: just as a steersman constantly reacts to the information he receives from his surroundings so as to steer his ship, so information systems adjust to the changing information received from their environment. In Wiener’s sense, a human and a thermostat are both information systems that react to their context. The differences between them are differences of degree of complexity, not of kind.
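The thermostat example can itself be sketched as a negative-feedback loop. The code below is an illustrative toy, not anything from Wiener’s work: at each step the controller reads the temperature, compares it with a set point, and acts; the environment responds; and the loop repeats, holding the system near its target.

```python
def heater_on(temperature: float, set_point: float) -> bool:
    """Feedback decision: switch the heater on when below the set point."""
    return temperature < set_point

def simulate(start_temp: float, set_point: float, steps: int) -> list[float]:
    """Crude simulation: heating raises the temperature, the room
    otherwise cools; feedback keeps it oscillating near the set point."""
    temps = [start_temp]
    temp = start_temp
    for _ in range(steps):
        on = heater_on(temp, set_point)   # read, compare, act
        temp += 0.5 if on else -0.3       # environment responds
        temps.append(temp)
    return temps

history = simulate(start_temp=15.0, set_point=20.0, steps=30)
```

The steersman works the same way, only with rudder angle in place of a heater switch; the point of the analogy is that both are loops of information, correction, and response.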
Although the utopianism that followed Wiener’s research has virtually disappeared, on account of its tendency towards generalisation without attention to detail, Hari Kunzru points out that the legacy of Wiener’s cybernetics is twofold: (a) the world is now seen as a series of interconnected networks, and (b) the sharp line differentiating the biological from the technological is no longer tenable.
The unravelling of the opposition between biology and technology is just one in a series of deconstructions that are now taken for granted within the (American) scientific community. The other opposition that was once sharply demarcated but whose boundaries are now blurred is the distinction between the human and the animal. I shall briefly review these distinctions as they constitute the conditions that made cyberculture and the existence of the cyborg possible.
- The human-animal distinction: this distinction is an old favourite of those who want to maintain the uniqueness of humans by opposing them to the rest of the animal kingdom. Various features have been suggested as markers of human identity: these range from tool-making, to language use, to the possession of a mind or soul, to creation by God. However, research from a variety of fields has downplayed the notion of human uniqueness, as animals have increasingly shown the same capabilities that were once considered exclusively human.
- The biological and the technological: the blurring of the distinction between what counts as an organism and what counts as a machine is central to cybernetics. In the past, machines were considered tool-like, subject to the will and command of humans: at best, they had human-like qualities implanted into them by humans. Machines today are different in that they are ‘disturbingly lively’ while humans are ‘frighteningly inert’: the features that once marked the human off from the machine have been reversed, such that the distinction between them is now blurred.
From the late 20th century onwards, a new relationship between the human and the technological has been developing, a relationship framed by the concept of the network. The network is involved in both the external and the internal aspects of this relationship. The external relationship is one in which humans use new technologies that are themselves embedded within networks of relations: our computer connects us with other computers and other humans; our cars are connected to the technologies of their production, which research safety, speed, and so on; social events, such as going out for dinner, connect us to various technological platforms, with, for example, the waiter typing our order onto a hand-held computer that transmits it to the kitchen. With the internal relationship, the new technologies are actually incorporated into the body: the insertion of pace-makers, artificial limbs, and silicone implants is now commonplace.
But what contemporary life shows us is that the whole notion of a strict separation between the external and the internal is dated. While, for example, we might think of the foods we ingest as something exclusively internal, they have in effect been altered and modified – for better or for worse – by the agricultural industries. In the age of cyberculture, the concept of the network as a paradigm for understanding the world has replaced the dualisms (inner-outer, body-soul, subject-object, man-nature, etc.) that dominated western culture since René Descartes ‘inaugurated’ modernity.
According to Descartes, the starting point for reflection on how to achieve certainty about anything in life should be the person him/herself, the subject who is doing the reflecting. By doubting everything that s/he had taken for granted, Descartes was able to conclude that the only thing a person can be certain of is that s/he exists, because it is the existing person who is ‘doing’ the doubting. However, while this solution might seem neat, it led Descartes to an impasse: while he was able to establish his own existence with certainty, he found it hard to justify the existence of the external world. An abyss was created between the inner mental world of the person and the external objective world of nature, an abyss that could only be bridged by re-introducing God as guarantor of the external world. The upshot of the Cartesian view (a view that was bequeathed to, and vigorously challenged in, the 21st century) was that humans were solitary individuals, atoms locked within their own bubble or world and disconnected from each other. From a political and social perspective, while the values of modernity (individualism, freedom and progress) were considered a vast improvement on the medieval world view that valorised tradition and authority, it is now recognised that the dualistic world of the (human) subject who stands apart from the world (of nature, of others) has led to several social and ecological problems. In this respect, the medieval world view of the great chain of being, in which everything that exists is interlocked and interconnected with everything else, anticipated the contemporary networked view of the world.
Haraway’s concepts of the network and the cyborg show the redundancy of the Cartesian dualisms, chief among which is the opposition between subjects and objects. This has now given way to a networked relationship in which humans and technology cohabit and interact in the same technologically networked world. By way of elaboration, I would like to focus upon Haraway’s example of the Olympics: while one takes part in the Olympics with the purpose of winning, to be able to participate involves a number of interconnected variables such as the proper diet, training, clothing, equipment, and so on. These constitute the world of the Olympic athlete, who in turn is a node within this technological network. Someone might object that jogging along the Sliema front is very different from competing as an Olympic athlete: yes, but it is a difference of degree, not of kind. Presumably, the jogger has bought a pair of running shoes that required specialised research to produce. These shoes are, in fact, the end result of a technology dedicated to the production of the perfect running footwear. Such a notion is a relatively recent innovation since, as Haraway points out, during the American Civil War shoe manufacturers did not distinguish between right and left feet.
Given the pervasiveness of technology in our lives, the materiality of the human in the 21st century consists not only in flesh and bone, but in flesh, bone and technology. Haraway’s introduction of the cyborg as a cybernetic organism that functions within a network describes this new reality. Cyborg existence constitutes a network connecting the human and the technological in a symbiotic relationship, each feeding into the other. In an interview with Wired, Haraway claimed: ‘Technology is not neutral. We’re inside of what we make, and it’s inside of us. We’re living in a world of connections – and it matters which ones get made and unmade.’ (1997)
It is the last part of this citation, ‘it matters which ones [connections] get made and unmade’, that leads me back to the theme of this paper, namely the role of the humanities in the age of technoculture. Given the emphasis upon the world as a network of connections, the way a connection between the humanities and technology is configured ‘matters’.
It ‘matters’ because, whether we accept it or not, the age of technoculture is part and parcel of human existence in the 21st century; and its presence in our lives will not remain at a standstill but will pervade them at an accelerating pace. As I have argued, technology is not opposed to the humanities but integrated within them: the role of the humanities in this new material existence is that of finding the appropriate language with which to talk about this new world. In other words, it is the relationship itself, the relationship between the human and the technological, that carves out a new space for the humanities, a space that needs to move away from the restricted margins of science fiction towards the centre of humanistic studies. If the role of the humanities as traditionally conceived has been that of narrating the trials and tribulations of human existence, then their presence is not only justified but all the more necessary in the age of technoculture, in that they provide a platform from which we can continue the ongoing task of trying to understand ourselves.