N. Katherine Hayles – My Mother Was a Computer

Chapter 1. Intermediation: Textuality and the Regime of Computation

Language alone is no longer the distinctive characteristic of technologically developed societies; rather, it is language plus code.

Language and code now encounter each other constantly and in many ways, and a framework is needed for examining how this happens. Code, as well as being a technical language for computing, is theorised by Stephen Wolfram and others as underlying the nature of the universe and all of its systems.

Proponents of speech, of writing and of code each approach the other fields with their own presuppositions, which Hayles refers to as worldviews.

Turing proposed that a computer, whatever its physical form, could begin with simple operations and gradually add layers of complexity until it eventually reached a complexity equivalent to human thought.

The Universal Turing Machine, as the name implies, can perform any computation that any computer can do, including computing the algorithm that constitutes itself.
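To make the universality idea concrete, here is a minimal sketch of a Turing machine simulator in C++ (my own illustration, not Hayles’ or Turing’s notation): the machine is nothing but a table of rules for reading and writing symbols on a tape, yet tables of this kind suffice for any computation, and a universal machine is simply one whose tape can encode another machine’s table.

```cpp
// Minimal Turing machine sketch (illustrative only, not an example from the book).
// A machine is just a transition table: (state, symbol) -> (write, move, next state).
#include <iostream>
#include <map>
#include <string>

struct Action { char write; int move; std::string next; };

int main() {
    // A toy machine that appends a '1' to a block of 1s (unary increment).
    std::map<std::pair<std::string, char>, Action> rules = {
        {{"scan", '1'}, {'1', +1, "scan"}},   // move right over existing 1s
        {{"scan", '_'}, {'1',  0, "halt"}},   // write a 1 on the first blank, halt
    };

    std::string tape = "111_____";            // input: the number 3 in unary
    int head = 0;
    std::string state = "scan";

    while (state != "halt") {
        Action a = rules[{state, tape[head]}];
        tape[head] = a.write;
        head += a.move;
        state = a.next;
    }
    std::cout << tape << "\n";                // prints "1111____" (i.e. 4)
}
```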

Wolfram uses cellular automata that apply simple rules to rows of small black or white squares, whose states are repeatedly updated in discrete steps. Some of these rules demonstrate complex emergent behaviour; one even produces something capable of universal computation, resembling a Turing machine. Since such complexity can emerge from such simple origins, he suggests that cellular automata can not only model natural systems but may actually generate them. This is the computational universe.
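As a concrete illustration of what Wolfram actually runs (my own sketch, not his code), an elementary cellular automaton updates each cell of a row from its own colour and those of its two neighbours; the rule number 110 used here is one Wolfram highlights because it turns out to be capable of universal computation.

```cpp
// One-dimensional cellular automaton (Rule 110) - illustrative sketch.
#include <iostream>
#include <vector>

int main() {
    const int width = 64, steps = 30;
    const int rule = 110;                     // Wolfram's rule number

    std::vector<int> cells(width, 0);
    cells[width - 1] = 1;                     // start from a single black cell

    for (int t = 0; t < steps; ++t) {
        for (int c : cells) std::cout << (c ? '#' : '.');
        std::cout << "\n";

        std::vector<int> next(width, 0);
        for (int i = 0; i < width; ++i) {
            int left  = cells[(i - 1 + width) % width];
            int mid   = cells[i];
            int right = cells[(i + 1) % width];
            int pattern = (left << 2) | (mid << 1) | right;   // neighbourhood as 0..7
            next[i] = (rule >> pattern) & 1;  // look up that bit in the rule number
        }
        cells = next;
    }
}
```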

Wolfram’s slide from regarding his simulations as models to thinking of them as computations that actually generate reality can be tracked at several places in his massive text.

Underneath the laws of physics as we know them today, it could be that there lies a very simple program from which all the known laws – and ultimately all the complexity we see in the universe – emerge.

Even if the computational universe cannot yet be proven to have an actual ontological effect on physical systems, it still works as a metaphor for the formation of the physical world, and this metaphor feeds back into the present, providing useful context.

Morowitz goes beyond Wolfram in pointing out that once cellular automata reach a certain level of complexity they can get no further. What is needed is for a first-order system to give rise to a second-order system that retains and builds on its complexity, and then a third, a fourth and so on. This keeps going until we eventually reach a state of posthumanism.

Hayles reviews these ideas with some scepticism. Wolfram doesn’t get us from automata directly to natural processes, or from one order of complexity to the next. Morowitz points out this flaw but doesn’t provide a solution or indicate where to look for one.

Media can converge into digitality and simultaneously diverge into a robust media ecology in which new media represent and are represented in old media, in a process that Jay Bolter and Richard Grusin have called “remediation.” (Note: Hayles prefers the term “intermediation.”) Even as traditional media such as books become digitised, a feedback loop informs the older forms, which come to reflect this absorption, making for a new set of contexts – an ecology.

Notwithstanding their opposed viewpoints, Hansen and Kittler share a mode of argumentation that privileges one locus of the human/machine feedback loop at the expense of the other. Kittler holds that the nature and quality of the media determine the subject’s interpretation of them, while Hansen maintains that the subject retains autonomy and arrives at their own embodied reading of the media.

2. Speech, Writing, Code – Three Worldviews

This chapter compares the qualities of speech and writing with those of code; Hayles’s position seems to be that code is superior to both. Speech is immediate, whereas writing can be stored and read much later.

Computers’ accuracy is tied to material, physical considerations, which is why they operate in binary: it is easier to reliably distinguish between two states than between many. The same is true of language to an extent – we don’t have extremely long words, for example.

In computers, changes in voltage are the signifiers, and the signified is how these are interpreted by other levels of code. The computer continually translates between low-level and high-level languages and back again, much as the spoken or written word gets converted into meaning in our minds.
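A small sketch of this point in C++ (my own example, not one from the book): the same fixed bit pattern – the “voltages” – acquires different signifieds depending on which level of code interprets it.

```cpp
// The same 8 bits "signify" different things at different levels of interpretation.
#include <bitset>
#include <iostream>

int main() {
    unsigned char byte = 0b01000001;  // one fixed pattern of high/low voltages

    std::cout << std::bitset<8>(byte) << "\n";            // as raw bits: 01000001
    std::cout << static_cast<int>(byte) << "\n";          // as a number: 65
    std::cout << static_cast<char>(byte) << "\n";         // as text: 'A'
    std::cout << (byte & 0x01 ? "flag set" : "flag clear") << "\n";  // as a status flag
}
```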

In language, it is possible to have a signified without a signifier – an independent, autonomous thought. In code this makes no sense, as each change in voltage must come from somewhere and lead somewhere else.

Computer code causes real changes in other physical systems in a direct, tangible way, whereas language may lead to behavioural changes, but these are mediated and open to interpretation.

Changes to the language of code happen much more quickly and are more drastic than changes to spoken language, and they are subject to capitalist pressures (changes between Windows versions, for example). The open source movement challenges that.

When we hear or read a word we don’t need to consciously run through a list of all the words we know to identify it; it just comes. Computers can’t do that.

Hayles says that as reveal/conceal dynamics become more prevalent in digital media, the computational model of the universe becomes all the more plausible, bringing us closer to Morowitz’s fourth stage of evolution: mind reflecting on mind.

Talks at length about object-oriented programming languages (C++, though Processing is similar) and about how classes and objects are formed, how behaviour can be modified, and how new objects can be derived from those already present.
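A minimal sketch of the class/object mechanics being described, in C++ (the class names are my own hypothetical illustrations, not examples from the book): a base class bundles data and behaviour, and a derived class inherits it and overrides just the part it wants to change.

```cpp
// Classes and objects: behaviour defined once, then inherited and modified.
#include <iostream>
#include <string>

class Creature {                       // a base class: a bundle of data and behaviour
public:
    explicit Creature(std::string name) : name_(std::move(name)) {}
    virtual ~Creature() = default;
    virtual void speak() const {       // default behaviour
        std::cout << name_ << " makes a noise\n";
    }
protected:
    std::string name_;
};

class Bird : public Creature {         // a new object type derived from an existing one
public:
    using Creature::Creature;          // inherit the constructor unchanged
    void speak() const override {      // modify just this piece of behaviour
        std::cout << name_ << " sings\n";
    }
};

int main() {
    Creature generic("blob");
    Bird robin("robin");
    generic.speak();                   // "blob makes a noise"
    robin.speak();                     // "robin sings"
}
```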

Sounds a warning: we need to remain aware of how computers, and big software companies like Microsoft, condition us to become a certain type of subject as we use them.

3. The Dream of Information

The notion that once we develop nanotechnology we will be able to make everything we need, so there will be no shortages and hence no need for oppression or war. Similarly, information is supposedly free to copy at no cost – except that there is always a cost, in the resources needed to keep networks running and in the pollution from discarded machinery.

Philip K. Dick’s The Three Stigmata of Palmer Eldritch deals with the notion that simulation and reality have no distinct boundary, and with how the drug that induces the simulation ends up consuming the user rather than the other way round. The drug is also being used as a tool to control the populations of Earth and Mars, so it is ultimately bound up in capitalist interests, going against the idea of information as having no real cost.

4. Translating Media

Discusses the William Blake Archive, a website which puts Blake’s works online, emphasising the original printed work as the superior version and taking care to recreate the formatting and size of the original pages on screen as accurately as possible. Ironic that I’m reading this 2005 book on a Kindle app now.

Shillingsburg asserts that a text translated from one form to another, such as from printed text to electronic text, still retains the same inherent and essential qualities. Hayles doesn’t agree at all.

Much talk of the Platonic “essence” of the book and how it becomes altered by the encoding into digital media. I’m not sure it makes sense to treat this essence as something belonging only to the print version, as there could be many different interpretations of that version alone.

Ok. Seems she agrees.

We need to let go of the old model of literature as a finite, static expression of an immaterial essence and see it as something dynamic and changeable, inhabiting new physical forms (computers) which infuse it with their own aspects of authorship.

Walter Benjamin has the idea of some form of higher language, like a Platonic ideal version of language, which exists above all languages in use. He suggests this gets hinted at when translators convert texts from one language to another, as they have to extract some essential quality to try to represent in the target language. Hayles thinks this isn’t how real language works though.

5. Performative Code and Figurative Language

Neal Stephenson’s Cryptonomicon is a book which, while printed as a paper text, reflects the relationship between code and literature, as well as the influence of capitalist forces such as Microsoft. (Stephenson previously lost a large file to a software failure, so he is a big supporter of Unix.)

There is a hierarchy between those who obediently use computers and code, and those who actually understand how they work and so control matters. H. G. Wells’s The Time Machine is used as an analogy: the Morlocks are in control, feeding off the Eloi whom they easily exploit. There is a parallel tension between the Mac operating system, which uses metaphors like folders and trash cans while hiding the real workings underneath, and Unix, which allows the user to directly control matters without obfuscation. Capitalist control versus open-source utopianism.

6. Flickering Connectivities in Shelley Jackson’s Patchwork Girl

Historically, we were encouraged to think of the literary work, and by extension the author, as something disembodied from the physical media – something transcendental and highly valued. There was also the assertion that the author’s own creative genius produced an original work, but this contradicts the fact that an author must draw on pre-existing conventions and appropriate inherited ideas.

The character is a re-assembly, by Mary Shelley, of her female Frankenstein monster, with hyperlinks in the text leading to passages about the various people whose parts were used to make her up. The body is a patchwork, a sort of committee vying for control. Reminds me of Susan Blackmore’s model of memes and genes battling each other in our brains to make our decisions, giving us an illusion of consciousness and free will.

We are not unified, discrete beings but assemblages, like the monster, of thoughts, experiences and influences. Unity is not natural or even desirable.

Within all of us resides a great deal of forgotten memories, which could make up an entire other person, or set of people. We are merely the combination of all the memories we retain. In this sense the monster, being assembled from bits of others but not having had the chance to develop its own unique memories, is like us.

7. (Un)masking the Agent: Stanislaw Lem’s “The Mask”

Does our analogue sense of consciousness actually have an underlying code like digital computer code? Could relate this to Blackmore’s theories and her denial of the self and free will.

Deleuze and Guattari see the unconscious as being driven by something like Wolfram’s cellular automata, where the computations that are carried out constitute desire. Hayles, though, says they take too many liberties with how the automata actually work to get there.

At the same time, machines evolve along lines that mimic biological evolution; they begin to develop traits of expression and are ultimately capable of desire. Hayles isn’t convinced about what actually drives this, though.

Lacan considered the unconscious as a type of universal Turing machine, operating upon language in a linear way that requires no kind of awareness to work – again influenced by the idea of cellular automata. The beginnings of language lie in this mechanistic unconscious state, from which we develop higher, conscious thoughts. Then we come back to how this relates to Freud and his concept of the death drive. Seriously, why won’t he just FUCK OFF!

In considering the relationship between human and machine and where consciousness comes from, there is a shift from the nonliving/living distinction to a distinction between mechanistic intelligence and conscious awareness. Our unconscious operates mechanically, like a Turing machine, but then gives rise to our conscious selves.

The problem of agency: if we evolved from mechanistic origins, then it does not seem that our agency can come from our conscious mind; and if machines can be considered akin to biological organisms, then they have agency even though they are not conscious in any sense we recognise. Sounds good for Blackmore again.

In Lem’s story, the king wanted Arrhodes assassinated, so he had a cyborg woman made, with an insectoid robot inside her, to do it. The king had sworn that Arrhodes must accept her of his own free will, so she was made beautiful and set out to seduce him. There is therefore a battle between the woman’s conscious mind and the program designed to control her actions. Agency crisis again.

Given the mechanical nature of the creature, even consciousness must arise from code, for as noted earlier, she has been manufactured rather than born. In this sense, consciousness may also be a mask created to mediate between human readers and an alien core.

Whether conceived as literal mechanism or instructive analogy, coding technology thus becomes central to understanding the human condition.

8. Simulating Narratives: What Virtual Creatures Can Teach Us

Karl Sims made a computer program to simulate the evolution and development of creatures: his code generated virtual creatures on screen from discrete modules that could be repeated with variation, first producing individuals and then populations, which reacted to a digital environment whose characteristics allowed a form of natural selection to unfold.
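Sims’ actual system evolved 3D bodies and neural controllers inside a physics simulation, which is far beyond a note like this; as a hedged sketch of the underlying idea only, here is a toy mutate-and-select loop in C++ in which a population of bit-string “creatures” is scored by an arbitrary fitness function and the fitter half seeds the next generation with mutated copies.

```cpp
// A toy evolutionary loop: random variation plus selection.
// This is a deliberately simplified sketch, not Sims' actual algorithm.
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(42);
    std::bernoulli_distribution coin(0.5);
    std::uniform_int_distribution<int> pick_gene(0, 15);

    const int pop_size = 20, genome_len = 16, generations = 40;

    // "Fitness": here, simply how many 1-bits a genome carries.
    auto fitness = [](const std::vector<int>& g) {
        return std::count(g.begin(), g.end(), 1);
    };

    // Random initial population.
    std::vector<std::vector<int>> pop(pop_size, std::vector<int>(genome_len));
    for (auto& g : pop) for (auto& bit : g) bit = coin(rng);

    for (int gen = 0; gen < generations; ++gen) {
        // Selection: sort by fitness, best genomes first.
        std::sort(pop.begin(), pop.end(), [&](const auto& a, const auto& b) {
            return fitness(a) > fitness(b);
        });

        // Reproduction with variation: the top half spawns mutated copies.
        for (int i = pop_size / 2; i < pop_size; ++i) {
            pop[i] = pop[i - pop_size / 2];       // copy a fit parent
            pop[i][pick_gene(rng)] ^= 1;          // flip one random gene
        }
    }
    std::cout << "best fitness after evolution: " << fitness(pop[0]) << "\n";
}
```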

Sometimes in these artificial life systems the creator intervenes and changes variables to address certain problems. Often the system that evolves is more complex or unpredictable than the author expected. The point is that it is often easier to let complex behaviour, or intelligence, evolve than to attempt to design it. This relates closely to Wolfram’s notion of computational irreducibility – the only way to see what happens is to run the program.

Dealing with the nature of the reality that the simulations possess: at their lowest levels they are ones and zeros, but they produce complexity by running algorithms not unlike the natural selection which produced us, their viewers. Hayles suggests that this gives them a certain type of reality no less significant than ours. It also doesn’t prevent us from attributing anthropomorphic narratives to their behaviour.

The realist approach to perception is to give most importance to the physical, actual object we perceive through our senses, even though it is our own perceptual processes (eyesight, processing the information and recognition of the object) which we encounter first. The virtual creatures remove this step, since there is no underlying corporeal “reality” producing them, just a computer and some code.

These processes are what we experience when we attempt to understand the virtual creatures, and we draw on similar processes when we construct narratives that imbue them with our tales of defeat, bravery, survival and so on. It is this emphasis on processes that Hayles is thinking of when she describes us as virtual creatures. Or something like that.

We can ultimately think of ourselves as “hybrid entities” when we consider our subjectivity as becoming involved in a system of understanding that includes interactions with all sorts of actors, such as the chair we sit on, the book we take notes on, etc.
