One explanation for surveillance capitalism’s many triumphs floats above them all: it is unprecedented. The unprecedented is necessarily unrecognizable. When we encounter something unprecedented, we automatically interpret it through the lenses of familiar categories, thereby rendering invisible precisely that which is unprecedented….A tragic illustration is the encounter between indigenous people and the first Spanish conquerors. (Zuboff, 2019 p. 12)
When the conquistadors landed in the Americas, the natives could not comprehend that the new arrivals would unleash a pandemic of slavery and murder in Christ’s name, an unprecedented encounter that upended daily life for those who welcomed them. Today, led by the data mining mobilised by Google in Fordism’s footsteps and in the name of open democracy, for Zuboff an unprecedented digital frontier opens up that globally colonises and infects all our bodies and behaviours.
At the 2018 Virtual Reality Software and Technology conference in Tokyo I witnessed a keynote by Prof. Masatoshi Ishikawa from the University of Tokyo. His high-speed image processing technology exceeds the human perceptual apparatus, a capacity he referred to as meta-perception: “Processing speed faster than human and total latency lower than human”. He presented moving imagery from a maintenance van with this technology mounted on its roof, travelling down a tunnel at a breakneck 100mph while filming at 1000fps to inspect the tunnel for cracks. This patterned imagery looked like Thorsten Fleisch’s Silver Screen (2000). Ishikawa also demonstrated a robot hand holding a baseball bat that hit a pitched ball without fail. He explained that with this processing speed it was now possible to build a robot that would hit a home run every time.
I imagined that in human terms it was as if the ball was moving so slowly that the hitter could dance around it, ruminate to uncover the best move, and swing; plenty of time. For the computer a second was like a human hour. Through such ever-expanding processing speed an unprecedented invisible superhuman vista has opened up for AI, one that can move Google’s surveillance mining into a pre-cognitive space. It is a space predicted in Superman movies (from 1978) and comics (from 1938). Such frozen or bullet-time vistas were envisioned by Samantha’s twitch of the nose in the Bewitched (1964-72) television series, performed by Marty McFly as he replaces a Coke with a Pepsi in a person’s frozen hand in Back to the Future (1985), and explored in features such as Dark City (1998) and Clockstoppers (2002). I am also reminded of Tim Macmillan’s Time Slice experiments (1983), explained in this BBC short (https://www.youtube.com/watch?v=fIpmUi8HI1k), which led to the bullet-time sequence in The Matrix (1999). In time slicing, when the viewer moves through space, time appears frozen.
For me the technological moment signified by Masatoshi Ishikawa’s keynote was as significant as Muybridge’s animal locomotion studies of the 1870s, which revealed the pacing of a horse’s feet and informed the nature of cinematic illusion. Ishikawa’s research marks the moment when we have time-travelled past the limits of human perception through AI. Importantly, the humans who predictively perform their superhuman skills in popular culture are a metaphor for a technology whose benevolence, for Zuboff, is in question. Who controls this contested space? Hannah Arendt predicts a profound impact on the human condition: ‘It is quite conceivable that the modern age—which began with such an unprecedented and promising outburst of human activity—may end in the deadliest, most sterile passivity history has ever known.’ (Arendt, 1998 p. 322)
New Aesthetics, New Anxieties
How do we think about media art aesthetics and the production of critical knowledge as the creative industries paradigm consolidates around us, amidst ongoing financial, environmental and political crises? Can we still claim a special place for media art given the increasing ubiquity of informational technologies in everyday life and the intensification of cultural distribution through social media platforms? (Van Dartel, 2012 p. 11)
Of course, your concerns about AI reflect the ambivalence toward total surveillance. Well, yes, on one level, it has become that as a result, alas, of the military-entertainment complex which has emerged in the years since, I think, the end of World War II and the rise of what we should call the “Mad Men” period: the consolidation, really, of corporate and public relations interests working together to form what Guy Debord called the Society of the Spectacle. This is the global expansion of consciousness, image and ideology to form a kind of seamless, uninterrupted globalized monoculture that we have all had to experience in our generations in Western Europe, Australia and the US.
How do algorithms fit into this? Well, the feedback loops that were required during World War II in order to calculate the best way to fire artillery were the impetus for the first computers.
Alan Turing was also able to adapt ideas from the Poles to use computers to automate the decrypting of codes sent using the Enigma cypher machines during the war. These two extreme developments, decryption/encryption and solving tabulation problems, were the latest in a long line of early computer development. Note that both computing solutions combined space and navigation: the range and distance of artillery, and the location, meaning and time of an encrypted message. Tabulation was thus linked early on to managerialism, management and coordination, as well as to mapping and terrain. Babbage’s idea for the Difference Engine, for example, was to solve the problem of tabulating the tables used for the Royal Navy’s maps and charts.
So we see a trend: computers in the service of the management and organization of fleets. Fleets carry guns, fleets represent nation states, fleets do trade; trade and guns all add up to a managed imperialism. So algorithms, from day one, by necessity and by virtue of their context within the military system, have had that association.
However, on an independent route, from the point of view of mathematics and the public use of ideas and tools, algorithms have also enjoyed a non-ballistic, non-military context, although their rapid rise emerged from the space race, with telemetry and communications for rockets. The Apollo 11 mission is a very good example, with the Apollo Guidance Computer and its DSKY interface (for “display and keyboard”), and the Apollo code written for it, some of it by Margaret Hamilton, about whom, and about whose technologies, I made a mini opera. In retrospect these technologies were the beginning of microelectronics and the small portable computer industry.
The Glitch = Grinch
Apparatuses were invented in order to function automatically, in other words independently of future human involvement. (Flusser, 1983 p.73)
The glitch, the error codes 1202 and 1201 that came to screen during the Apollo 11 landing, marks an important historic moment in the emergent relationship between human and non-human computational control of technology, and its resolution depended on software engineered by Margaret Hamilton. Just seven and a half minutes from landing on the moon, the 1202 and 1201 codes unexpectedly hit Buzz Aldrin’s screen. There was a scramble at base to decipher their meaning. The codes signposted a task overload: ‘I’m overloaded with more tasks than I should be doing at this time and I’m going to keep only the more important tasks.’ ‘If the computer hadn’t recognized this problem and taken recovery action, I doubt if Apollo 11 would have been the successful moon landing it was.’ (Hamilton, 1971)
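The logic Hamilton describes, shedding lower-priority work under overload rather than crashing, can be sketched in a few lines. This is a loose, hypothetical Python toy model, not the Apollo Guidance Computer’s actual executive, which was written in assembly; the task names and capacity figure are illustrative only.

```python
import heapq

def run_cycle(tasks, capacity):
    """Toy model of priority-based load shedding: when more work arrives
    than one cycle can hold, keep only the highest-priority tasks and
    drop the rest instead of failing outright.
    tasks: list of (priority, name) pairs; lower number = more important.
    """
    kept = heapq.nsmallest(capacity, tasks)   # the most important tasks
    shed = [t for t in tasks if t not in kept]  # everything else is dropped
    return kept, shed

# Hypothetical workload for one overloaded cycle
tasks = [(1, "guidance"), (2, "throttle"), (3, "display"), (4, "radar")]
kept, shed = run_cycle(tasks, capacity=2)
print([name for _, name in kept])  # → ['guidance', 'throttle']
```

The point of the sketch is simply that recovery is a design decision: the machine decides, autonomously and in real time, which of its own tasks matter most.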
I am drawn to locating such human/non-human transitions in photography and cinema. In photography, Bernd and Hilla Becher’s 1960s typologies documented the architecture of industrialisation through the programmed serial accumulation of water towers, gas tanks and factory facades, predicting the repetitive algorithmic strategies of internet image searches and grazing.
Does a formal cinema’s structural language of logic and repetition predict AI? When Takashi Ito’s Spacy (1981) was shown at Oberhausen, Peter Weibel told the audience that the film marked a new aesthetic of machine language. Ishikawa also showed a video of a robotic hand playing rock paper scissors which always won, and a video of a camera locked onto the centre of a moving ping pong ball (https://www.edge-ai-vision.com/2012/08/vision-superior-robot-trumps-humans-at-rock-paper-scissors-ping-pong-balls/). That demonstration was predicted by the Croatian Ivan Galeta’s Water Pulu 1869 1896 (1987), in which, through optical printing, the ball in a water polo match is fixed at the frame’s centre, forcing the action to rotate around its still core.
Laura Kraning’s more recent digital landscape film Meridian Plain (2016, 18 min) shifts and extends the scene of the human/non-human dialogue from the Apollo moon landing to Martian terrain. Reminiscent of Michael Snow’s approach to his surveyed landscape in La Région Centrale (1971, 180 min), which employed Pierre Abeloos’s robotic “Camera Activating Machine” (CAM), Kraning samples and animates an extensive archive of hundreds of thousands of images captured by the Mars Exploration Rovers (MERs) between 2004 and 2015 to unravel a perceptually challenging portrait of the planet’s terrain. Holly Willis’s description of Meridian Plain, that ‘we might consider it not time-lapse so much as space-lapse, wherein spaces collapse and give way to each other’ (https://arts-sciences.buffalo.edu/media-study/news-events/recent-news/kraningfilmlabocine.html), resuscitates the pre-cognitive terrain envisioned and narrativized in the previously mentioned Bewitched (1964-72) and Clockstoppers (2002) as viscerally embodied perception.
Kalpana Subramanian connects Meridian Plain back to a cinema of attractions, especially early train films and phantom rides (Subramanian, 2021 p.71). Kraning’s non-human machine looking is predicted by Dziga Vertov’s Kino-Eye: Kraning transfers the kinetic material force of Vertov’s ‘man with the movie camera’ into the inner networked circuitry of the Rover’s mechanical eye (Subramanian, 2021 p.69). This networked eye consists of two cameras set 12 inches apart, like a pair of human eyes, and three stereoscopic cameras on the front, back and top of the Rovers, with another camera on the robotic arm. This stereoscopy extends the stereoscopic flicker strategies Ken Jacobs developed in his stereoscopic image films Capitalism: Slavery (2006) and Capitalism: Child Labor (2006).
‘Kraning juxtaposes images from the stereoscopic cameras alternately (left and right), causing them to veer in extremes and twist in our perception, creating a bodily experience of distortion and instability. Some juxtapositions produce after-effects that appear like solarization or positive-negative inversions.’ (Subramanian, 2021 p.71)
In mathematics and computer science, an algorithm (/ˈælɡərɪðəm/) is a finite sequence of well-defined instructions, typically used to solve a class of specific problems or to perform a computation. (Google definition of “algorithm”)
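A minimal concrete instance of that definition is Euclid’s greatest-common-divisor procedure, the textbook example of a finite, well-defined sequence of instructions that always terminates. A short Python sketch:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
    Each step strictly shrinks b, so the sequence of instructions is
    finite and always terminates with the greatest common divisor."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # → 21
```

Every property in the definition is visible here: the instructions are well defined, the sequence is finite, and it solves a whole class of problems (any pair of integers), not one specific case.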
So, algorithms. I taught an algorithm several years ago in the City College of San Francisco Computer Science Department when I was teaching video games. I worked with an instructor who wanted me to explore teaching the game engine Unity from the context of games and simulation, where the simulator would run a pathfinding algorithm called Dijkstra’s Algorithm. We were building 3D CGI grids using Unity, where a “traveller” entity would traverse the grid.
The grid was made up of cylinders and cones: the cones were nodal points and the cylinders were the connections between them, forming regions of triangles like a geodesic dome lattice. The traveller would find the best way to get where it was going across this lattice, based on the different elements you put into the algorithm to facilitate the quickest route. You could vary elements within the terrain, such as elevation and scale, the terrain itself being made up of nodes and nodal points. There were variables you could change.
And this simple pathfinding algorithm turns out to be an excellent introduction to what algorithms actually are. They deal with regions; they deal with recognizing whether something is there or not, whether something is available, whether a condition is met. It is basic computer science, where events have to unfold in a way in which the machine itself takes on some, if not all, of the computation, which leads to a sense of its own autonomy. And of course algorithms and artificial intelligences can be autonomous, and the best uses for them are when they are able to be autonomous. Now this autonomy is, in and of itself, ideologically neutral.
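The lattice-and-traveller exercise described above can be sketched in a few lines of Python rather than Unity C#. This is a minimal, hypothetical version: the node names and edge costs are invented for illustration, standing in for terrain variables like elevation and scale.

```python
import heapq

def dijkstra(graph, start, goal):
    """Dijkstra's shortest-path algorithm over a weighted graph.
    graph: {node: [(neighbour, cost), ...]}; costs could encode
    elevation, scale or any other terrain variable."""
    dist = {start: 0}   # best known cost to each node
    prev = {}           # back-pointers for recovering the route
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry, a shorter route was found already
        for neighbour, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = node
                heapq.heappush(queue, (nd, neighbour))
    # walk the back-pointers from goal to start to recover the path
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path)), dist[goal]

# A toy lattice: cones as nodes, cylinders as weighted connections
lattice = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 1), ("D", 5)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(lattice, "A", "D"))  # → (['A', 'B', 'C', 'D'], 3)
```

Changing a single edge cost, the pathfinding equivalent of raising a hill in the terrain, can reroute the traveller entirely, which is exactly the behaviour the classroom exercise was built to demonstrate.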
But this fear of loss of agency, and the anxieties surrounding it, have fed into a whole new aesthetics, like the New Aesthetic concept, which came out of London’s design scene in Hoxton and Shoreditch around 2011.
James Bridle’s New Aesthetic Archive (https://new-aesthetic.tumblr.com/archive) on Tumblr looked at this kind of interesting hybrid between people and AI. The AI is indicative of a new kind of formulation which is neither, let’s say, of human agency nor of non-human agency, but a hybrid which has its own identity, its own codes and references. It is a form of agency online which is not human but should not be seen in any way, shape or form as separate from the normal functioning of the Internet.
So you have these parallel agents online which do as they do, and things happen as a result of what they do, and we have learned to live and work alongside them seamlessly. Only now do we really understand what this situation actually means.
And an interest in new ways of seeing, led by James Bridle, would have us bridge this gap consciously, or overcome any automatic apartheid that might emerge from anxiety about the AIs that are out there and now apparently outnumber human agency on the Internet. Just as with the emergence of steam and telegraphy, the main technologies at the turn of the previous century, from the 19th to the 20th, which Robert Hughes talks about in The Shock of the New, there is now a new shock of the new with AI, cloud computing and sensor-driven technologies.
It’s not so much the technologies, it’s really the public use of reason – Slavoj Žižek has talked a lot about this (see https://www.youtube.com/watch?v=ftI8IW_aItY and https://www.youtube.com/watch?v=qWkL0QwLllw).
As always, there is nothing inherently problematic in the tools themselves. The problem lies in the systems in which they are deployed and in the long-term aims and objectives of those who operate the levers of power.
Let a thousand WikiLeaks bloom, as we might say. https://www.youtube.com/watch?v=QCRJfitEueU
Dijkstra’s Algorithm Unity Demo on YouTube https://youtu.be/bGsrnJN7iwU
Known Unknowns – Harpers https://harpers.org/archive/2018/07/known-unknowns/
Arendt, Hannah (1998) The Human Condition Chicago: University of Chicago Press.
Flusser, Vilém. (1983) Towards a Philosophy of Photography. London: Reaktion Books.
Hamilton, Margaret. (1971) “Computer Got Loaded”, Datamation, March 1.
Subramanian, Kalpana. (2021) “Esoteric Archaeologies and Interplanetary Becoming in Laura Kraning’s Meridian Plain”, Papers on Language and Literature, suppl. Special Issue: Landscape, Travel, and the Gaze in Experimental Film and Video; Edwardsville Vol. 57, Iss. 1, (Winter 2021): 67-83,116.
Van Dartel, Michel. (2012) “Introduction” in New Aesthetics, New Anxieties. Rotterdam: V2_ Publishing.
Zuboff, Shoshana. (2019) The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Public Affairs, New York.
1 and 3. Dirk de Bruyn
2 and 4. David Cox
David Cox is a filmmaker, artist, writer and teacher based in San Francisco. His films include Puppenhead, Otherzone and Tatlin. His books include the nonfiction Sign Wars: The Culture Jammers Strike Back, published via LedaTape, as well as the novella Dr. Yes and The Mystery of the Mission, also published via LedaTape (Winter 2013).