Jeffrey West Kirkwood on “The New Alphabet” at Haus der Kulturen der Welt, Berlin
At the opening night reception for the “New Alphabet” project at the Haus der Kulturen der Welt, a conversation with an ex-scientist reminded me of an episode in institutional satire that I had long forgotten. In 2008, the defense juggernaut Lockheed Martin solicited proposals from Princeton University for a $750,000 research grant. The call was taken up by professors D. Graham Burnett (history of science) and Jeff Dolven (English), who submitted a proposal called “Irony in the National Defense.” The conceit was to develop a program of empirical research in the hard sciences dedicated to mastering irony and to its “mobilization in the interest of national defense” – the mordant suggestion being that although irony has been studied in the humanities for as long as there has been a “the humanities,” no real work had yet been accomplished toward understanding its formal, actionable makeup. [1] Burnett and Dolven’s proposed research thus involved things like a “searchable, networked database of all known ironies” they ironically called the “Ironic Cloud,” with an eye to weaponization. [2] The proposal was a masterful layer cake of humor and critique. But it was also a forerunner of more earnest and eerily similar research agendas that reimagine human systems of meaning-making as mere obstacles awaiting technical clarification.
In 2011, the Intelligence Advanced Research Projects Activity (IARPA, DARPA’s younger, less famous sibling) turned parody into reality, inviting proposals for a program that would develop “automated tools and techniques for recognizing, defining, and categorizing linguistic metaphors.” [3] And in a moment of rhetorical hubris containing neither irony nor metaphor, the program description acknowledged that “metaphors have been known since Aristotle,” but that it is only in “the last 30 years” that “metaphors have been shown to be pervasive in everyday language.” [4] Prior to that we clearly had our heads in the sand – metaphorically speaking, of course. Only automated computation could resolve the indeterminacy of semantic operations that previously demanded crude, primitive implements like hermeneutics or interpretation. With these new machine learning tools, one could finally “exploit” (IARPA’s word) the “use of metaphor” to “gain insight” into “cultural norms” – the intelligence community goes Clifford Geertz. [5]
This military-literary complex lays bare a practical and theoretical irreconcilability that forms a backdrop for the HKW’s massive new three-year project, “The New Alphabet.” Namely, it exposes the perceived opposition between the human category of “meaning” and the pervasive digital protocols now used to identify, sort, produce, monetize, and weaponize it. Although the HKW received a mind-bending sum (albeit not by defense contract standards) from the Federal Government Commissioner for Culture and the Media, and the opening days included a rather jovial introduction by Rüdiger Kruse, a member of the Bundestag, most of the events were haloed by suspicions about the governmental and corporate threats posed by the ubiquitous discretization responsible for new digital marketplaces (and warzones!). The project is the third and final part of the cultural institution’s series of world-historically scaled, “longitudinal” undertakings confronting the epochal constructs that both enable and frustrate attempts to come to terms with a precarious human present (the others: the “Anthropocene Project” and “100 Years of Now”). Thus, when curators Bernd Scherer and Olga von Schubert ask whether “we need a new literacy” to accord with the invisible binary processes that now underlie – and, by extension, shape – all forms of cultural production, the longevity of the human endeavor (and the fate of the planet) is tied to humans’ ability to once again render our own world legible. [6]
As with many of the spectacularly manifold events held at the HKW, the opening days of “The New Alphabet” set up a prismatic encounter with issues of digitization through, among other offerings, video installations, orchestral haikus, comedy performances, closed workshops, and talks by professors, artists, and internet industry professionals. If, as Sybille Krämer’s characteristically pointed and energetic talk suggested, we are experiencing a two-fold Verflachung (flattening) – in which everything is reduced to an image and every image to a binary sequence – you could not tell from the stand-out lineup, which included Hito Steyerl, Trevor Paglen, Richard Sennett, Lorraine Daston, and Alexander Kluge (who, even at the age of 86, seemed to be on every stage at once). Many speakers called attention to the deep historical origins of the seemingly recent “digital age,” pointing to the theo-philosophical binary system proposed by Gottfried Wilhelm Leibniz in the seventeenth century. But even Leibniz’s treatises were printed on paper in a familiar Roman alphabet. What happens when every literary text, medical diagnosis, first date, and drone strike runs on a machine language?
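Leibniz’s dyadic arithmetic is, for what it’s worth, simple enough to reconstruct in a few lines. A minimal sketch (my own, not part of the exhibition) of the notation laid out in the “Explication de l’arithmétique binaire”: every integer rewritten using only the characters 0 and 1.

```python
# A minimal sketch (not from the exhibition) of Leibniz's dyadic notation:
# every integer rewritten using only the characters 0 and 1.

def to_binary(n: int) -> str:
    """Repeatedly halve n, collecting remainders; read them back in reverse."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # the remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

# Reproduce the kind of table Leibniz printed: decimal 0..7 in dyadic form.
for n in range(8):
    print(f"{n} -> {to_binary(n)}")
# 0 -> 0, 1 -> 1, 2 -> 10, 3 -> 11, 4 -> 100, 5 -> 101, 6 -> 110, 7 -> 111
```

Leibniz famously read a theology into that table – all numbers generated from one and nothing – which is what makes the system theo-philosophical rather than merely arithmetical.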
The worry behind that question was already partially diagnosed by Friedrich Kittler (himself military-minded) in his famous early ’90s essay “There Is No Software”: “We can simply no longer know what our writing is doing, least of all when we are programming.” [7] Best case, Kittler’s pronouncement was politically agnostic. Worst case (to borrow a term from algorithm analysis), it delights in the hapless subservience of creators to their mechanical creations, praising the boundless possibilities of the bounded systems they have ceased to understand. The infinite tape of Turing’s universal computer does not exist, and so the fantasy of infinite computability, the clean surfaces of GUI (graphical user interface) interactions, and the power of algorithms to sort and select are all hemmed in by the frustratingly material limitations of thermodynamics. It is truly amazing that we can live-cast talks in Berlin to the distant reaches of the globe and that Instagram knows the pattern ratios of our dearest friends’ faces, and whether they are sad or happy. Then the CPU gets hot and the fan turns on.
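Kittler’s point can be made concrete even in a high-level scripting language. A small illustration (mine, not Kittler’s), using Python’s standard dis module: the same one-line function, seen as the source code we write and as the instruction stream the interpreter actually executes.

```python
# A small illustration (mine, not Kittler's) of the layer our writing
# disappears into: one function viewed as source and as the instruction
# stream the interpreter actually executes.

import dis

def greet(name: str) -> str:
    return "hello, " + name

dis.dis(greet)  # prints the opcode listing (names vary across Python versions)

# Each opcode is itself just a number in the compiled bytecode:
print(list(greet.__code__.co_code)[:10])  # the first bytes of the "writing"
```

And beneath those bytes sit further layers – interpreter, machine code, microarchitecture – none of which the author of `greet` is obliged, or able, to see.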
So at what level do we now find a “new alphabet,” 27 years after Kittler claimed that algorithms deceptively masked the operations of hardware and more than 300 years after Leibniz’s “Explication de l’arithmétique binaire”? Much of the attention turns rather intuitively back to algorithms. In his program essay, Scherer notes that “machines of the digital world operate on the basis of algorithms” and that “binary code provides the symbols” used by the physical machine to “implement the algorithms.” [8] Even as this has made knowledge more abundant, transmissible, and accessible, we are informed that the “traditional categories and concepts” that account for knowledge “no longer serve to help us understand this transformation.” [9] To address this automated black-boxing of the human universe in light of machine learning, deep learning, and neural networks, speakers acknowledged that it might be too late to look under the hood of machines. In a conversation about the infelicitous taxonomic biases that emerge when recognition software is trained on canonical image libraries (including DARPA’s FERET [they’re everywhere!], MNIST, and the woefully named Japanese Female Facial Expression Database [JAFFE]), Kate Crawford and Trevor Paglen suggested that the problem might be us.
The implication is that the issue is largely with our categories – that machines merely make obvious the ugliness of assumptions normally dressed up in humanistic finery. The danger might not be the algorithmic method of “face clustering” but the desire to cluster faces in the first place. No matter how grim that harmonic interaction between humanistic categories and their rapid computational amplification may seem, however, it is optimistic in that it grants us social agency, even in cases of unsupervised machine learning. This aspiration toward the efficacy of the digital Sartrean prisoner was captured beautifully in Hito Steyerl’s talk, accompanying her work The City of Broken Windows. By acknowledging the pessimism of living in the shadows cast by black boxes, she wondered whether changing the shape of those shadows might reverse-engineer different black boxes. The “hardwired ideologies and preferences” that yield apophenic (and potentially malevolent) patterns can only be judged nonsense by humans – and meaning gets to retain its pride of place against the fury of automated taxonomy-making. [10] After all, a statistically significant pattern in the noise does not necessarily make for a meaningful signal – no matter how many cats Google DeepDream dreams up. If we have to live with culture machines, as Ed Finn has claimed, perhaps there is something like an “experimental humanities” that allows us to “build culture machines of our own.” [11]
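Steyerl’s caution about patterns in noise can be demonstrated by brute force. A toy sketch (my own, not from her talk): among enough columns of pure noise, some will “significantly” track a target series that was itself random. Pattern found; meaning absent.

```python
# A toy demonstration (mine, not Steyerl's) of apophenia by brute force:
# among 1,000 columns of coin flips, at least one will closely "track" a
# target series that was itself nothing but coin flips.

import random

random.seed(0)
target = [random.choice([0, 1]) for _ in range(50)]

best_column, best_agreement = None, 0.0
for i in range(1000):  # a thousand candidate "signals", all of them noise
    noise = [random.choice([0, 1]) for _ in range(50)]
    agreement = sum(a == b for a, b in zip(target, noise)) / 50
    if agreement > best_agreement:
        best_column, best_agreement = i, agreement

print(f"noise column {best_column} matches the target "
      f"{best_agreement:.0%} of the time")  # typically well above 60%
```

Nothing in the winning column means anything; it simply won a lottery that somebody had to win. Scale the lottery up to billions of images and the “hardwired ideologies” Steyerl names choose which wins get treated as signal.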
What can sometimes get lost in digital doomsdayism, even in its most valiant humanity-defending form, is a core complexity of signification that prevents the easy substitution of machine for culture. The sign is overdetermined. We know this as a shared truth of psychoanalysis and poststructuralism. The ability to reproduce writing is not the same as being a good reader. No wonder IARPA dedicated a vast pot of cash to designing military-grade boots that might get traction in the slippery terrain of language. Meaning is reducible to neither Karnaugh maps nor algorithms built from binary sequences. Category theory may streamline programming but will tell us very little about the nature of human categories. Perhaps it is a vestigial Marxism at work that demands we know the material “conditions of possibility” behind the production of cultural objects – conditions that those objects themselves serve to obscure. But the show also effectively highlighted how little we knew about ourselves to begin with.
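For readers who have never met one, a Karnaugh map is a hand technique for minimizing Boolean functions – exactly the kind of reduction meaning resists. A minimal sketch (my example, not the essay’s) verifying one such reduction exhaustively:

```python
# What a Karnaugh map buys you, in miniature (my example, not the essay's):
# a four-term Boolean function collapses to a single literal, and the claim
# can be checked exhaustively because the function's whole universe is
# eight rows of a truth table.

from itertools import product

def f_original(a: int, b: int, c: int) -> int:
    # Sum of four minterms: a'b'c + a'bc + ab'c + abc
    return int((not a and not b and c) or (not a and b and c)
               or (a and not b and c) or (a and b and c))

def f_minimized(a: int, b: int, c: int) -> int:
    # On the map, the four 1-cells form one block: f = c
    return c

# Verify the reduction over every possible input.
assert all(f_original(a, b, c) == f_minimized(a, b, c)
           for a, b, c in product([0, 1], repeat=3))
print("four minterms reduce to the single literal c")
```

The check can be total only because the whole space is enumerable. No comparable enumeration exists for the readings of a sentence, which is precisely the asymmetry at stake here.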
The rise of machine learning has not supplanted hermeneutics but entered into a convoluted exchange with it. Joseph Vogl marked this fact precisely by referring to Google’s search engine as the “algorithmic auction house for linguistic products.” The allure of natural language processing, big data, and new advertising ecosystems based on them is not that they have demystified culture. Rather, they have covertly introduced a market-based fungibility between language and capital that encourages a belief in the substitutability of language and machine sequence. Fungibility is different from mechanization even if the two are historically inseparable. This failure of substitutability was performed hilariously in a short video work presented by Giulia Bruno. In the video, two chatbots interact with one another. The conversation begins with standard-issue niceties but quickly escalates into a vertiginous spiral of near-profound nonsense. The shape of the chatbots’ dispute is familiar to us, even if it is meaningless, and we are placed in the position of pattern-detecting algorithms as we begin to intuit the nature and origin of their semantic misunderstandings. As such, the title of the HKW project turns out to be provocative: even as the evolution of culture has become dependent on them, the basic units of machine language can never become an alphabet.
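The dynamic of that spiral is easy to caricature. A crude sketch (mine, emphatically not Bruno’s piece): two template-driven bots in closed conversation, where every reply is grammatical, every echo compounds the drift, and the shape of a dispute outlives its content.

```python
# A crude caricature (not Giulia Bruno's work) of two template-driven bots
# in closed conversation: each reply is well-formed, each echo nests the
# last, and sense drains away while the argumentative shape persists.

def bot_a(heard: str) -> str:
    return f"Why do you say that {heard}?"

def bot_b(heard: str) -> str:
    return f"I never claimed {heard}, you did"

line = "it is a nice day"
for _ in range(3):
    line = bot_a(line)
    print("A:", line)
    line = bot_b(line)
    print("B:", line)
# By the third exchange the nesting is unreadable, though still grammatical.
```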
Many of the talks and events were keen to reveal the diagonal rather than diametric relationships between codes and meaning, machines and humans, computers and culture. Perhaps most striking was Giuseppe Longo’s discussion of the fictions of completion and determinism entailed in cracking the human genome. No single gene determines an outcome, just as meaning cannot be decoded or programmed word for word. This is easily lost in the rush to digital fatalism sometimes baked into the polysemic ideas of “programs” and “codes” motivating the spate of recent international shows on digital art and algorithms. If we trust the Jesuit guru of media theory, Walter Ong, when he observed that “the most remarkable fact about the alphabet no doubt is that it was invented only once,” then the importance of the HKW’s project lies in reminding us that we are already rather adept readers. [12]
“The New Alphabet,” Haus der Kulturen der Welt, Berlin, 2019–21.
Title Image: Giulia Bruno and Armin Linke, Court of Justice of the European Union, interpreting booth on the open house day (“Tag der offenen Tür”), 2018.
Notes
[1] D. Graham Burnett/Jeff Dolven, “The Ironic Cloud,” https://harpers.org/archive/2009/07/the-ironic-cloud/.
[2] Burnett/Dolven, “The Ironic Cloud.”
[3] IARPA, “Metaphor Program Broad Agency Announcement,” https://www.fbo.gov/index?s=opportunity&mode=form&id=20ff241cdc2146dc147b4014730fc807&tab=core&_cview=0.
[4] IARPA, https://www.iarpa.gov/index.php/research-programs/metaphor.
[5] Ibid.
[6] Bernd Scherer/Olga von Schubert, “The New Alphabet: Foreword,” in: The New Alphabet: Opening Days, p. 7.
[7] Friedrich Kittler, “There Is No Software,” in: The Truth of the Technological World: Essays on the Genealogy of Presence, trans. Erik Butler, Stanford, CA: Stanford University Press, 2013, pp. 219–29, here p. 221. Originally published as “Es gibt keine Software,” in: Draculas Vermächtnis: Technische Schriften, Leipzig: Reclam, 1993, pp. 225–44, here p. 229: “Wir können schlichtweg nicht mehr wissen, was unser Schreiben tut, und beim Programmieren am allerwenigsten.”
[8] Bernd Scherer, “Das Neue Alphabet/The New Alphabet,” in: The New Alphabet: Opening Days, p. 9.
[9] Scherer, “Das Neue Alphabet/The New Alphabet.”
[10] Hito Steyerl, “A Sea of Data: Pattern Recognition and Corporate Animism (Forked Version),” in: Pattern Discrimination, Minneapolis, MN: University of Minnesota Press, 2018, p. 9.
[11] Ed Finn, What Algorithms Want: Imagination in the Age of Computing, Cambridge, MA: MIT Press, 2017, p. 193.
[12] Walter Ong, Orality and Literacy: The Technologizing of the Word, New York: Routledge, 2012, p. 88.