posthumanism

By Lene Tøftestuen, 26 May, 2021
Abstract (in English)

While Covid-19 may have fractured our public, private, and professional narratives of normalcy, out of this slow-moving and surreal catastrophe new images of the future imaginary began to emerge, as well as new creative practices for collaborative (re)imagining. The Digital Literacy Centre is a collective of researcher/academic/artists at the University of British Columbia who are interested in exploring innovative approaches to literacy, digital media research, and experimental methodologies for technologically enriched meaning-making practices and collaboration. Like everyone in the world, each of us in the DLC experienced the pandemic individually, as a diffracted and intensely intimate encounter, and yet also collectively, as a shared story, one that we were narrating together in real time, however virtually. We decided to take up this evolving pandemic moment as a technological and creative research challenge: to engage the innovative digital platforms at our disposal in collaborative futures imagining during a time of crisis. Skunk Tales is the result: a multimodal, collaborative futures fiction that we wrote/composed/sonified/performed in chapters that map an imagined future of human interactions with literate technologies.

In this paper, we describe a collaborative, technologically mediated storying methodology that enacts “the diffraction patterns that arise when specific aural experiences are rubbed against specific narrations of human-technological coupling” (Cecchetto, 2013, p. 3). During our storying sessions, we simultaneously sonified the emergent narrative data using Singling, a Text-to-MIDI (Musical Instrument Digital Interface) linguistic data sonification software. We developed Singling for the sonification and visceralization of textual data in qualitative research and analysis. Capable of sounding discrete characters, symbols, and punctuation, as well as word forms in lexicogrammatical categories of English-language texts, Singling transforms text into user-determined soundscapes. As we wrote Skunk Tales, we invited the emergent soundings to permeate the futures imagining and become entangled with the movements of the narrative. As Cecchetto (2013) argued, “It is precisely the forceful quality of sound that makes it an agent of modulation that can help to amplify certain elements of narratives of human-technological coupling, making them audible” (p. 4). This paper maps our creative futures research generation, informed by technological posthumanism and by the recognition that “different technologies of text production suggest different models of signification… initiat[ing] new experience of embodiment; and embodied experience interacts with codes of representation to generate new kinds of textual worlds” (Hayles, 1993, p. 69). Sound permeates the methodology and the resulting diffracted narratives theoretically, materially, and thematically.

We first began the narrative face to face, in the early days of the pandemic, and then diffracted outwards into social isolation and virtual jam sessions; we extended the narrative beyond the limits of our collective and into a storying performance at the 2020 Artful Inquiry Research Group virtual conference, during which we wrote and sonified a chapter live in virtual space. As such, Skunk Tales is a pandemic tale, sounding the evolution of a future now receding into the past while simultaneously signifying new possibilities for dynamic arts-based conversations between subjectivities, technologies, sounds, and meanings.

(Source: authors' own abstract)
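As a rough illustration of the kind of text-to-MIDI mapping the abstract describes, the sketch below (in Python, using the third-party mido package) sounds each character of a string as a note. It is only a minimal stand-in: the pitch mapping, note length, and silent characters here are arbitrary assumptions, whereas Singling's soundscapes are user-determined and also cover lexicogrammatical word categories, not just characters.

    # Minimal text-to-MIDI sonification sketch (not Singling itself).
    # Assumption: letters map to ascending pitches from C3, punctuation to a
    # fixed low note; spaces and digits stay silent.
    import string
    import mido

    BASE_PITCH = 48  # MIDI note number for C3 (arbitrary choice)

    def char_to_pitch(ch):
        """Map a single character to a MIDI pitch, or None for silence."""
        if ch.lower() in string.ascii_lowercase:
            return BASE_PITCH + string.ascii_lowercase.index(ch.lower())
        if ch in string.punctuation:
            return BASE_PITCH - 12
        return None

    def sonify(text, path="soundscape.mid", ticks=120):
        """Write one short note per sounded character to a MIDI file."""
        mid = mido.MidiFile()
        track = mido.MidiTrack()
        mid.tracks.append(track)
        for ch in text:
            pitch = char_to_pitch(ch)
            if pitch is None:
                continue
            track.append(mido.Message("note_on", note=pitch, velocity=64, time=0))
            track.append(mido.Message("note_off", note=pitch, velocity=64, time=ticks))
        mid.save(path)

    sonify("Skunk Tales")

Playing the resulting file through any General MIDI synthesizer gives a rough sense of how written text can become a soundscape during a writing session.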

Creative Works referenced
Description (in English)

This short video work was filmed in New York in 2000 and involves a plastic owl reading Bill Joy's text "Why the future doesn't need us", published in Wired magazine in 2000. The text outlines a dystopian future in which humans are rendered obsolete and replaced by the sentient beings they created. The plastic owl, whose sole purpose is to scare pigeons from the rooftop of the house in the West Village, spins while the words are whispered, and the pigeons continue to go about their business, paying it no regard.

Description (in English)

 

In a 1980 interview with David Remnick, John Ashbery describes the formative impact that the poetry of W. H. Auden had on his writing: “I am usually linked to Wallace Stevens, but it seems to me Auden played a greater role. He was the first modern poet I was able to read with pleasure…” In another interview, Ashbery identifies Auden as “one of the writers who most formed my language as a poet.” For Auden’s part, there was a mutual yet mysterious appreciation of the younger poet’s work; Auden awarded Ashbery the Yale Younger Poets prize for his collection “Some Trees”, with the caveat “...that he had not understood a word of it.”

 

This web-based exhibition presents a creative experiment that uses OpenAI’s GPT-2 and traditional recurrent neural networks to develop a generative poetry pipeline loosely modeled after this short narrative describing the dynamic between Ashbery, Auden, and Stevens. While the modeling is subjective and playful, it aims to map the relationships between the three poets onto different aspects of a machine learning framework. The exhibition explores the potential of using social and personal relationships, and the narratives they imply, as an inspirational structure for designing generative text pipelines and for creating “Transformative Reading Interfaces” that explicate the relationships between the training corpora, the machine-generated text, and the conceptualization of the artist.
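For readers unfamiliar with the tooling, a hedged sketch of a single generation step follows, using the off-the-shelf GPT-2 model from Hugging Face's transformers library. It does not reproduce the exhibition's actual pipeline: the fine-tuning corpora, the recurrent-network components, and the way the Ashbery/Auden/Stevens relationships are mapped onto the framework are not specified here, and the prompt is an invented placeholder.

    # Sketch of one text-generation step with stock GPT-2 (transformers pipeline).
    # The exhibition's corpora, RNN components, and poet-to-model mapping are
    # not reproduced here.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    prompt = "Some trees stand in the early light,"  # placeholder prompt
    outputs = generator(prompt, max_new_tokens=40, num_return_sequences=3, do_sample=True)
    for i, out in enumerate(outputs, 1):
        print(f"--- candidate {i} ---")
        print(out["generated_text"])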

Description (in English)

 

The Hollow Reach is a choice-based virtual reality (VR) experience built around becoming posthuman in order to overcome the trauma of emotional and physical loss. What at first appears to be an adventure game turns out to be an exploration of psychological and physical recovery: not a retreat from reality but a coming to terms with it. In this interactive puzzle game, virtual reality offers a space of recuperation through adaptation and prostheses. In this piece, the player progresses from the human to the posthuman only by letting go of their notions of what is and is not under their control, encountering a life augmented and transformed by the digital.

Description (in English)

Voidopolis is a digital performance about loss and memory that is currently unfolding over 40-ish posts on my Instagram feed (@kmustatea). It is a loose retelling of Dante’s Inferno, informed by the grim experience of wandering through NYC during a pandemic. Instead of the poet Virgil, my guide is a caustic hobo named Nikita. Voidopolis makes use of synthetic language, generated in this instance without the letter ‘e’; the images are created by “wiping” humans from stock photography. The piece is meant to culminate in loss, and so it will eventually be deleted from my feed once the narrative is completed. By ultimately disappearing, this work makes a case for a collective amnesia that follows cataclysm.
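The lipogram constraint (no letter ‘e’) can be illustrated with a trivial filter like the one below; it is only a stand-in and says nothing about how Voidopolis actually generates its synthetic language or wipes figures from its images.

    # Toy illustration of a lipogram filter: keep only words without 'e'.
    def drop_e_words(text):
        """Return the input with every word containing 'e' (or 'E') removed."""
        return " ".join(w for w in text.split() if "e" not in w.lower())

    print(drop_e_words("Midway upon the journey of our life I found myself within a forest dark"))
    # prints: Midway upon of our I found within a dark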

Description (in English)

 

On a planet Earth without humans, an old Chinese poet is still alive, fighting against the alien occupation and extraction of Earth. This is the battle of carbon-based life on planet Earth. An ambassador from the year 8888 comes to 4444 to hear the poems of Li Po. Then the antidote to capitalism, which we know thanks to Li Po's poems, is sent to our time to be sold as rice, to prevent the alien occupation. 8888 is the code that the future sends with the antidote.

Description (in English)

Abstract: As a project situated between “the print” and “the digital”, one that places print-based artifacts in conversation with digital artifacts, “not a book” is concerned with the histories, presents, and futures of books and with the technologies of reproduction and replication used to make them. The project is created from digital images of the traces left by the original copper-engraved botanical prints on the interleaved blank pages of a digitized edition of one printed copy of an 1844 issue of “Flora Batava” magazine. It reflects on and raises questions about just what a book is and was by delving into the history of “the” book as a collection of historically contingent technologies and social processes. Seeking to document and understand how the material traces of bookmaking processes and technologies become legible in new ways once they are reframed and accessed in the context of new technologies of replication and reproduction, the project offers viewers an opportunity to reflect on how histories of print technologies are embedded in digital technologies, and on how the “not a book” image functions both literally and metaphorically as a “digital negative” of the printed original.
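As a small, literal illustration of the “digital negative” idea, the sketch below uses Pillow to invert a scanned page image; the project's actual imaging workflow is not documented here, and the file name is a placeholder.

    # Tonal inversion of a scanned page with Pillow (placeholder file name).
    from PIL import Image, ImageOps

    page = Image.open("flora_batava_interleaf.jpg").convert("RGB")
    negative = ImageOps.invert(page)  # printed traces become light on dark
    negative.save("flora_batava_interleaf_negative.jpg")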

Description (in English)

To Pray Without Ceasing is a web app that autonomously prays for people. It searches Twitter for expressions of need (e.g. "I need somebody to hug me right now" or "I need more money in my bank acct wtf"), especially those tweeted by users who have few followers and who are perhaps in need of solicitude. It then issues prayers for them using a variety of NLP techniques. Visitors to To Pray Without Ceasing must activate the system's prayers in a simple but symbolically significant way: they must light a candle (while making sure not to move the cursor too fast; one must proceed mindfully in sacred space). The action of lighting a candle is designed to make the system not "interactive" but rather what Robert Pfaller would call "interpassive"; the visitor delegates the work of praying (the practice of religion itself) to the machine, yet she still can feel vaguely responsible for whatever good work it does, whatever good words it utters. The system prays in different ways over the course of 24 hours, evoking the "Liturgy of the Hours" ("Horae Canonicae"). After 24 hours the sequence begins again, praying for a new batch of needs discovered on Twitter from the previous day. Thus the humble and pious work of paying attention to the needs of strangers is never finished.
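A hedged sketch of the selection-and-generation step described above follows. It assumes tweets have already been fetched (for example from the Twitter search API) into plain dictionaries, uses an invented follower-count threshold, and substitutes a one-line template for the app's actual NLP prayer-generation techniques.

    # Sketch: pick expressions of need from low-follower accounts and fill a
    # minimal prayer template. Threshold and template are assumptions.
    MAX_FOLLOWERS = 50

    def select_needs(tweets, max_followers=MAX_FOLLOWERS):
        """Keep tweets that voice a need and come from accounts with few followers."""
        return [t for t in tweets
                if "i need" in t["text"].lower() and t["followers"] <= max_followers]

    def pray_for(tweet):
        """Turn one expression of need into a minimal templated prayer."""
        need = tweet["text"].lower().split("i need", 1)[1].strip(" .!?")
        return f"For the one who needs {need}: may it be given."

    sample = [
        {"text": "I need somebody to hug me right now", "followers": 12},
        {"text": "I need more money in my bank acct wtf", "followers": 7},
        {"text": "I need coffee", "followers": 90000},  # too many followers; skipped
    ]
    for t in select_needs(sample):
        print(pray_for(t))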

Description (in English)

A “book post” is placed in the UiB Humanities Library during March 2021, consisting of a table/desk with two stools by it, near a wall.

Four books are on the table/desk (left to right, in alphabetical order by title): Articulations (Allison Parrish), Golem (Nick Montfort), A Noise Such as a Man Might Make: A Novel (Milton Läufer), and Travesty Generator (Lillian-Yvonne Bertram). Each has a hole drilled through it in the upper left and is secured to the table with a cable, creating a chained library. The books represent the work of four participants in an SLSAeu panel about computer-generated literature.

A Kodak carousel slide projector is in the middle of the table/desk, projecting small, bright images and texts onto the wall. Slides presenting covers and contents of the books are shown continually during the exhibition. The selections will be made in consultation with all author/programmers and with their approval.

The stools allow two readers to sit and peruse the books. The table is wide enough to allow readers to do so while socially distanced.

The presence of a functioning “obsolete” slide projector, and the establishment of an “obsolete” chained library within the Humanities library, suggests to visitors that the book is also obsolete — while it is, at the same time, a perfectly functional technology. The dissonance of presenting computer-generated text via film slides and analog projection resonates with the decision that this group of author/programmers has made: to present our computational writing in codex form.

The chained library is both practical and symbolic. Given that this is a library exhibit, the cables prevent people from relocating the books as one typically does in a library. They also emphasize that while we value ubiquity and portability in the digital age, at the same time we want things tethered, grounded, and available at the expected location. This suggestion will be strengthened by the similarity between the way these books are tethered and the way computer equipment is secured to a desk.

The projection of course alerts visitors to the availability of the books. Even if visitors do not choose to sit and peruse these books, the projected texts allow them to see and read computer-generated writing from recent years. Those who only view the projections nevertheless get a sense of the wide variety of approaches and the many textures of language that are seen in this sort of experimental digital writing.

Description (in English)

 

“I live on Earth at the present, and I don't know what I am. I know that I am not a category. I am not a thing – a noun. I seem to be a verb, an evolutionary process – an integral function of the universe.”

– Buckminster Fuller, from I Seem to be a Verb, 1970

 

‘Bucky’ Fuller’s well-known quote, originally published in his book I Seem to Be a Verb (1970), contrasts human participation in the material world (which Fuller suggests can be described with nouns) with the ongoing evolutionary processes that influence and shape that world (which Fuller suggests can be described with verbs).

 

The web-based "A.I. seems to be a verb" (2021) automatically identifies and maps speech not only as linguistic functions (e.g. nouns, verbs, adjectives, pronouns, etc.) but also across a spectrum of sentiment from negative to positive, in order to generate a complex array of paratextual supports (typeface, page design, rules, symbolic elements, and word prompts) used in the visual representation of the text on screen. The entire process happens in real time, providing an uncanny ‘mise-en-abyme’ experience that simultaneously engages the participant’s auditory and visual responses to language construction.
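A hedged sketch of that kind of mapping follows, using NLTK's part-of-speech tagger and VADER sentiment scorer to derive per-word style hints; the work's actual typefaces, page-design rules, and symbolic elements are not known here, so the style names below are placeholders.

    # Sketch: POS tags and a negative-to-positive sentiment score drive
    # placeholder presentation choices. Needs NLTK data packages: punkt,
    # averaged_perceptron_tagger, vader_lexicon.
    import nltk
    from nltk.sentiment.vader import SentimentIntensityAnalyzer

    POS_TO_FACE = {"NN": "serif", "VB": "mono", "JJ": "italic", "RB": "condensed"}

    def style_tokens(text):
        """Return (word, typeface, sentiment_weight) triples for one utterance."""
        weight = SentimentIntensityAnalyzer().polarity_scores(text)["compound"]  # -1..1
        tagged = nltk.pos_tag(nltk.word_tokenize(text))
        return [(word, POS_TO_FACE.get(tag[:2], "default"), weight)
                for word, tag in tagged]

    for token in style_tokens("I seem to be a verb, an evolutionary process"):
        print(token)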
