A call for (and example of) material studies of software from Matt Kirschenbaum, spurred by the Digital Arts and Culture conference, 2000.
(Source: EBR)
Rev. of Expressive Processing by Noah Wardrip-Fruin:
Noah Wardrip-Fruin’s book inaugurates the MIT Press’s new Software Studies publication series, and does so impressively. Wardrip-Fruin states his main impetus at the very beginning of the book: “…it isn’t just the external appearance and audience experience of digital media that matter. It is also essential to understand the computational processes that make digital media function.” (p. xi) To emphasize this stress on computational processes, Wardrip-Fruin has developed the notion of expressive processing, an umbrella under which he discusses artificial intelligence applications, simulations, story generators, computer games, and electronic literature.
Expressive processing carries two separate meanings. When creating works for digital media, authors define rules for the system’s behavior; that is, computational processes are a means of expression for authors, and the authorial contribution may be located even more importantly on this process level than on the directly observable surface level. On the other hand, computational processes express, through their design and form, decisions that may be connected to larger cultural contexts. Close examination of the processes may, in some cases, reveal quite different functionality than common descriptions of the systems claim (even authorial descriptions may prove misleading, as Wardrip-Fruin frequently demonstrates). Wardrip-Fruin stresses the importance of this latter approach for understanding digital media and, even more importantly, for understanding software in general. He sees this as particularly important for the current information society, where it is crucial for people to understand, at the level of principles, the logic of software-based systems in such areas as surveillance.
Wardrip-Fruin is extremely well read and familiar with the relevant theories in the field, but he is not a grand theory builder himself.
His main strength lies in his ability to analyze, interpret, and explain works, by himself or others, in a way that reveals their processes and how those processes bear both authorial intentions and contextual influences. His wide interests and expertise, ranging from early computer games to artificial intelligence experiments and the most sophisticated works of electronic literature, enable him to demonstrate the general value of the notion of expressive processing across various cultural and academic fields. As such, this book is the perfect volume to begin the new publication series in software studies. Rather than building a theory for software studies, it works as a model of how to do software studies. The wide variety of materials discussed, however, may be the Achilles’ heel of the book. As we are all influenced by an endless array of information technologies and their software processes, Expressive Processing in a way includes everybody in its audience. Still, restricting the target group by modestly limiting the topics covered might have made this book even better.
(Source: Author's introduction)
As scholars experiment with collaborative, multimodal approaches to analyzing electronic literature, the tools, methods, and practices of such collaboration become increasingly an issue. How do we share, edit, archive, and publish arguments that address and evolve across multiple types of data, platforms, and disciplines? How can the approaches (data visualization, code analysis, textual explication, bibliographic history, etc.) be shared in ways that let other scholars engage not just with the final interpretations but also with the processes that lead to them? Recent publications such as 10 PRINT CHR$ (205.5 + RND (1)); : GOTO 10 represent the value of such collaborative efforts in combining media archaeology, platform studies, software studies, and Critical Code Studies. Our own work in collaboratively close reading William Poundstone’s “Project for Tachistoscope: [Bottomless Pit],” which we presented at ELO 2010 (held at Brown University) and are now developing as a book for Iowa UP, has prompted us to reflexively consider how the processes of our own collaboration might prove generative to other scholars. Supported by an ACLS Collaborative Scholarship Fellowship 2012-2013, we are developing an open-access scholarly website to facilitate collaborative critical interpretations of digital art, a platform for digital humanities scholarship focused on born-digital poetics. The goal is to produce a workbench where scholars can apply critical tools to works of electronic literature and share the results of their investigations. We propose to present this website, in its nascent stages, and discuss its ambitions and affordances for producing complex, multimodal, and collaborative critical readings.
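The one-line BASIC program in that book title is itself a compact object of study: on the Commodore 64, CHR$ truncates 205.5 + RND(1) to PETSCII code 205 or 206, the two diagonal line glyphs, so the endless loop prints a random maze. A minimal sketch of the same idea in Python, substituting the Unicode diagonals for the PETSCII glyphs:

```python
import random


def ten_print(width=40, rows=10, seed=None):
    """Emulate 10 PRINT CHR$(205.5+RND(1));:GOTO 10.

    Each cell is one of the two diagonal glyphs, chosen at random
    with equal probability, just as CHR$ truncates 205.5 + RND(1)
    to 205 or 206. Returns the maze as a multi-line string.
    """
    rng = random.Random(seed)
    return "\n".join(
        "".join(rng.choice("╱╲") for _ in range(width))
        for _ in range(rows)
    )


print(ten_print())
```

The function name and the finite width/rows parameters are my framing for illustration; the original program runs as an unbounded loop.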
Computer people don't understand computers. Oh, they understand the technicalities all right, but they don't understand the possibilities. Most of all, they don't understand that the computer world is entirely built out of artificial, arbitrary constructs. Word processing, spreadsheet, database aren't fundamental, they're just different ideas that different guys have whomped up, ideas that could be totally different in their structure. But these ideas have a plausible air that has set like concrete into a seeming reality. Macintosh and Windows look alike, therefore that must be reality, right? Wrong. Apple and Windows are like Ford and Chevrolet (or perhaps Tweedledum and Tweedledee), who in their co-imitation create a stereo illusion that seems like reality. The computer guys don't understand computers in all their manifold possibilities; they think today's conventions are how things really are, and so that's what they tell all the new victims. So-called "computer literacy" is an illusion: they train you in today's strange conventions and constructs-- (Desktop? This to you looks like a desktop? A vertical desktop?) --and tell you that's what computers really are. Wrong. Today's computer constructs were made up in situations that ranged from emergency to academia, which have been piled up into a seemingly meaningful whole. Yet the world of the screen could be anything at all, not just the imitation of paper. But everybody seems to think the basic designs are finished. It's just like "Space, we've done that!" -- a few inches of exploration and some people think it's over.
In the old days, you could run any program on any data, and if you didn't like the results, throw them away. But the Macintosh ended that. You didn't own your data any more. THEY owned your data. THEY chose the options, since you couldn't program. And you could only do what THEY allowed you-- those anointed official developers.
The explosion of new ideas and methods in cultural disciplines from the 1960s onward does not seem to have affected the presentation of cultural processes in practice. Books and museums devoted to art, design, media, and other cultural areas continue to arrange their subjects into small numbers of discrete categories: periods, artistic schools, -isms, cultural movements. The chapters in a book and the rectangular rooms of most museums act as material dividers between these categories. A continuously evolving cultural "organism" is forced into artificial boxes.
Just as architects adopted the techniques of computer graphics as theoretical terms to talk about their own field, we propose to do the same for all cultural fields. However, rather than only using these terms as metaphors, we also propose to visualize cultural processes using the same techniques.
The time has come to align our models and presentations of cultural processes with the new design language and theoretical ideas made possible (or inspired) by software. For example, what will happen if we start conceptualizing and visualizing cultural phenomena and processes in terms of continuously changing parameters - as opposed to the categorical boxes that are standard today?
Just as software replaced the older Platonic design primitives with new primitives (curves, flexible surfaces, particle fields), we propose to replace the traditional "cultural theory primitives" with new ones. A 1D timeline becomes a 2D or 3D graph; a small set of discrete categorical boxes is discarded in favor of curves, freeform 3D surfaces, particle fields, and other representations available in design and visualization software.
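As a toy illustration of this shift (my example, with entirely hypothetical data and parameter names, not the authors'): instead of stamping each artwork with a discrete period label, one can record continuous stylistic parameters per work and compute a moving average over time, yielding a smooth trajectory rather than categorical boxes.

```python
from statistics import mean

# Hypothetical records: (year, brightness, fragmentation) per artwork,
# in place of a single categorical label such as a period or -ism.
works = [
    (1870, 0.72, 0.15), (1885, 0.65, 0.22), (1900, 0.58, 0.35),
    (1910, 0.50, 0.55), (1920, 0.45, 0.70), (1930, 0.40, 0.78),
]


def sliding_mean(records, window=30):
    """Average each parameter over a moving time window.

    Returns one (year, mean_brightness, mean_fragmentation) point per
    input record: a continuous trajectory through parameter space
    rather than a handful of discrete period boxes.
    """
    out = []
    for year, *_ in records:
        in_win = [r for r in records if abs(r[0] - year) <= window / 2]
        out.append((year,
                    mean(r[1] for r in in_win),
                    mean(r[2] for r in in_win)))
    return out


trajectory = sliding_mean(works)
```

The resulting trajectory could then be plotted as a curve or surface in any visualization tool; the two parameters here are placeholders for whatever continuously measurable features a given cultural analysis defines.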
Computer people don't understand computers. Well, they understand the technical side, yes, but they don't understand the possibilities. Above all, they don't understand that the world of computers is made entirely of artificial, arbitrary constructs.
Word processors, spreadsheets, databases are not fundamental; they are just different ideas that various people have worked out, ideas that could have a totally different structure. But these ideas have a plausible air that has set like concrete into an apparent reality. Macintosh and Windows look alike, so that must be reality, right?
Wrong. Apple and Windows are like Ford and Chevrolet (or perhaps the twins Tweedledum and Tweedledee), who in their co-imitation create an illusion that seems like reality.
This collection of short expository, critical, and speculative texts offers a field guide to the cultural, political, social, and aesthetic impact of software. Computing and digital media are essential to the way we work and live, and much has been said about their influence. But the very material of software has often been left invisible. In Software Studies, computer scientists, artists, designers, cultural theorists, programmers, and others from a range of disciplines each take on a key topic in the understanding of software and the work that surrounds it. These include algorithms; logical structures; ways of thinking and doing that leak out of the domain of logic and into everyday life; the value and aesthetic judgments built into computing; programming's own subcultures; and the tightly formulated building blocks that work to make, name, multiply, control, and interweave reality. The growing importance of software requires a new kind of cultural theory that can understand the politics of pixels or the poetry of a loop and engage in the microanalysis of everyday digital objects. The contributors to Software Studies are both literate in computing (and involved in some way in the production of software) and active in making and theorizing culture. Software Studies not only offers studies of software but also proposes an agenda for a discipline that sees software as an object of study from new perspectives.
(Source: book presentation, MIT Press)
I learned that it is Tale-Spin's processes that have the literary value, creating a fictional world that gets its fascinating strangeness from taking a recognizable aspect of human behavior, exaggerating it, and stripping away almost everything else -- answering the question, "What would fiction look like if we accept the model of humanity being proposed by this kind of cognitive science?"
Why is there the stereotype that, while computer scientists and digital artists have much to discuss, digital humanists only want to talk about data mining with the former and data visualization with the latter?