
Mainframe Experimentalism: Early Computing and the Foundations of the Digital Arts

Ebook · 597 pages · 8 hours


About this ebook

Mainframe Experimentalism challenges the conventional wisdom that the digital arts arose out of Silicon Valley’s technological revolutions in the 1970s. In fact, in the 1960s, a diverse array of artists, musicians, poets, writers, and filmmakers around the world were engaging with mainframe and mini-computers to create innovative new artworks that contradict the stereotypes of "computer art." Juxtaposing the original works alongside scholarly contributions by well-established and emerging scholars from several disciplines, Mainframe Experimentalism demonstrates that the radical and experimental aesthetics and political and cultural engagements of early digital art stand as precursors for the mobility among technological platforms, artistic forms, and social sites that has become commonplace today.



This title is part of UC Press's Voices Revived program, which commemorates University of California Press's mission to seek out and cultivate the brightest minds and give them voice, reach, and impact. Drawing on a backlist dating to 1893, Voices Revived makes high-quality, peer-reviewed scholarship accessible once again using print-on-demand technology. This title was originally published in 2012.
Language: English
Release date: Sep 1, 2023
ISBN: 9780520953734


    Book preview

    Mainframe Experimentalism - Hannah Higgins

    MAINFRAME EXPERIMENTALISM


    Early Computing and the Foundations of the Digital Arts

    Edited by

    Hannah B Higgins and Douglas Kahn

    UNIVERSITY OF CALIFORNIA PRESS

    Berkeley Los Angeles London

    University of California Press, one of the most distinguished university presses in the United States, enriches lives around the world by advancing scholarship in the humanities, social sciences, and natural sciences. Its activities are supported by the UC Press Foundation and by philanthropic contributions from individuals and institutions. For more information, visit www.ucpress.edu.

    University of California Press

    Berkeley and Los Angeles, California

    University of California Press, Ltd.

    London, England

    © 2012 by The Regents of the University of California

    Library of Congress Cataloging-in-Publication Data

    Mainframe experimentalism: early computing and the foundations of the digital arts / edited by Hannah B Higgins and Douglas Kahn.

    p. cm.

    Includes bibliographical references and index.

    ISBN 978-0-520-26837-1 (cloth: alk. paper)

    ISBN 978-0-520-26838-8 (pbk.: alk. paper)

    1. Art and computers. 2. Digital art. 3. Arts, Modern—20th century. I. Higgins, Hannah, 1964- editor. II. Kahn, Douglas, 1951- editor.

    NX180.C66M35 2012

    776.09'046—dc23

    2011049953

    21 20 19 18 17 16 15 14 13 12

    10 9 8 7 6 5 4 3 2 1

    For James Tenney, mainframe experimenter, composer, inspirer, friend

    CONTENTS


    ILLUSTRATIONS

    ACKNOWLEDGMENTS

    INTRODUCTION

    1. THE SOULLESS USURPER. Reception and Criticism of Early Computer Art

    2. GEORGES PEREC’S THINKING MACHINES

    3. IN FORMING SOFTWARE. Software, Structuralism, Dematerialization

    4. INFORMATION AESTHETICS AND THE STUTTGART SCHOOL

    5. THEY HAVE ALL DREAMT OF THE MACHINES—AND NOW THE MACHINES HAVE ARRIVED. New Tendencies—Computers and Visual Research, Zagreb, 1968-1969

    6. MINICOMPUTER EXPERIMENTALISM IN THE UNITED KINGDOM FROM THE 1950S TO 1980

    7. JAMES TENNEY AT BELL LABS

    8. HPSCHD-GHOST OR MONSTER?

    9. THE ALIEN VOICE Alvin Lucier’s North American Time Capsule 1967

    10. AN INTRODUCTION TO NORTH AMERICAN TIME CAPSULE 1967

    11. NORTH AMERICAN TIME CAPSULE 1967

    12. AN INTRODUCTION TO ALISON KNOWLES’S THE HOUSE OF DUST

    13. THE BOOK OF THE FUTURE Alison Knowles’s The House of Dust

    14. THREE EARLY TEXTS BY GUSTAV METZGER ON COMPUTER ART

    15. COMPUTER PARTICIPATOR Situating Nam June Paik’s Work in Computing

    16. FIRST-GENERATION POETRY GENERATORS Establishing Foundations in Form

    17. TAPE MARK I

    18. LETTER TO ANN NOËL

    19. THE COMPUTATIONAL WORD WORKS OF ERIC ANDERSEN AND DICK HIGGINS

    20. OPUS 1966

    21. COMPUTERS FOR THE ARTS (MAY 1968)

    22. THE ROLE OF THE MACHINE IN THE EXPERIMENT OF EGOLESS POETRY Jackson Mac Low and the Programmable Film Reader

    23. STAN VANDERBEEK’S POEMFIELDS The Interstice of Cinema and Computing

    24. FROM THE GUN CONTROLLER TO THE MANDALA The Cybernetic Cinema of John and James Whitney

    INDEX

    ILLUSTRATIONS

    0.1. James Tenney and Lejaren Hiller, Electronic Music Studio, University of Illinois, ca. 1961

    1.1. United States Army Ballistic Research Laboratories, Splatter Diagram, 1963

    1.2. Manfred Mohr, p.159 rs, 1974

    1.3. Sol LeWitt, Variations of Incomplete Open Cubes (Photographic Component), 1974

    2.1. Georges Perec’s French flowchart, from Georges Perec: A Life in Words

    2.2. Perec’s English flowchart

    3.1. Hans Haacke, News, 1969

    4.1. Georg Nees, 23-Ecken, 1964

    4.2. Siegfried Maser, Kybernetisches Modell ästhetischer Probleme, 1974

    4.3. Frieder Nake, Refined Model for the Aesthetic Process, 1974

    4.4. Nake, Flowchart of the Program Package COMPART ER 36, 1974

    4.5. Manfred Mohr, p.050/R, "a formal language," 1970

    5.1. New Tendencies 4 installation, 1969

    6.1. John Lansdown with dancers, 1969

    8.1. John Cage and Lejaren Hiller, detail, Program (KNOBS) for the Listener, Output Sheet No. 10929, 1967-69

    12.1. Alison Knowles, Poem Drop, 1971

    13.1. Knowles, The House of Dust, 1967/2007

    15.1. Nam June Paik, The First Snapshots of Mars, 1966

    16.1. Emmett Williams, IBM, in A Valentine for Noël, 1973

    16.2. Marc Adrian, illustration for Computer Texts, 1968

    16.3. Margaret Masterman and Robin McKinnon Wood, illustration for Computerized Japanese Haiku, 1968

    17.1. Nanni Balestrini, Tape Mark 1, 1961

    19.1. Dick Higgins, Hank and Mary: A Choral for Dieter Rot: Computers for the Arts, 1968/1970

    20.1. Eric Andersen, Opus 1966

    20.2. Andersen, fragment of Opus 1966

    23.1. Stan VanDerBeek, still from Poemfield No. 2, 1966

    23.2. VanDerBeek, still from Poemfield No. 1, 1965

    23.3. VanDerBeek, Poemfields poster, 1966

    24.1. James Whitney, Lapis, 1966

    24.2. John Whitney, Permutations, 1968

    ACKNOWLEDGMENTS

    This book is dedicated to James Tenney, whom we would like to thank posthumously for his role in a 2002 symposium we held at the University of California, Davis. The topic of the symposium was the computer programming workshop he had held in 1967 in the living room of Dick Higgins and Alison Knowles, publishers of Something Else Press in New York City and parents of Hannah Higgins, coeditor of this book. Knowles was the other main speaker at the symposium, for her The House of Dust was generated in the context of the workshop and with Tenney’s ongoing assistance. Like several artists whose work with early computers is at the core of this book, we were the beneficiaries of Tenney’s generosity and expertise. This book would be, literally, unimaginable without him.

    Several editors from the University of California Press attended our symposium. Among them were Stephanie Fay and Deborah Kirshman, who expressed an immediate interest in the project. At the time, we imagined a very modest publication, but we soon came across a wealth of unexplored topics, source materials, and a new generation of scholars. We sought out authors and artists internationally, suffered several setbacks, and pulled back when the project exploded in size. This critical process required the gift of patience from our writers and artists and the Press. By 2008, it looked as if we were ready to go, but the copyright clearances proved complex, involving estates and long-since defunct publications. We are especially grateful for the generous granting of rights by the artists, writers, and estates involved. Finally, the astute guidance of Mary Francis and Eric Schmidt in helping to complete the publication stage should not go unacknowledged. Thank you.


    Over the ensuing decade, our assistants worked tirelessly to pull this anthology together from Davis, Sydney, Washington, DC, and Chicago. Special thanks go to them for managing a complex project with grace. Douglas Kahn’s assistant, Nilendra Gurusinghe, at the University of California, helped with the early stages of the project; Nathan Thomas, Higgins’s research assistant at the University of Illinois, Chicago, donned a detective hat and had a keen eye for detail in managing the text and rights issues and formatting as the book was prepared for publication; and Peter Blamey in Sydney saw the project through its final stages. Generous grants from the National Institute for Experimental Arts, College of Fine Arts, University of New South Wales, and the University Scholar program at the University of Illinois, Chicago made the indexing and the completion of the book possible.

    Finally, we would like to thank our children and spouses for their continued patience, support, and daily inspiration.

    INTRODUCTION

    Hannah B Higgins and Douglas Kahn

    What we need is a computer that… turns us… not on but into artists.

    JOHN CAGE, 1966

    Mainframe Experimentalism is a collection of essays and documents on the encounter with mainframe and minicomputers by artists, musicians, poets and writers, and filmmakers in and around the 1960s. The time frame for our book begins with the era of room-size mainframe computers in the late 1950s, extends through the 1960s’ refrigerator-size, transistor-operated minicomputers and institutionally bound digital technologies, and ends in the 1970s with the transition to microcomputers, the desktop computers that paved the road for today’s ubiquitous digital devices. This duration, the long 1960s, was a time when simple access to computers was determined by institutional rather than consumer logics. These institutions adhered to geopolitical, military, corporate, and scientific priorities that were not immediately or obviously amenable to the arts. For those artists lucky enough to find access to these computers, technical requirements mandated the expertise of engineers, so the process was always collaborative, yet rarely sustainable over any great length of time. Thus, while mainframes and minis grew at the core of major institutions, they played contingent and fleeting roles in artistic careers. It is a testament to all involved that so much was attempted and achieved within these constraints.

    With Mainframe Experimentalism, we are attempting to bring a new focus on a range of these artistic activities. The vitality and achievement across the arts of the 1960s are highly prized by today’s critics, curators, historians, and collectors; a quick perusal of garden-variety museums of contemporary art demonstrates as much, since the decade that brought us pop, conceptualism, Fluxus, Happenings, video art, and minimalism is seen as the foundation for the arts of the present. The same can be said of 1960s literature, music, and film. In contrast, the mainframe- and mini-based ancestors of today’s digital art are generally remembered with some embarrassment. Whether because of a lack of exposure to actual computers, the absence of a critical apparatus that understood what was at stake in computer-based artwork, or the overabundance of geeky stereotypes, public perception of early computing and the arts did not fare well, as is recounted in tragic detail in Grant Taylor’s contribution to this volume. If anything, first-generation computer art was and has been synonymous with bad art or, more generously, an immature or technologically defined aspirant art. It was associated with engineers with artistic aspirations or artists with engineering aspirations, rather than being seen as a technology that had fused with artistic practice and achievement, such as the chemistry of oil paints or the mechanics of a piano.

    The artist Jim Pomeroy spoke for many when he lamented the look of computer art… flashy geometric logos tunneling through twirling wire-frames, graphic nudes, adolescent sci-fi fantasies, and endless variations on the Mona Lisa.¹ When it was not demonstrating new graphical interface capabilities or plotting information, computer art often meant emulations of the art historical canon, especially modern genres that already trafficked in simplified graphics. There appeared to be some artistic rationale, especially in the 1960s, in the way the computer provided a means to distance the maker from artistic authorship and thus from the perceived excesses of expressionism, yet this distancing could also operate in too close a proximity to the technocratic drive and bureaucratic numbing of both capitalism and the Eastern Bloc.

    Historians of fine art, media, and literature have historically avoided serious investigation of the digital arts of this period, and only a modicum of attention has been paid in musicology. The early digital arts seemed degraded as art; they seemed to be about workshopping technological possibilities or, in the case of musicology, seemed constrained to academic computer music. More recently, histories of digital arts have been channeled through prescient moments in the development and social uptake of digital technologies. Where once commentators followed the lines of Miró or Klee and found digital art wanting, later historians were drawn to the lines of Douglas Engelbart’s mouse; where once canonical artists were housed in the domain of museums of modern art and the commodity culture of collectors, they were now housed in the computer architectures of John von Neumann and Silicon Valley design centers. The vanguard art world that New York had stolen from Paris during World War II had been digitally rerouted to Palo Alto, if we follow the migrations of the discourse.

    With Mainframe Experimentalism, we attempt to reestablish in their own right the efforts, ideas, and achievements of artists, musicians and writers working during the digital period of the long 1960s, many of whom have been flanked by strictures of computer art on the one hand and diffusion into new media and digital culture on the other. As experimenters well rehearsed in provisionality, these artists were able to operate within inhospitable institutional and economic conditions, negotiate social networks and knowledge competencies, and meet the narrow constraints of computational tools with a sense of possibility. They did so as the exigencies of the military, bureaucratic, and corporate entities that controlled computer access ran up against the grassroots and collective concerns of the bohemian, countercultural, antistate, and antiwar constituencies of the 1960s.

    In addition, we wish to show that the radical and experimental aesthetics and political and cultural engagements of the period, across conventional disciplines and media, by an international array of individuals, groups, and institutions can stand as a historical allegory for the mobility among technological platforms, artistic forms, and social sites that has become commonplace today. And, in this way, what is detailed in these pages is the formation of the digital arts.

    In 1966, the composer John Cage posed a rhetorical question from which the opening sentence of this introduction was taken: Are we an audience for computer art?

    The answer’s not No; it’s Yes. What we need is a computer that isn’t labor-saving but which increases the work for us to do, that puns (this is [Marshall] McLuhan’s idea) as well as Joyce revealing bridges (this is [Norman O.] Brown’s idea) where we thought there weren’t any, turns us (my idea) not on but into artists.²

    In place of the mass efficiency normally presumed to append to computers, here we see Cage bearing witness to the labor required of the artist as well as the engineer, which yields unexpected puns and bridges and, most important of all, creates the mainframe artist—the person creatively engaged with an emerging technology. Cage had an abiding interest in technologies to disclose aspects of sound and its potentials, whether it was his early use of radios and recorded sound or the specialized scientific space of the anechoic chamber.

    More than anyone else, Cage became associated with experimentalism, first in music and then across the ranks of the arts following his time at Black Mountain College in the early 1950s, his classes at the New School for Social Research in the late 1950s, and his prodigious national and international touring. Key to the term experimentalism, as understood by Cage, was the unpredictability of outcome, which could be based in new technology or virtually any other process that removed the author’s choice from the composition process. He explained, The word ‘experimental’ is apt, providing it is understood not as descriptive of an act to be later judged in terms of success and failure, but simply as of an act the outcome of which is unknown.³

    Mainframe Experimentalism is the unexpected outcome of a collaborative investigation by the editors into a little-known computer programming workshop that the composer James Tenney held in New York City for his friends associated with experimental music and art scenes. From 1961 to 1964, Tenney had worked on computer music and psychoacoustics at Bell Labs as a resident artist of sorts, and then at the Polytechnic Institute of Brooklyn as a researcher. Using the example of a study group that John Cage had held on the writings of Buckminster Fuller during the summer of 1967, Tenney decided to demystify and share his programming skills with his friends. The workshop occurred sometime in the fall of 1967, and his friends happened to be a veritable who’s-who of the experimental arts scene in New York: Phil Corner, Dick Higgins, Alison Knowles, Jackson Mac Low, Max Neuhaus, Nam June Paik, and Steve Reich. The workshop was held in the offices of Something Else Press in Chelsea, the home of Knowles and Higgins. As Reich himself would later write, If you think this sounds like a strange context in which to study FORTRAN, you’re right.

    There is only the most scattered and fleeting mention of this workshop in the historical record and, unfortunately, not much of a paper trail and few memories among the participants of what transpired. Even historians specializing in the experimental arts were unfamiliar with the workshop. Tenney’s FORTRAN workshop was, of course, not an isolated activity but part of a wider presence of experimental arts within early computing. New York was in communication with important centers in Stuttgart, Zagreb, London, and Los Angeles. Tenney and Knowles both would be included in Jasia Reichardt’s famous Cybernetic Serendipity exhibition, which premiered at the Institute of Contemporary Arts in London in 1968. The show also included London-based Gustav Metzger, Californian John Whitney, German Frieder Nake, and many others. Mainframe Experimentalism developed through our investigations into these activities and our sense that, while much important work had been done, especially by the younger generation of historians, much was still neglected and a better overall picture should be developed.

    Moving beyond the confines of New York required an expanded sense of the term experimental. E.H. Gombrich, in The Story of Art, characterizes the first half of the twentieth century by the term Experimental Art. In this broader sense, innovations in process and material involve an experimental attitude linking the futures of art to the past through a changing sense of art as linked to artists’ ever-changing worlds. In words that echo Cage’s admonition that the computer not merely entertain but turn us … not ‘on’ but into artists, Gombrich’s 1965 Postscript ascribes to artists and critics a healthy belief in experiments and a less healthy faith in anything that looks abstruse.⁵ Despite his cautionary tone and a clear distaste for virtually everything from abstract art onward, Gombrich’s use of the term experiments coincides precisely with the sensibility of the early digital artists described in this book who keep an open mind and give a chance to new methods which have been proposed.⁶ Referring to the fifteen-year interval since The Story of Art was first published, he describes major technological changes as affecting the artists’ landscape: There was no jet travel when this book came out, no transistor radios, no artificial satellites, and computers were scarcely on the drawing board.

    Another sense of experimental was derived from science and was more closely tied to a specifically technological elaboration of the arts. From 1955 to 1957, Lejaren Hiller, a former chemist, used a mainframe at the University of Illinois to collaborate with Leonard Isaacson on the composition Illiac Suite for String Quartet. The vacuum-tube computer, weighing five tons and with minuscule memory, was used to generate the notation rather than synthesize the sounds, and the overriding purpose of the project was to technologically regenerate existing musical aesthetics, as was described in Hiller and Isaacson’s 1959 book, Experimental Music: Composition with an Electronic Computer.⁸ The Electronic Music Studio he established at the University of Illinois became the training ground for James Tenney as a graduate student (figure 0.1), and, as Hiller’s own aesthetics became more radical, he would collaborate with John Cage on the composition and multimedia extravaganza HPSCHD (1969), the title itself in programming format. Ironically, through his work on HPSCHD, Cage would discover how much labor was involved in working with computers, and, as Branden W. Joseph points out in his essay in this volume, the piece would stand as the apotheosis of Cage’s long-running pursuit of the most advanced technologies or, at the very least, of his doing so on such an ambitious scale.

    Mainframes ran on vacuum tubes and filled entire rooms. They were much too expensive for ownership by anyone except state and military agencies, large corporations, universities, and research centers. They came equipped with their own institutional logic and list of priorities on which the arts did not rank at all. Moreover, they were born from the military exigencies of the Manhattan Project, the IBM punch cards of Holocaust administration, and the ballistic tests and administrative ranks of the Cold War. The American writer Kurt Vonnegut wrote a satirical story in 1950 about the implausibility of human concerns amid such institutional and material culture. The story was centered on a mainframe designed for military purposes that found it had a talent for writing love poetry. The computer was EPICAC, a name punned from ENIAC, the first famous mainframe, and the emetic ipecac. The real ENIAC cost approximately $500,000 in the World War II dollars that paid $2 billion for the Manhattan Project, and covered less than 700 square feet. The fictional EPICAC cost $776,434,927.54 in 1950 dollars, covered over 40,000 square feet, and took up "about an acre on the

    FIGURE 0.1. James Tenney (seated) and Lejaren Hiller at the Electronic Music Studio, University of Illinois, ca. 1961.

    fourth floor of the physics building at Wyandotte College, weighing in at seven tons of electronic tubes, wires, and switches, housed in a bank of steel cabinets and plugged into a 110-volt A.C. line just like a toaster or a vacuum cleaner."

    Unlike a toaster, its main job was to plot the course of a rocket from anywhere on earth to the second button from the bottom of Joe Stalin’s overcoat, if necessary. Or, with his controls set right, he could figure out supply problems for an amphibious landing of a Marine division, right down to the last cigar and hand grenade.⁹ There was a problem for which it had no solution; although it could write reams of love poetry, it would never have the protoplasm required to actually love. It self-destructed trying to compute its way around that fate. By 1968, another famous mainframe with personality, the HAL 9000 in Arthur C. Clarke’s and Stanley Kubrick’s book and film, 2001: A Space Odyssey, had no such internal conflict.

    The uses of real mainframes and minicomputers were dictated by cost-benefit factors determined by military, intelligence, corporate, logistic, and administrative priorities. None of this was lost on the artist Gustav Metzger, who wrote in 1969:

    The first large electronic computers, the ENIAC and EDVAC, were developed under the pressure of the second world war. It is said that the complex calculations needed to show whether a hydrogen bomb was possible took six months on an electronic computer. Without the computer the calculations might never have been made. (J.G. Crowther, Discoveries and Inventions of the 20th Century, London, 1966, p. 68) Let us switch attention to an area that is of particular interest to this Symposium—computers and graphics. The first display devices were special purpose types primarily provided for the military. (Computer News, St. Helier, [sic] Jersey. V. 12, No. 11, November, 1968, p. 3.) As you will know, the first prizewinners in the now annual computer art contest held by Computers and Automation were from a U.S. ballistic group. There is little doubt that in computer art, the true avantgarde is the military.¹⁰

    The capability to which Vonnegut alluded, of pinpointing a specific button on Stalin’s coat, had its precursor during World War II in Norbert Wiener’s mathematical work on targeting mechanisms for antiaircraft guns, systems work that led to Wiener’s development of cybernetics. The animator, inventor, and computer graphics pioneer John Whitney Sr. worked in a Cold War setting in the early 1950s on films on guided missile projects at Douglas Aircraft, and in the late 1950s he sourced parts from World War II antiaircraft directors in his analog computer for making abstract animations. The films produced on this device entranced thousands of habitués of the counterculture and became stock-in-trade in Gene Youngblood’s notion of expanded cinema.¹¹ It would also pave the way to move from analog to digital computing during Whitney’s residency at IBM.

    James Tenney mused that in a better world the arts, rather than military and intelligence projects, would be first priority for mainframe computers, and he held no illusions that classified projects elsewhere at Bell Labs were about to self-destruct in EPICAC fits of love. At the same time, Tenney also recognized that his boss at Bell Labs during the early 1960s, the telecommunications engineer John Pierce, misrepresented the true nature of Tenney’s position to his own bosses in order to carve out an institutional space for Tenney to compose. Still, as the 1960s progressed, antiwar sentiments intensified and institutional clashes became more pronounced. In 1967, not long after the FORTRAN workshop, while addressing faculty members and representatives from the military at the Polytechnic Institute of Brooklyn, Tenney chose to play his Fabric for Ché, based on Che Guevara, who had been captured and killed in Bolivia the same year, as his way to say something about my disgust about the war in Vietnam.¹²

    As Edward Shanken describes in his essay in this volume, Jack Burnham curated the important exhibition Software at the Jewish Museum in New York in 1970, after having been in residence at the Center for Advanced Visual Studies at MIT during 1968-69, where he worked with a time-sharing computer. The artist Hans Haacke contributed a work that used a computer on loan to the museum to profile the visitors to the exhibition via such questions as Should the use of marijuana be legalized, lightly or severely punished? and Assuming you were Indochinese, would you sympathize with the present Saigon regime? However, the computer failed to function. Indeed, the wider availability of minicomputers was concurrent with increasing criticism of corporate profiting in the Vietnam War, pervasive militarism and nuclear proliferation, and technocratic rationalism.

    As an emergent technology, the mainframe computer was a different beast altogether from any technology that artists had worked with in the first half of the twentieth century. For example, the technology of cinema was expensive and access to it was difficult, but its challenges were very modest in comparison to those of mainframe computing; likewise, cinema offered an expansively mimetic palette, whereas computing could barely scratch out its results at the end of a laborious process. Scarce opportunities to work with computers seemed to be slanted in favor of the machines. According to a Time magazine review, despite an audience of forty thousand visitors, Cybernetic Serendipity was a show of One Hand Clapping. Even at its best, the show proves not that computers can make art, but that humans are more essential than ever.¹³

    Indeed, the visual arts and visual research in early computing constituted a particularly volatile mix of influences and conditions. The information aesthetics of Max Bense in Germany exerted a profound influence, as Christoph Klütsch discusses in his thorough essay in this volume on activities in Stuttgart, and they helped usher aesthetics onto a ground of mathematics and technics conducive to computation. The same was true with Abraham Moles and aesthetic perception. While contributing concretely to developments in graphic design and visual display, some computer art encountered difficulty once it left research labs and began interacting with other arts amid the intensity of the cultural and political attitudes of the 1960s, even as it aligned itself with constructivist and abstract tendencies that may themselves have had their own culturally and politically formative moments.

    The value of early computer art, in other words, lay not in where it had come from or what it might become, but in the contestation of what it actually was at a specific historical moment. This contested territory is described clearly in Margit Rosen’s essay in this volume on the New Tendencies group in Zagreb. The group, founded in 1961 to exhibit work that distanced the artist from the creative process through rational, technological, or procedural means, mounted a 1968-69 colloquium, exhibition, and symposium concerned with computers as tools in visual research through which, in Rosen’s words, art was intended to become radically intersubjective, communicable, comprehensible, and reproducible, like a scientific experiment.

    Because of an affinity with the numbers and letters of programmed systems, mainframe music and poetry were more easily aligned with the progressive aesthetics of the time. Technological controls were conducive to the simplicity of code and the latitude of text and musical sound, and, in fact, similar procedures had already been exercised in the histories of formal literature and music. At least since the time of Gottfried Wilhelm Leibniz and the musical dice of Mozart, combinatory and chance processes had provided unpredictability, variation, possibility, and boundless plenitude. In the post-World War II period, information theorists wielded knowing references to the writings of modernist authors such as James Joyce and Gertrude Stein to demonstrate in language the entropic edge of originality versus banality, information versus redundancy, and so forth. Indeed, the technique that Claude Shannon used to demonstrate his Mathematical Theory of Communication (1949) could have easily leapt off the pages of the experimental poetry of the 1950s and '60s: "One opens a book at random and selects a letter at random on the page. This letter is recorded. The book is then opened to another page and one reads until this letter is encountered. The succeeding letter is then recorded. Turning to another page this second letter is searched for and the succeeding letter recorded, etc."¹⁴
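    Shannon's book-opening procedure amounts to sampling letter pairs at random from a source text. A minimal sketch of the same idea, with a short string standing in for the book (the function name and corpus are illustrative, not Shannon's):

```python
import random

def shannon_sample(text, length=40, seed=None):
    """Record a random letter, then repeatedly find a random occurrence
    of the current letter in the text and record the letter after it,
    as in Shannon's book-opening procedure."""
    rng = random.Random(seed)
    out = [rng.choice(text)]
    for _ in range(length - 1):
        current = out[-1]
        # "read until this letter is encountered": pick one occurrence
        # of the current letter (excluding the final position) at random
        positions = [i for i, ch in enumerate(text[:-1]) if ch == current]
        if not positions:
            out.append(rng.choice(text))  # letter never recurs; start over
            continue
        out.append(text[rng.choice(positions) + 1])
    return "".join(out)

corpus = "the machine has composed the piece and the human has composed it "
print(shannon_sample(corpus, 60, seed=1))
```

Because each recorded letter is chosen according to which letters actually follow it in the source, the output reproduces the source's digram statistics while remaining unpredictable, which is exactly the edge between information and redundancy the passage describes.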

    Many of the artists discussed in this book worked with methods and systems that generated unforeseeable results and distanced the subject from authorship. John Cage and Jackson Mac Low both used chance operations and other formal methods long before they worked with computer systems to expedite and manage larger projects, even if, as was the case with Mac Low, the source was the RAND Corporation computer-generated book A Million Random Digits with 100,000 Normal Deviates. Similarly, the artist and poet Emmett Williams developed the rules of the game in 1956 for his poem IBM, completed a decade later. Christopher Funkhouser discusses Williams, among many others, in his valuable survey included in the present volume.

    One of the most distinguished poetical works from the period of mainframes, Tape Mark I (1961), by the Italian poet and novelist Nanni Balestrini, is included in this volume in a new translation with program notes by Staisey Divorski. Balestrini understands this work within a tradition starting in the modernist techniques of Mallarmé and Raymond Roussel, yet it is very much in its own time, interlacing phrases from Lao Tzu’s Tao te ching, a pulp detective novel, and Michihiko Hachiya’s Hiroshima Diary. Likewise, in 1959, Theo Lutz, a student of Max Bense in Stuttgart, culled passages from Franz Kafka’s The Castle that put the machine processing of probabilistic techniques in parallel with the routing systems of social control described in the novel.

    Georges Perec’s The Art of Asking Your Boss for a Raise follows the recursive maze in the frustrating corridors of modern bureaucracy that one must go through to get denied a raise in pay. Perec’s thinking machine was built in close proximity to computers, but never in the same room; the layout was instead manifested in computer-friendly algorithmic narrative and flowchart. Perec was a member of Oulipo, the small organization of writers and mathematicians born from the haunches of Alfred Jarry and the Collège de ‘pataphysique. Oulipo’s interest in computing and literature has appropriately received attention among recent writings on the digital arts; we are pleased to add Bellos’s essay to this understanding (see chapter 2).¹⁵

    In 1966, the physicist and computer display researcher Ken Knowlton worked with one of the quintessential ’60s artists, Stan VanDerBeek, at Bell Labs. Knowlton's early graphic works, along with those of A. Michael Noll, whose trials Grant Taylor has described (see chapter 1), were synonymous with computer art. The art historian Johanna Drucker is candid in her assessment: "The contents of works produced by Ken Knowlton at Bell Labs are shockingly dull images of gulls aloft or a telephone on a table or a female nude rendered in symbol set." Drucker qualifies her statement by reminding us, "As technological developments have stabilized, the experimental features of earlier works have become harder to appreciate for the contributions they made at the time."¹⁶ In her essay for this volume, Gloria Sutton describes in precise detail the factors feeding the visual language of Stan VanDerBeek's computer-animated Poemfields, a new type of animated film that presented poetry in the interstice between cinema and computing, a project he worked on with Knowlton at Bell Labs beginning in 1966. VanDerBeek compared the labor and frustrations to "learning how to draw by pushing a pencil around with your nose."

    The traffic between artists and the mainframes at Bell Labs began musically, because of the musical interests of the engineers John Pierce and Max Mathews, and the acoustical and psycho-acoustical needs of the telephone and telecommunications industry. The composer James Tenney, following Edgard Varèse's enthusiasm for the artistic possibilities of new technologies, sought out the Electronic Music Studio of Lejaren Hiller as a graduate student, and then from 1961 to 1964, at Bell Labs, was the first person to compose digitally synthesized sound into a sustained body of work. The length of his stay allowed him to develop programming skills and a creativity that were otherwise reserved for engineers at the time. Through the 1960s, as engineers from Bell Labs became more active in the New York art world, especially through the organization Experiments in Art and Technology, Tenney connected his friends with Bell Labs, and with the potential of computing in general, through personal introductions and his FORTRAN workshop.

    Among the artists in the FORTRAN workshop was Nam June Paik, one of the irrepressible spirits of experimentalism. Tenney remembered, "Nam June Paik used to call me his guru! He liked playing with hardware and here I was talking software."¹⁷ In the fall of 1967, Paik began working as a residential visitor at Bell Labs with the assistance of A. Michael Noll, and he began transforming earlier ideas about the cybernetics of Norbert Wiener and the media theory of Marshall McLuhan into the difficult practicalities of computing. In fact, the most tangible evidence of his residencies was a modest computer text entitled Confused Rain, in the spirit of Guillaume Apollinaire's calligramme Il Pleut.

    Max Mathews of Bell Labs has quoted John Cage, who said that "if you are surprised with the result, then the machine has composed the piece. If you are not surprised, then you have composed it." I found out, however, that no matter how genial a computer might be, he has no common sense. For example, instead of just saying "Walk," you have to break it down into logical steps, that is: give the weight to the left half of your body, give strength to the muscles below the knee, put the energy to the vector pointing to the sky, making 90 degrees to the earth, move the vector to 160 degrees to the earth, give the energy to the leg in the direction of the earth, using also universal gravitation, stop the movement as soon as the distance between your leg and the earth comes to zero, repeat the above process for your right leg, the right leg meaning your leg on the right side of your body, then repeat the entire process 100 times.

    I decided to title all my computer pieces in French, to protest the lack of common sense in the computer. Verlaine wrote: "It rains in my heart, as it rains in the city." I say: "It rains in my computer, as it rains in my heart." "Il pluit dans mon computeur" will be my first piece. It is the mix of real rain and simulated rain in the computer. My second piece will be called "La Computeur sentimentale," and the third piece, "Aimez-vous FORTRAN-programming?"

    The more it deals with the character of randomness and repetition, the more efficient is the computer. These are the two poles of human artistic materials. Total repetition means total determinism. Total randomness means total indeterminism. Both are mathematically simply explicable. The problem is how to use these two characters effectively. Therein lies the secret for the successful usage of the computer in the creative arts.¹⁸

    Paik later would refocus his technological efforts on the more immediate results obtained by video.

    In another institutional setting, the computer and engineers at Regnecentralen in Denmark allowed plenty of bit-room for the deeply serious and comedic artist Eric Andersen, also a Fluxus participant (see chapter 19). In his work Opus 1966, the poetical tasks of daily life began to be addressed to a computational frame: "Life itself is often defined not only as a phenomenon that reproduces itself but more importantly as a process that allows random mutations to take place." The actual machine behind this sentiment differed from the labor-saving device that many imagined the computer would become. Any such domestic appliance could exist only at multiple levels above the architects of computer programming languages, who argued over punctuation and other functional issues of symbolism; even the poetic constraint in the language of Opus 1966 would have been Olympian among the engineers at Andersen's side. The interaction of actual engineers with artists is often left unacknowledged or represented romantically; to counter this, Andersen cofounded the Union to Protect Data Processing Systems and Machines from Tedious Work.

    In the experimental aesthetic, tedium and boredom need not be avoided. The Fluxus and intermedia artist and theorist Dick Higgins described the fruits of boredom in his classic essay Boredom and Danger, and Claes Oldenburg really wanted to agree: "Boredom is beautiful but it is hard to keep awake."¹⁹ The work of material and social interactions does not have to be yoked to an image of mechanically induced labor, and when Cage hoped for a computer that "increases the work for us to do," he did not necessarily mean work at the computer. Experimentalism focused its aesthetics and poetics on the everyday, on the seemingly banal dismissed in information aesthetics as redundancy and equilibrium, by creating the conditions for "an act, the outcome of which is unknown … anyone has the opportunity of having his habits blown away like dust."²⁰

    Alison Knowles’s The House of Dust began with what she acknowledged as the generosity of James Tenney’s presentation of the arcane and alien workings of FORTRAN to his friends. It seemed to fulfill Cage’s criterion of revealing "bridges … where we thought there weren’t any" by making the transit and translation from one social site to another, from engineering to the arts, from one language to another. In their respective contributions to Mainframe Experimentalism, Benjamin Buchloh and Hannah Higgins follow the probable and impossible houses and inhabitants of The House of Dust from their initial linguistic incarnation in computer printouts and the pages of Cybernetic Serendipity and Fantastic Architecture to materialization in actual structures, performances, and social occasions that reiterate Tenney’s initial act of generosity and interpersonal interaction.²¹ The House of Dust would surely be one of the masterworks of the arts and computation, of the experimental arts, and, indeed, of the art of the period if not for the fact that it intrinsically resists the very notion of mastery.

    The key to Alison Knowles’s going beyond the technological limits of digital computing was her placing of the dedicated output, i.e., the printout of text, amid the ever-changing contingencies of social and poetical practice. This was neither an instrumentalist use of the device nor a formal introduction of chance into otherwise normal operations. Alvin Lucier, in his composition North American Time Capsule 1967, went beyond technological constraint by cracking open the device, in this case the circuits of the Vocoder at Sylvania Applied Research Laboratories, to intercede between normal input and output. The Vocoder was a digital device designed originally at Bell Labs to carry greater amounts of vocal information through the bandwidths of telecommunications infrastructures. As Christoph Cox writes in his essay (chapter 9), Lucier concentrated on the intermediary ability of the device for vocal analysis rather than the input-output repetition of synthesis, processed in the electronic dynamics somewhere between digital encoding and decoding, such that "the vocoder becomes a machine with which to liquidate speech and to abolish the identity of the speaking subject, shattering all syntax and pulverizing every semanteme, morpheme, and phoneme into fluid sonic matter." This was characteristic of the experimental approach within electronic music that discovered generative moments and music where others would find bugs and noise, a welcome embrace of what the composer Pauline Oliveros calls the "negative operant phenomena" of systems.²²

    As minicomputers became more common in universities and technical colleges during the 1960s, these institutions increasingly became arts patrons. This is described very well in Charlie Gere’s essay in this volume on minicomputer experimentalism in the United Kingdom. Gere coedited what should be considered a companion volume to the present effort: White Heat Cold Logic: British Computer Art 1960-1980.²³ Artists sought out pieces of the infrastructure dedicated to their mission, even if that meant time-sharing in the early morning hours and on weekends. Increasingly, in the latter half of the 1960s, computer corporations and corporations housing computers lent machine time for exhibitions and intermittent artistic research. But not until the mid-1970s, when microprocessors were fitted into preassembled, consumer-level computers such as the Altair 8800 and the KIM-1, did computation really begin to make its big leap from restricted or stigmatized institutional access to the kitchen table and consumer culture.

    Conditions have indeed changed. The composer John Chowning has calculated the relative cost of computing, based upon the cost of memory, between an IBM 7090 in 1967, a year of much of the activity chronicled in this book, and his laptop in 2007 (with its 3GB memory and 100GB disk storage). He found that the IBM 7090 would be worth twelve cents in 2007, whereas his laptop would have been worth $58,959,960,937.50 in 1967, far exceeding the outrageous expense of the lovelorn EPICAC.²⁴ The breadth of artistic and cultural practice has likewise increased exponentially and continues to evolve at a rapid pace on an increasing number of digital devices. No matter how accelerated the growth, many of the aesthetic, poetic, social, and political seeds of the digital arts are to be found within Mainframe Experimentalism.
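    Chowning's comparison reduces to a single memory-cost ratio between 1967 and 2007, which can be checked with back-of-the-envelope arithmetic. In the sketch below, only the two large dollar figures come from the text; the 2007 laptop price is a hypothetical assumption used to make the ratio concrete:

```python
# Figures quoted in the text:
laptop_value_in_1967 = 58_959_960_937.50   # Chowning's 2007 laptop, priced at 1967 rates
ibm_7090_value_in_2007 = 0.12              # the IBM 7090, priced at 2007 rates

# Hypothetical assumption (not from the text): a plausible 2007 laptop price.
laptop_price_in_2007 = 2_000.00

# Implied 1967:2007 ratio in cost per unit of memory.
ratio = laptop_value_in_1967 / laptop_price_in_2007

# The same ratio applied in reverse implies the IBM 7090's 1967 price.
implied_ibm_price_in_1967 = ibm_7090_value_in_2007 * ratio

print(f"implied cost ratio: {ratio:,.0f}x")
print(f"implied 1967 IBM 7090 price: ${implied_ibm_price_in_1967:,.2f}")
```

Under that assumed laptop price, the implied 1967 price of the IBM 7090 comes out in the low millions of dollars, which is the right order of magnitude for a 1960s mainframe and suggests Chowning's two figures are internally consistent.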

    NOTES

    John Cage, Diary: Audience 1966, in A Year from Monday (Middletown, CT: Wesleyan University Press, 1967), 50. In 1966, Cage was involved in the intensive art and technology of 9 Evenings: Theatre and Engineering and in the DIY electronics of David Tudor and Gordon Mumma, but had yet to use a computer. Timothy Leary had just used a technological metaphor in his LSD invitation to "turn on, tune in, drop out" when Cage reengineered the metaphor to express how the computer turns people into artists.

    1. Jim Pomeroy, "Capture Images/Volatile Memory/New Montage," in For a Burning World Is Come to Dance Inane: Essays by and about Jim Pomeroy, ed. Timothy Druckrey and Nadine Lemmon (Brooklyn: Critical Press, 1993), 61.

    2. Cage, A Year from Monday, 50.

    3. John Cage, "Experimental Music: Doctrine," in Silence: Lectures and Writings (Middletown, CT: Wesleyan University Press, 1961/1973), 13.

    4. Steve Reich, "Tenney," Perspectives of New Music 25, nos. 1-2 (Winter/Summer 1987): 547-48. As a matter of disclosure,
