Historical Reflections
We Have Never Been Digital
Reflections on the intersection of computing and the humanities.

Thomas Haigh
DOI:10.1145/2644148

This column is inspired by the fashionable concept of the "digital humanities." That will be our destination rather than our starting point, as we look back at the long history of the idea that the adoption of computer technology is a revolutionary moment in human history. Along the way we will visit the work of Nicholas Negroponte and Bruno Latour, whose books Being Digital and We Have Never Been Modern I splice to suggest that we have, in fact, never been digital.

The computer is not a particularly new invention. The first modern computer programs were run in 1948, long before many of us were born. Yet for decades it was consistently presented as a revolutionary force whose imminent impact on society would utterly transform our lives. This metaphor of "impact," conjuring images of a bulky asteroid heading toward a swamp full of peacefully grazing dinosaurs, presents technological change as a violent event we need to prepare for but can do nothing to avert.

Discussion of the looming revolution tended to follow a pattern laid out in the very first book on electronic computers written for a broad audience: Edmund Callis Berkeley's 1949 Giant Brains: Or Machines That Think.[1] Ever since then the computer has been surrounded by a cloud of promises and predictions describing the future world it will produce. The specific machines described in loving detail by Berkeley, who dwelled on their then-novel arrangements of relays and vacuum tubes, were utterly obsolete within a few years. His broader hopes and concerns for thinking machines, laid out in chapters on "what they might do for man" and "how society might control them," remain much fresher.
For example, he discussed the potential for autonomous lawnmowers, automated translation, machine dictation, optical character recognition, an "automatic cooking machine controlled by program tapes," and a system by which "all the pages of all books will be available by machine." "What," he asked, "shall I do when a robot machine renders worthless all the skills I have spent years in developing?"

Computer systems have always been sold with the suggestion that they represent a ticket to the future. One of my favorite illustrations of this comes from 1953, when W.B. Worthington, a business systems specialist, promised at a meeting of his fellows that "the changes ahead appear to be similar in character but far beyond those effected by printing." At that point no American company had yet applied a computer to administrative work, and when they did the results would almost invariably disappoint. The machines needed more people than anticipated to tend them, took longer to get running, and proved less flexible. So why did hundreds of companies rush into computerization before its economic feasibility was established? Worthington had warned that "The first competitor in each industry to operate in milliseconds, at a fraction of his former overhead, is going to run rings around his competition. There aren't many businesses that can afford to take a chance on giving this fellow a five-year lead. Therefore, most of us have to start now, if we haven't started already."[a]

Following his belief that "the ominous rumble you sense is the future coming at us," Worthington was soon to give up his staff job at Hughes Aircraft in favor of a consulting role, promoting his own expertise as a guide toward the electronic future. He had promised that "We can set our course toward push-button administration, and God willing we can get there." Similar statements were being made on the pages of the Harvard Business Review and in speeches delivered by the leaders of IBM and other business technology companies, as a broad social alliance assembled itself behind the new technology.

After this initial surge of interest in computerization during the 1950s there have been two subsequent peaks of enthusiasm. During the late 1970s and early 1980s the world was awash with discussion of the information society, post-industrial society, and the microcomputer revolution. There followed, in the 1990s, a wave of enthusiasm for the transformative potential of computer networks and the newly invented World Wide Web.

[a] W.B. Worthington. "Application of Electronics to Administrative Systems," Systems and Procedures Quarterly 4, 1 (Feb. 1953), 8–14. Quoted in T. Haigh, "The Chromium-Plated Tabulator: Institutionalizing an Electronic Revolution, 1954–1958," IEEE Annals of the History of Computing 23, 4 (Oct.–Dec. 2001), 75–104.

Rupture Talk and Imaginaires

Discussion of the "computer revolution" was not just cultural froth whipped up by the forces of technological change. Instead the construction of this shared vision of the future was a central part of the social process by which an unfamiliar new technology became a central part of American work life. Patrice Flichy called these collective visions "imaginaires" and has documented their importance in the rapid spread of the Internet during the 1990s.[2] Rob Kling, a prolific and influential researcher, wrote extensively on the importance of "computerization movements" within organizations and professional fields.[5]

Historian of technology Gabrielle Hecht called such discussion "rupture talk" in her account of the enthusiasm with which France reoriented its colonial power and engineering talent during the 1950s around mastery of nuclear technology.[4] This formulation captures its central promise: that a new technology is so powerful and far-reaching it will break mankind free of history. Details of the utopian new age get filled in according to the interests, obsessions, and political beliefs of the people depicting it. That promise is particularly appealing to nations in need of a fresh start and a boost of confidence, as France then was, but its appeal seems to be universal. This dismissal of the relevance of experience or historical precedent carries out a kind of preventative strike on those who might use historical parallels to argue that the impact of the technology in question could in fact be slower, more uneven, or less dramatic than promised. Yet this fondness for rupture talk is itself something with a long history around technologies such as electric power, telegraphy, air travel, and space flight.

Enter "The Digital"

One of the most interesting of the cluster of concepts popularized in the early 1990s to describe the forthcoming revolution was the idea of "the digital" as a new realm of human experience. Digital had, of course, a long career as a technical concept within computing. It began as one of the two approaches to high-speed automatic computation back in the 1940s. The new breed of "computing machinery," after which the ACM was named, was called digital because the quantities the computer calculated with were represented as numbers. That is to say they were stored as a series of digits, whether on cog wheels or in electronic counters, and whether they were manipulated as decimal digits or as the 0s and 1s of binary. This contrasted with the better-established tradition of analog computation, a term derived from the word "analogy." In an analog device an increase in one of the quantities being modeled is represented by a corresponding increase in something inside the machine. A disc rotates a little faster; a voltage rises slightly; or a little more fluid accumulates in a chamber. Traditional speedometers and thermometers are analog devices. They creep up or down continuously, and when we read off a value we look for the closest number marked on the gauge.
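To make the distinction concrete, here is a toy sketch, in Python and of my own devising rather than drawn from any of the machines discussed above: it holds a quantity first as a row of discrete decimal digits, as a register built from cog wheels or counters would, and then as a continuously varying magnitude read off against the nearest mark on a gauge.

    # A toy contrast of digital and analog representation. Illustrative
    # only; it models no historical machine.

    def digital_register(value, wheels=4):
        """Hold a quantity as discrete decimal digits, one per 'cog wheel'."""
        return [(value // 10 ** i) % 10 for i in reversed(range(wheels))]

    def analog_gauge(magnitude, marks):
        """Read a continuously varying magnitude against the closest mark."""
        return min(marks, key=lambda m: abs(m - magnitude))

    print(digital_register(1948))           # [1, 9, 4, 8]: exact, digit by digit
    print(analog_gauge(7.4, range(0, 11)))  # 7: the nearest mark, not the value

The digital register is exact by construction, while the analog reading is only as fine as the gauge's markings; that trade-off, at scale, is part of why precision and reliability eventually favored digital machines.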
Throughout the 1950s and 1960s analog and digital computers coexisted. The titles of textbooks and university classes would include the word "analog" or "digital" as appropriate to avoid confusion. Eventually the increasing power and reliability of digital computers and their falling cost squeezed analog computers out of the niches, such as paint mixing, in which they had previously been preferred. Most analog computer suppliers left the industry, although Hewlett-Packard made a strikingly successful transition to the digital world. By the 1970s it was generally no longer necessary to prefix "computer" with "digital," and consequently the word was less frequently encountered in computing circles.

"Digital" acquired a new resonance from 1993, with the launch of the instantly fashionable Wired magazine, just as enthusiasm was building for the information superhighway and the Internet was being opened to commercial use. In the first issue of Wired its editor proclaimed that "the Digital Revolution is whipping through our lives like a Bengali typhoon." Wired published lists of the "Digerati," a short-lived coinage that conservative activist and prophet of unlimited bandwidth George Gilder used to justify something akin to People's list of the sexiest people alive, as judged on intellectual appeal to libertarian techno geeks. The magazine's title evoked both electronic circuits and drug-heightened fervor. As Fred Turner showed in his book From Counterculture to Cyberculture, Wired was one in a series of bold projects created by a shifting group of collaborators orbiting libertarian visionary Stewart Brand.[8] Brand had previously created the Whole Earth Catalog back in the 1960s and a pioneering online community known as the WELL (Whole Earth 'Lectronic Link) in the 1980s. His circle saw technology as a potentially revolutionary force for personal empowerment and social transformation.
In the early 1990s this vision held together an unlikely alliance, from Newt Gingrich, who as House Speaker suggested giving laptops to the poor rather than welfare payments, to the futurist Alvin Toffler, U.S. Vice President Al Gore, who championed government support for high-speed networking, and Grateful Dead lyricist John Perry Barlow, who had founded the Electronic Frontier Foundation to make sure the new territory of "cyberspace" was not burdened by government interference.

One of the magazine's key figures, Nicholas Negroponte, was particularly important in promoting the idea of "the digital." Negroponte was the entrepreneurial founder and head of MIT's Media Lab, a prominent figure in the world of technology whose fame owed much to a book written by Brand. Negroponte took "digital" far beyond its literal meaning to make it, as the title of his 1995 book Being Digital suggested, the defining characteristic of a new way of life. This was classic rupture talk. His central claim was that in the past things "made of atoms" had been all important. In the future everything that mattered would be "made of bits."

As I argued in a previous column, all information has an underlying material nature.[3] Still, the focus on digital machine-readable representation made some sense: the computer is an exceptionally flexible technology whose applications gradually expanded from scientific calculation to business administration and industrial control to communication to personal entertainment as its speed rose and its cost fell. Each new application meant representing a new aspect of the world in machine-readable form. Likewise, the workability of modern computers depended on advances in digital electronics and on conceptual developments in coding techniques and information theory. So stressing the digital nature of computer technology is more revealing than calling the computer an "information machine."

Here is a taste of Being Digital: "Early in the next millennium, your left and right cuff links or earrings may communicate with each other by low-orbiting satellites and have more computer power than your present PC. Your telephone won't ring indiscriminately; it will receive, sort, and perhaps respond to your calls like a well-trained English butler. Mass media will be refined by systems for transmitting and receiving personalized information and entertainment. Schools will change to become more like museums and playgrounds for children to assemble ideas and socialize with children all over the world. The digital planet will look and feel like the head of a pin. As we interconnect ourselves, many of the values of a nation-state will give way to those of both larger and smaller communities. We will socialize in digital neighborhoods in which physical space will be irrelevant and time will play a different role. Twenty years from now, when you look out of a window what you see may be five thousand miles and six time zones away…"

Like any expert set of predictions, this cluster of promises extrapolated social and technological change to yield a mix of the fancifully bold, the spot-on, and the overly conservative. Our phones do support call screening, although voice communication seems to be dwindling. Online communities have contributed to increased cultural and political polarization. Netflix, Twitter, blogs, and YouTube have done more than "refine" mass media.
As for those satellite cuff links, well, the "Internet of Things" remains a futuristic vision more than a daily reality. As the career of the "cashless society" since the 1960s has shown, an imaginaire can remain futuristic and exciting for decades without ever actually arriving.[b] However, when the cuff links of the future do feel the need to communicate, they seem more likely to chat over local mesh networks than over precious satellite bandwidth. This prediction was perhaps an example of the role of future visions in promoting the interests of the visionary. Negroponte was then on the board of Motorola, which poured billions of dollars into the Iridium network of low-earth-orbit satellites for phone and pager communication. That business collapsed within months of its 1998 launch, and plans to burn up the satellites to avoid leaving space junk were canceled only after the U.S. Department of Defense stepped in to fund their continued operation.

[b] A phenomenon I explore in more detail in B. Bátiz-Lazo, T. Haigh, and D. Stearns, "How the Future Shaped the Past: The Case of the Cashless Society," Enterprise and Society 15, 1 (Mar. 2014).

Eroding the Future

Of course we never quite got to the digital future. My unmistakably analog windows show me what is immediately outside my house. Whether utopian or totalitarian, imagined future worlds tend to depict societies in which every aspect of life has changed around a particular new technology, or everyone dresses in a particular way, or everyone has adopted a particular practice. But in reality, as new technologies are assimilated into our daily routines they stop feeling like contact with an unfamiliar future and start seeming like familiar objects with their own special character. If a colleague reported that she had just ventured into cyberspace after booking a hotel online, or was considering taking a drive on the information superhighway to send email, you would question her sincerity, if not her sanity. These metaphors served to bundle different uses of information technology into a single image and to distance them from our humdrum lives. Today, we recognize that making a voice or video call, sending a tweet, reading a Web page, or streaming a movie are distinct activities with different meanings in our lives even when achieved using the same digital device.

Sociologist Bruno Latour, a giant in the field of science studies, captured this idea in the title of his 1993 book We Have Never Been Modern, published just as Negroponte began to write his columns for Wired. Its thesis was that nature, technology, and society have never truly been separable, despite the Enlightenment and the Scientific Revolution, in which their separation was defined as the hallmark of modernity. Self-proclaimed "moderns" have insisted vocally on these separations while in reality hybridizing them into complex socio-technical systems. Thus, he asserts, "Nobody has ever been modern. Modernity has never begun. There has never been a modern world."[6]

Latour believed that "moderns," like Negroponte, see technology as something external to society yet also as something powerful enough to define epochs of human existence.
As Latour wrote, "the history of the moderns will be punctuated owing to the emergence of the nonhuman—the Pythagorean theorem, heliocentrism…the atomic bomb, the computer…. People are going to distinguish the time 'BC' and 'AC' with respect to computers as they do the years 'before Christ' and 'after Christ'."

He observed that the rhetoric of revolution has great power to shape history, writing that "revolutions attempt to abolish the past but they cannot do so…" Thus we must be careful not to endorse the assumption of a historical rupture as part of our own conceptual framework. "If there is one thing we are incapable of carrying out," Latour asserted, "it is a revolution, whether it be in science, technology, politics, or philosophy.…"

Our world is inescapably messy, a constant mix of old and new in every area of culture and technology. In one passage Latour brought things down to earth by discussing his home repair toolkit: "I may use an electric drill, but I also use a hammer. The former is 35 years old, the latter hundreds of thousands. Will you see me as a DIY expert 'of contrasts' because I mix up gestures from different times? Would I be an ethnographic curiosity? On the contrary: show me an activity that is homogenous from the viewpoint of the modern time."

According to science fiction writer William Gibson, "The future is already here—it's just not very evenly distributed."[c] That brings me comfort as a historian because of its logical corollary: that the past is also mixed up all around us and will remain so.[d] Even Negroponte acknowledged the uneven nature of change. In his final column for Wired, at the end of 1998, he noted that "digital" was destined for banality and ubiquity, as "Its literal form, the technology, is already beginning to be taken for granted, and its connotation will become tomorrow's commercial and cultural compost for new ideas. Like air and drinking water, being digital will be noticed only by its absence, not its presence."[7]

[c] The sentiment is Gibson's, although there is no record of him using those specific words until after they had become an aphorism. See http://quoteinvestigator.com/2012/01/24/future-has-arrived/.

[d] Gibson himself appreciates this, as I have discussed elsewhere: T. Haigh, "Technology's Other Storytellers: Science Fiction as History of Technology," in Science Fiction and Computing: Essays on Interlinked Domains, D.L. Ferro and E.G. Swedin, Eds., McFarland, Jefferson, N.C., 2011, 13–37.

Digital Humanities

Even after once-unfamiliar technologies dissolve into our daily experience, rupture talk and metaphors of revolution can continue to lurk in odd and unpredictable places. While we no longer think of the Internet as a place called "cyberspace," the military-industrial complex seems to have settled on "cyber warfare" as the appropriate name for online sabotage. Likewise, the NSF has put its money behind the idea of "cyberinfrastructure." The ghastly practice of prefixing things with an "e" has faded in most realms, but "e-commerce" is hanging on. Like most other library schools with hopes of continued relevance, my own institution has dubbed itself an "iSchool," copying the names of Apple's successful consumer products. There does not seem to be any particular logic behind this set of prefixes, and we might all just as well have settled on "iWarfare," "cybercommerce," and "e-school." But these terms will live on, vestiges of a crisp future vision that destroyed itself by messily and incompletely coming true.
The dated neologism I have been hearing more and more lately is "the digital humanities." When I first heard someone describe himself as a "digital historian," the idea that this would be the best way to describe a historian who had built a website seemed both pretentious and oddly outdated. Since then, however, a wave of enthusiasm for "the digital" has swept through humanities departments nationwide.

According to Matthew Kirschenbaum, the term "digital humanities" was first devised at the University of Virginia back in 2001 as the name for a mooted graduate degree program. Those who came up with it wanted something more exciting than "humanities computing" and broader than "digital media," two established alternatives. It spread widely through the Blackwell Companion to the Digital Humanities, issued in 2004. As Kirschenbaum noted, the reasons behind the term's spread have "primarily to do with marketing and uptake," and it is "wielded instrumentally" by those seeking to further their own careers and intellectual agendas. In this, humanists are not so different from Worthington back in the 1950s, or from Negroponte and his fellow "digerati" in the 1990s, though it is a little incongruous that they appropriated "the digital" just as he was growing tired of it.

The digital humanities movement is a push to apply the tools and methods of computing to the subject matter of the humanities. I can see why young humanists trained in disciplines troubled by falling student numbers, a perceived loss of relevance, and the sometimes alienating hangover of postmodernism might find something liberating and empowering in the tangible satisfaction of making a machine do something. Self-proclaimed digital humanists have appreciably less terrible prospects for employment and grant funding than humanists of the fusty analog variety. As Marge Simpson wisely cautioned, "don't make fun of grad students. They just made a terrible life choice."

It is not clear exactly what makes a humanist digital. My sense is the boundary shifts over time, as one would have to be using computers to do something most of one's colleagues did not know how to do. Using email or a word processing program would not qualify, and having a homepage will no longer cut it. Installing a Web content management system would probably still do it, and anything involving programming or scripting definitely would. In fact, digital humanists have themselves been arguing over whether a humanist has to code to be digital, or whether writing and thinking about technology would be enough. This has been framed by some as a dispute between the virtuous modern impulse to "hack" and the ineffectual traditional humanities practice of "yack."
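For concreteness, the scripting in question can be very modest. The sketch below, a hypothetical Python example with an invented directory name rather than a real corpus, counts word frequencies across a folder of plain-text sources: text mining in its simplest form.

    # A minimal, hypothetical text-mining script: word frequencies across
    # a folder of plain-text files. The "corpus" directory is invented.
    import re
    from collections import Counter
    from pathlib import Path

    counts = Counter()
    for doc in Path("corpus").glob("*.txt"):  # e.g., transcribed sources
        text = doc.read_text(encoding="utf-8").lower()
        counts.update(re.findall(r"[a-z']+", text))  # crude tokenization

    for word, n in counts.most_common(20):
        print(word, n)

Whether a dozen such lines should confer a new disciplinary identity is, of course, exactly what the "hack" versus "yack" argument is about.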
As someone who made a deliberate (and economically rather perverse) choice to shift from computer science to the history of technology after earning my first master's degree, I find this glorification of technological tools a little disturbing. What attracted me to the humanities in the first place was the promise of an intellectual place where one could understand technology in a broader social and historical context, stepping back from a culture of computer enthusiasm that valued coding over contemplating and technological means over human ends.

There is a sense in which historians of information technology work at the intersection of computing and the humanities. Certainly we have attempted, with rather less success, to interest humanists in computing as an area of study. Yet our aim is, in a sense, the opposite of the digital humanists': we seek to apply the tools and methods of the humanities to the subject of computing (a goal shared with newer fields such as "platform studies" and "critical code studies"). The humanities, with their broad intellectual perspective and critical sensibility, can help us see beyond the latest fads and think more deeply about the role of technology in the modern world. Social historians have done a great job examining the history of ideas like "freedom" and "progress," which have been claimed and shaped in different ways by different groups over time. In the history of the past 60 years ideas like "information" and "digital" have been similarly powerful, and they deserve similar scrutiny. If I were a "digital historian," whose own professional identity and career prospects came from evangelizing for "the digital," could I still do that work?

There are many ways in which new software tools can contribute to teaching, research, and dissemination across disciplines, but my suspicion is that the allure of "digital humanist" as an identity will fade over time. It encompasses every area of computer use (from text mining to 3D world building) over every humanities discipline (from literary theory to classics). I can see users of the same tools in different disciplines finding an enduring connection, and likewise users of different tools in the same discipline. But the tools most useful to a particular discipline, for example the manipulation of large text databases by historians, will surely become part of the familiar scholarly tool set, just as checking a bank balance online no longer feels like a trip into cyberspace. Then we will recognize, to adapt the words of Latour, that nobody has ever been digital and there has never been a digital world. Or, for that matter, a digital humanist.

Further Reading

Gold, M.K., Ed. Debates in the Digital Humanities. University of Minnesota Press, 2012. Also at http://dhdebates.gc.cuny.edu/. Broad coverage of the digital humanities movement, including its history, the "hack vs. yack" debate, and discussion of the tension between technological enthusiasm and critical thinking.

Gibson, W. Distrust That Particular Flavor. Putnam, 2012. A collection of Gibson's essays and nonfiction, including his thoughts on our obsession with the future.

Latour, B. Science in Action: How to Follow Scientists and Engineers through Society. Harvard University Press, 1987; and B. Latour and S. Woolgar, Laboratory Life: The Construction of Scientific Facts. Princeton University Press, 1986. We Have Never Been Modern is not the gentlest introduction to Latour, so I suggest starting with one of these clearly written and provocative studies of the social practices of technoscience.

Marvin, C. When Old Technologies Were New: Thinking About Electric Communication in the Late Nineteenth Century. Oxford University Press, 1988. The hopes and fears attributed to telephones and electric light when they were new provide a startlingly close parallel with the more recent discourse around computer technology.

Morozov, E. To Save Everything, Click Here. Perseus, 2013. A "digital heretic" argues with zest against the idea of the Internet as a coherent thing marking a rupture with the past.

Winner, L. The Whale and the Reactor: A Search for Limits in an Age of High Technology. University of Chicago Press, 1986. A classic work in the philosophy of technology, including a chapter, "Mythinformation," probing the concept of the "computer revolution."
References

1. Berkeley, E.C. Giant Brains or Machines That Think. Wiley, New York, 1949.
2. Flichy, P. The Internet Imaginaire. MIT Press, Cambridge, MA, 2007.
3. Haigh, T. Software and souls; programs and packages. Commun. ACM 56, 9 (Sept. 2013), 31–34.
4. Hecht, G. Rupture-talk in the nuclear age: Conjugating colonial power in Africa. Social Studies of Science 32, 6 (Dec. 2002).
5. Kling, R. Learning about information technologies and social change: The contribution of social informatics. The Information Society 16, 3 (July–Sept. 2000), 217–232.
6. Latour, B. We Have Never Been Modern. Harvard University Press, Cambridge, MA, 1993.
7. Negroponte, N. Beyond digital. Wired 6, 12 (Dec. 1998).
8. Turner, F. From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. University of Chicago Press, Chicago, 2006.

Thomas Haigh (thaigh@computer.org) is an associate professor of information studies at the University of Wisconsin, Milwaukee, and chair of the SIGCIS group for historians of computing.

Copyright held by author.