Gregory E. Reynolds
Ordained Servant: November 2019
Also in this issue
Conflict Resolution in the Church, Part 1
by Alan D. Strange
Beza on the Trinity, 21 Theses, Part 2 (1–15)
by David C. Noe
A Way Out and a Way To: Intertextuality and the Exodus Motif: A Review Article
by Meredith M. Kline
Memory and Hope: Aleksandr Solzhenitsyn and the Challenge of Exile: A Review Article
by John N. Somerville Jr.
by Eutychus II
by Anne MacDonald
How the Internet Happened: From Netscape to the iPhone, by Brian McCullough. New York: Liveright, 2018, 372 pages, $28.95.
The technological change over the past three decades is nothing short of phenomenal; in fact, it is a phenomenon like no other. Over my years of lecturing on this topic, many people have asserted that this change is no different from other historical changes initiated by inventions such as the printing press and the telephone. However, Canadian scholar Arthur Boers observes that modern technological change is unique in five ways.[1] 1) Change is occurring at an unprecedented rate, leaving little time to adapt discerningly, and thus technology is overpowering culture. By contrast, the change from handwritten manuscripts to the printed word took several centuries. 2) Change is artificial, separating us from nature and the real world. Matthew Crawford demonstrates the importance of integrating manual and mental competence for living in the actual world.[2] Wendell Berry contends that the Bible is an “outdoor book.”[3] 3) Change is pervasive, dominating everything from communication to irons, restaurants to family. It tends to intrude on vacations and the Sabbath. 4) Change is not related to personal skills; rather, it is marked by such things as self-driving cars and automated airplanes. In contrast, on January 15, 2009, Captain Chesley “Sully” Sullenberger landed an Airbus A320 in New York’s freezing Hudson River with human skill that no automated system, at least at the time, could replicate. 5) Change demands universal conformity, tending to eradicate the unique, local, and diverse. The title of James Howard Kunstler’s book emphasizes this point: The Geography of Nowhere: The Rise and Decline of America’s Man-Made Landscape.[4]
How the Internet Happened is a fascinating narrative that carefully documents the dramatic change in the American social structure and economy generated by computing connected to the Internet. No technology in history has had such a sudden and pervasive impact on culture. The book acquaints us with the actual history that forms the backdrop to the dramatic TV series “Halt and Catch Fire.” While this history is largely descriptive, its detailed coverage of the Internet’s development and effects is of considerable assistance in forming a critical assessment.
McCullough begins by stating that
The Internet is the reason that computers actually became useful for the average person. . . . that is what this book is about: how the web and the Internet allowed computers to infiltrate our everyday lives. . . . It is about how we allowed these technologies into our lives, and how these technologies changed us. (3)
There are many interesting factual surprises throughout the narrative, like learning that the “i” in Apple products is not the tribute to the individual I always thought it was, but rather refers to the Internet, since prior to the iMac, Apple was a losing player on the Internet. But the “innovative and beautifully crafted computers” (208–9) designed by Jonathan Ive were part of Steve Jobs’s overhaul of Apple that saved the company at the turn of the century.
Some of the earliest computers were built at the University of Illinois at Urbana-Champaign in 1951. The linking of computers took place in the highly technical world of the US government and academic research in 1969, when the ARPANET linked four academic nodes together to become the grandfather of the present Internet. In the early 1990s the World Wide Web began to democratize the Internet by providing a graphical user interface (GUI) and connecting households, and early tools such as Gopher enabled computers to search for sites (3–4). In 1995 Netscape, a simpler means of navigating the Net, “was the big bang that started the Internet Era” (8). Prior to Netscape the Mosaic browser transformed “the Internet into a workable web . . . instead of an intimidating domain of nerds” (16). Between the launch of Mosaic and its morphing into Netscape, the number of websites expanded from hundreds to tens of thousands (14). McCullough reminds us of the vastness of the change that has taken place since then, commenting: “Today, the phone in your pocket is more powerful than every computer involved in the moon landing” (8).
Along the way McCullough goes into great detail describing the various inventors and investors who made the modern Internet a reality. Coding geniuses were cranking out new web products on Netscape in a single day and finding “hundreds and thousands of users the next” (31). One of the main themes of the history is the millions of dollars that were suddenly being made by web companies. Beginning in 1995 the dot-com era ushered in a host of overnight millionaires the likes of which Wall Street had never seen (35). The competitive drama became intense, as seen in the opening sentence of chapter 2 (“Bill Gates ‘Gets’ the Internet”), which asserts, “Netscape was right to fear Microsoft” (38). Microsoft’s Windows 95 operating system, connected to the web through Internet Explorer, proved an unbeatable combination. The goal was to make the web as mainstream as TV. The information superhighway was becoming a reality. The web, by contrast with TV, allowed users not only to consume but also to create content. Because Microsoft bundled Internet Explorer with every Windows machine, Netscape was outmaneuvered and soon to become extinct (38–51). Demonstrating how ruthless the competition could be, Microsoft threatened legal action when Compaq replaced Internet Explorer with Navigator on some of its models (52).
McCullough continues his narrative with a history of the development of early online services such as America Online (AOL). The discovery that people wanted to interact with each other, especially in sharing special interests, proved revolutionary, as chat rooms and electronic mail became popular (55). The advent of actual pictures on AOL in the nineties reminds us of how rapidly electronic communication developed. Meanwhile, millions were paying monthly fees for Internet access.
In chapter 4 McCullough explores big media’s discovery and use of the web. The main challenge was how to make money by providing online content (75). Print media like Wired and Rolling Stone embraced the web as the means to technological utopia (75–6). Advertising, a centuries-old business model, has proved to be largely the way online content is paid for (79). This in turn brings another major theme of Internet history to the fore: attention. Much of the web and its software is designed to capture and keep our attention (81). Chapter 6 explores the development of e-commerce. Physical reality kicks in. Jeff Bezos started with books. His idea was to become a profitable intermediary, through the computer network, between buyers and sellers of goods (95). Amazon soon became one of the largest companies in the world.
Chapter 5 deals with the importance of search engines. What we take for granted was not obvious or easy to invent. Google has become a verb due to its dominating search power. It opens up the world to us, but not in a neutral way, as we shall see. Notice that the ads that come with most applications can only be eliminated for a fee; it’s almost all about commerce. But what is remarkable about the ethos of this new reality is the combination McCullough observes: “Silicon Valley has always been equal parts egghead libertarianism and acid-tinged hippie romanticism” (108). Access to the proper means of liberation will set us all free, as the Whole Earth Catalog promised my generation of the counterculture. The Internet search engine simply enhanced this possibility exponentially (122). The irony is that even what appears to be free content contradicts the basic tenet of the romantic, because free web services “make their money by whoring out our personal information to marketers and advertisers” (130). I have warned people for decades that one of the hidden dangers of social networks is that they make people surreptitiously participate in the largest focus group in history.
Fred Turner, author of From Counterculture to Cyberculture, summed up the philosophy of Stewart Brand, the editor of the Whole Earth Catalog, who
“suggested that computers might become the new LSD, a new small technology that could be used to open minds and reform society.” Indeed, Steve Jobs came up with the name “Apple Computer” from living in an acid-infused community at an Oregon apple orchard.[5]
The aspiration of Silicon Valley gurus is to change the world according to their vision of the way the world ought to be. Absent a biblical anthropology and worldview, this is a dangerous project indeed.
Chapter 11 covers the development and dominance of Google. Recognizing the importance of relevance in web search was revolutionary in terms of the power of finding what one is looking for. The gathering and appropriate ordering of search results was key. The search engine developed at Stanford University was named Google, after the word “googol,” which means a 1 followed by 100 zeros (189). As hard drive capacities grew, so did search capability.
McCullough also explores the dot-com era and the bubble that burst in the early part of this century. The gold-rush frenzy of new dot-com company IPOs caused investment in companies that made no profit. The word “Internet” had achieved an almost magical power of attraction, often blinding investors to these companies’ lack of profit. Chapters 8 and 9 tell the sad story with a happy ending: many wild speculative investments failed, but the Internet forged ahead (180). In 1999 Time named Amazon’s Jeff Bezos its Person of the Year (157). But to grasp the profits that were actually made, consider this: through the 1990s AOL’s stock appreciated 80,000%, and that’s not a typo (167–9).
“Mix, Rip, and Burn” (chapter 12) demonstrates the fruition of Apple’s aspiration to make its computers digital hubs (209). Digitizing music through iTunes and the mobile iPod proved to be the doorway to success (210–11). “Jobs was convinced that ease of use and customer choice were the key to competing with the lure of the free” (213). Linking iTunes and iPods to Windows put Apple on the path to becoming the most profitable company in the world (214).
Every other chapter throughout this book deals with the commercial aspect of the Internet. Chapter 13 covers the inception of virtual banking vehicles like PayPal. Meanwhile, when Google went searching for greater profitability, it discovered the importance of getting companies to pay for search result priority (229–32). Suddenly search results became a consequence not of users’ search priorities but of paying advertisers.
Then there’s Web 2.0 (2004). Now the more personal and democratizing dimension of the web’s potential came to the fore with weblogs (blogs), Wikipedia, YouTube, and the social networks. The idea of participation dominated, and the creation of web content became pervasive on the Internet (255). “Web 2.0 was about people expressing themselves—actually being themselves, actually living—online.” The boundary between online and real life was blurred and broken (258–59).
Enter the social network (chapter 15) and its biggest player, Mark Zuckerberg.[6] While developing his social network ideas and skills at Harvard, he was already accustomed to stealing content and violating privacy. One of his inventions for Harvard students, Facemash, was shut down because he had stolen student profile pictures from Harvard’s internal network. Zuckerberg was placed on probation then (267), and now, years later, he seems to be under the scrutiny of Congress. In the early days of Facebook, observing the server logs and discovering user behavior enabled Zuckerberg and others to call what they observed “the trance” (281). To put it crudely, cultivating addiction is the best way to increase profits. McCullough observes that the genius of Zuckerberg’s discovery is that “finding out what is happening with your friends and family is a core human desire, right smack in the middle of Maslow’s hierarchy of needs” (283). The “Facebook trance” led to the proliferation of content, including features like News Feed (288), which included a certain degree of biased curation.
Chapters 16–17 conclude the body of the book with an exploration of the rise of mobile media, especially the iPhone. The PalmPilot was the first mobile device to be widely used. Then the BlackBerry moved into first place with the slogan “Always on. Always connected” (299). I can still remember my adult children doing business on vacation with their BlackBerries. Of course, there were more limited early mobile devices like pagers and MP3 players, but the BlackBerry was the first true “heroin of mobile computing” (300). I say first because the smartphone outpaced them all.
The history of the iPhone’s development within Apple is fascinating, and I’ll leave that topic to interested readers. The combination of Jonathan Ive’s stunningly elegant design, enormous computing power, and connection to the App Store and iTunes makes the iPhone, which is far more than a mere phone, a large component of the culture-changing influence of electronic media. “Rather than arrive too soon, the smartphone+social media represented a moment when two world-changing technologies arrived at just the right moment” (320).
McCullough’s conclusion, “Outro,” sounds a warning about the lofty utopianism that has fueled much of the Internet’s development. J. C. R. Licklider, an early developer of the ARPANET, wrote a philosophically foundational paper titled “Man-Computer Symbiosis,” in which he asserts: “Preliminary analyses indicate that the symbiotic partnership will perform intellectual operations more effectively than man alone can perform them” (322). After summing up what the Internet Era has astonishingly accomplished, McCullough asks: “But are we better off? Are we truly thinking as no human brain has ever thought, just as Licklider supposed? That’s the open-ended question as the Internet Era continues” (323). This question reminds me of the first message sent by telegraph, “What hath God wrought!” I have proposed turning it into a question: “What hath God wrought?” We need to be like those of David’s troops who were “men who had understanding of the times, to know what Israel ought to do” (1 Chron. 12:32). “Do not be conformed to this world, but be transformed by the renewal of your mind, that by testing you may discern what is the will of God, what is good and acceptable and perfect” (Rom. 12:2). This comprehensive history can aid the technology navigator in wise stewardship of the new environment in which we find ourselves. I highly recommend this book.
[1] Arthur Boers, “Open the Wells of Grace and Salvation: Creative and Redemptive Potential of Technology in Today’s Church” (lecture at the conference From the Garden to the Sanctuary: The Promise and Challenge of Technology, Gordon-Conwell Theological Seminary, June 6, 2013).
[2] Matthew B. Crawford, The World beyond Your Head: On Becoming an Individual in an Age of Distraction (New York: Farrar, Straus and Giroux, 2015).
[3] Wendell Berry, “Christianity and the Survival of Creation,” in Sex, Economy, Freedom, and Community (New York: Random House, 1993); reprinted in Cross Currents 43, no. 2 (Summer 1993): 149, https://www.crosscurrents.org/berry.htm.
[4] James Howard Kunstler, The Geography of Nowhere: The Rise and Decline of America’s Man-Made Landscape (New York: Free Press, 1994).
[5] Jon Askonas, “How Tech Utopia Fostered Tyranny,” The New Atlantis, no. 57 (Winter 2019): 6–7; quoting Fred Turner, From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism (Chicago: University of Chicago Press, 2006).
[6] See my review of the film The Social Network, “Dis-integrated?: The Social Network,” Ordained Servant (2010): 62–66, https://opc.org/os.html?article_id=222.
Gregory E. Reynolds is pastor emeritus of Amoskeag Presbyterian Church (OPC) in Manchester, New Hampshire, and is the editor of Ordained Servant. Ordained Servant, November 2019.