The Innovators: How A Group Of Hackers, Geniuses, And Geeks Created The Digital Revolution, by Walter Isaacson
After writing a biography of Steve Jobs, the late and enigmatic leader of Apple’s drive for aesthetic and technological superiority over its competitors, it makes sense that the author would turn to a collaborative and somewhat dishy book dealing with the sprawling subject of technological innovation in computing, from the early nineteenth century to the contemporary period. The approach has considerable merit, not least because the author is knowledgeable about the subject and has done a great deal of research into it, and because he has some serious points to make about the culture of innovation that are well worth understanding and appreciating. Although I must admit I found some aspects of this picture of innovation unappealing, I did find it instructive how the author discussed the culture of innovation that led to the digital revolution and why it appears (with reason) that this culture acts in such a biased and leftist fashion in enforcing community standards. For that alone this book is worth the hefty read, even if one has little interest in the people who have been responsible for the digital revolution over the decades.
This book consists of twelve chapters over nearly 500 pages of text that seek to convey the sweeping epic of technological innovation in the computing world. The author begins with Ada Lovelace, daughter of Lord Byron, and explores her personal life and her contributions to computing along with those of her contemporaries like Babbage (1). After that the author moves on to the development of the computer and the various controversies over priority of invention (something that continues through the course of the book) as well as the insights of people like Turing (2). The author then looks at early programmers, pointing to the role of women in these efforts (3). The author returns to hardware concerns with a chapter on transistors (4) and the microchip (5), examining the way that multiple parties were seeking the same solutions simultaneously and working collaboratively. This leads to a short chapter on video games that looks at the early role of Atari (6), before the author discusses the public-private collaborations that led to the internet (7). The author discusses various struggles to develop the personal computer (8) and the savvy recognition on the part of Bill Gates that software would be important in making it work (9). The author then looks at the importance of AOL in helping people get online (10) and the early development of the World Wide Web (11), which has led to a world where human beings and computers can work together profitably, perhaps for a long while to come (12).
There are a few valuable insights that this book provides. The author offers a defense of Al Gore’s role in promoting the internet, discusses the vital importance of left-wing counterculture elements in fostering an environment that encouraged personal computing, and examines fights over the question of community property or corporate profit that have surfaced time and time again. The author makes a strong case for a role for government in creating public goods like the infrastructure of the internet that can serve the well-being of people in the United States and around the world. The author’s discussion of the marked left-wing bias of many involved in the world of Silicon Valley and personal computing in general can help the reader understand the politically biased behavior of companies like Facebook, Google, and Twitter, which routinely attack conservative perspectives while turning a blind eye to leftist excesses. By wrestling with the political nature of innovation, the author has done the reader a great service, even if his definition of innovation is quite a bit narrower than one would want.