Walter Isaacson profiles the people who made the digital revolution happen, but neglects others.
I recently purchased Walter Isaacson’s "The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution" because I wanted to learn more about the events and personalities of an important period in recent computer history. But some of the people missing from the text raise questions.
Focusing mainly, but not exclusively, on the period starting in the 1960s, the author covers the key developments of the era in chapters on each of its building blocks: software, transistors, microchips, video games, the Internet, personal computing, online services, and the Web.
The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution
(Source: Simon & Schuster)
In these discussions Isaacson profiles the personalities who created our current digital revolution including Vannevar Bush, Alan Turing, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee, and Larry Page. He explores how their minds worked and what made them so inventive, but the book is also about the collaboration and teamwork that made them “masters of innovation.”
Overall, “The Innovators” will appeal to anyone who wants to learn about the major players who made it all possible. You may not agree with Isaacson's choices or the conclusions he draws about innovation; I didn’t. But that is precisely why you should read it.
Isaacson devotes about 100 pages of this 450-page book to Steve Jobs of Apple and Bill Gates of Microsoft; Gordon Moore and Bob Noyce of Intel occupy another 50 pages. He covers other innovators with varying degrees of completeness, giving me the impression that he allotted space according to how important he felt each person was, which makes sense, I suppose. Although he touches on the innovations of a number of people I was impressed with when I met them during that period (among them Douglas Engelbart, Marc Andreessen, Ted Hoff and Stanley Mazor, and Gary Kildall), Isaacson doesn’t give them the detailed coverage I think they deserve.
Douglas Engelbart, for example, is responsible for much that is associated with not only personal computers, but mobile phones and smartphones as well: the computer mouse, the graphical user interface, on-screen images and icons, multiple windows on a screen, email, instant messaging, hypertext linking, digital publishing, blog-like journals, Wiki-like collaboration, document sharing, document formatting, and Skype-like video conferencing.
Douglas Engelbart and the mouse that made personal computing happen.
(Source: John Storey for TIME)
Andreessen wrote the open-source, government-funded Mosaic Web browser from which all modern browsers are derived, including Microsoft's Internet Explorer. He went on to co-found Netscape Inc. to create a commercial version of Mosaic, which dominated the early days of Internet-connected desktops before Microsoft woke up and took that market back with Internet Explorer, itself based on Andreessen's original Mosaic.
Hoff and Mazor designed Intel's first commercial microprocessor, the 4004. The 4004 and its successors — such as the 8008, 8080, and the x86 architecture — transformed Intel from mainly a memory manufacturer into the processor powerhouse it is today.
I'm puzzled by the people left out of "The Innovators." Obviously he had to make choices, but I came away with the feeling that he focused mainly on the winners as the best place to delve into the nature of innovation. Though some of these left-out innovators were on the losing end of the era's marketing and product clashes, their experiences could have taught us a lot about innovation as well. Was it something missing in them that made them fail, or was it just the luck of the draw?
For example, there is no coverage at all of the Z80 processor family or of either Masatoshi Shima or Federico Faggin, who left Intel after contributing to the design of the original 4004 and formed Zilog Inc., where they created the Z80 processor. The Z80 dominated the personal computing market for five years in the late 1970s and early 1980s, before the Intel 8088-based IBM PC displaced it.
Barely mentioned in the book are Gary Kildall and his company, Digital Research. Kildall's CP/M desktop operating system dominated personal computing for seven years, well before the idea of an operating system even occurred to Bill Gates in 1980. As innovators, he and his designers incorporated into their operating system many new features and capabilities that are still with us today. Was it the wrong kind of innovation? Their story would have told me a lot about the nature of innovation in all its forms.
While Isaacson attempts to provide a broad view of the history of modern personal computing and communications, the choices he made about whom to include, and in how much detail to cover them, sometimes seem haphazard and arbitrary.
After reading his acknowledgements at the end of the book, my first thought was that it was a case of too many cooks. He writes that he submitted many of the chapters of his first draft for review on a blogging site, where they were read by thousands and reviewed by hundreds. But now I think it was more a matter of trying to squeeze several book-length topics into a single volume: the history of an era of computing, the people who made it happen, and the differences between technical and product innovation.
Nevertheless, Isaacson peppers the book with insights into character and motivation that made me stop and think. Here are a few examples:
How Douglas Engelbart thought: "Most true geniuses have an instinct for simplicity," Isaacson writes. "Engelbart didn't. Desiring to cram a lot of functionality into any system he built, he wanted the mouse to have many buttons, perhaps up to ten. But to his disappointment, the testing determined that the optimum number a mouse should have was three. As it turned out, even that was at least one button too many."
The nature of the digital age: "The digital age may seem revolutionary, but it was based on expanding the ideas handed down from previous generations. The collaboration was not merely among contemporaries, but also between generations."
Why Intel was successful: "Throughout history the best leadership has come from teams that combined people with complementary styles," Isaacson writes. "Robert Noyce and Gordon Moore were both visionaries, which is why it was important that their first hire at Intel was Andy Grove, who knew how to impose crisp management procedures, force people to focus and get things done."
Overall, I think it is a successful book. For me, a book is a success when I read it not as a passive document but as a conversation between myself and the author, with notes, comments, and questions written in the margins, and much vigorous underlining and highlighting. That means the author has managed to engage me, even if only to argue with him, and all my scribbled responses guarantee I will remember much of the book’s content because it has made me think. How much more could an author — and a reader — ask?