"Following his blockbuster biography of Steve Jobs, The Innovators is Walter Isaacson's revealing story of the people who created the computer and the Internet. It is destined to be the standard history of the digital revolution and an indispensable guide to how innovation really happens. What were the talents that allowed certain inventors and entrepreneurs to turn their visionary ideas into disruptive realities? What led to their creative leaps? Why did some succeed and others fail? In his masterly saga, Isaacson begins with Ada Lovelace, Lord Byron's daughter, who pioneered computer programming in the 1840s. He explores the fascinating personalities that created our current digital revolution, such as Vannevar Bush, Alan Turing, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee, and Larry Page. This is the story of how their minds worked and what made them so inventive. It's also a narrative of how their ability to collaborate and master the art of teamwork made them even more creative. For an era that seeks to foster innovation, creativity, and teamwork, The Innovators shows how they happen"--
It's a general-audience history of computers and networking, and it serves well in that role. But it's also an insightful look into the role of collaboration in technical (and non-technical) advancement. Isaacson spends a good deal of time examining how innovation happens largely as a function of such collaboration: between individuals, between teams, within large groups, and between humans and the technology they use.
Isaacson also offers some very useful observations and opinions on the roles of, and credit given to, the people and teams behind many of these advancements. He caused me to revise some long-held opinions about the importance of several people's and teams' contributions.
An excellent history of the information age up to (almost) the present. In a readable style, with plenty of background about the important figures and a few amusing anecdotes, the author traces the origins of the computer and information technology back to the nineteenth century with the work done by Charles Babbage and Ada Lovelace. Skipping forward, however, most of the book takes place between 1940 and 2000, as the inventions and innovations of the digital age, including the computer, the microprocessor, the transistor, programming languages, software, and later search engines, came about. A good overview of how we got to where we are today and a much better read than an information technology textbook!
Besides showing how we got to the Digital Revolution, what makes The Innovators so significant is its main premise: that as one thing leads to another, the spark of creativity comes from the interplay of ideas among those willing to share and collaborate. Though the book is longer than what I usually read, I thoroughly enjoyed getting to know all these creative and inventive folks. It was time well spent.
Isaacson starts out with Lord Byron's daughter, Ada Lovelace. That's right -- in the age of the Romantics, nearly two centuries ago! She's generally credited with anticipating the computer revolution, as she envisioned a computing device based upon Charles Babbage's Analytical Engine. Her writings on this "engine" contain what appears to be the first algorithm designed to be carried out by a machine, and as a result she's often credited as the world's first computer programmer. Isn't that fascinating?
The book tracks the progression of computing from the 19th century into the 20th and on toward the 21st. Up come Alan Turing; the ENIAC computer, which employed the first real programmers in history -- all of them women!; the invention of the transistor and the microchip; Ethernet; and all of the wonderful inventions at Xerox PARC, including the graphical user interface (GUI), which did away with the command-line prompt, along with networked computing and the refinement of Engelbart's mouse -- much of which was essentially stolen by Steve Jobs for the creation of the Mac. Of course, Gates then stole from him, and Jobs was beside himself at the audacity. Ah, karma.
The book also introduces Gordon Moore, originator of Moore's Law, which observes that the number of transistors on a chip doubles roughly every two years (often popularized as computing power doubling every 18 months). In addition, the author hits on Grace Hopper, Andy Grove, William Shockley, Gates, Jobs, Woz, Tim Berners-Lee, the inventor of the World Wide Web, Linus Torvalds, the creator of Linux, and the people who started Google. It's an inspiring lineup of inventors and -- key word here -- collaborators. The author believes strongly that collaboration was the key to computing's development, and he might be right. He provides plenty of examples of people toiling away by themselves, only to be forgotten by history for missing the boat on what would have been a great product.
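To put that doubling in perspective, here is a back-of-the-envelope sketch of what Moore's Law implies. The starting figure (2,300 transistors on the 1971 Intel 4004) is a commonly cited number used purely for illustration; the simple doubling model is my own assumption, not anything from the book:

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Projected transistor count under a simple doubling-every-two-years model."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Each decade multiplies the count by 2**5 = 32 under this model.
for year in (1971, 1981, 1991, 2001):
    print(year, round(transistors(year)))
```

Thirty years of doubling turns 2,300 transistors into roughly 75 million, which is why a law that sounds modest on paper reshaped the industry.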
The reviews of this book are pretty good. However, I recently read one stunning review claiming this was the worst history its author had ever read and that the biographies are mediocre. He even criticized Isaacson's treatment of Ada as insufficient. I thought he did her justice -- I've never even seen her mentioned anywhere else before, and he spends a lot of time on her here. That reviewer was on acid, and I let him know what I thought of his lousy review. If you're remotely interested in how PCs came to be, how the Internet was created and evolved, and so on, this is definitely a book for you. Recommended.
This is not a general history of digital computing. Isaacson focuses on the technologies that brought us access to personal computing and global interconnectedness through the Internet and does this through the individuals he believes contributed most to these developments. This means that the whole area of mainframe computing is ignored and IBM, arguably the dominant engine of digital computer technology in the 1960s, '70s and '80s, barely gets a mention.
As always, Isaacson writes with enthusiasm and verve, wearing his research and sources lightly. An easy and informative introduction to the history of personal digital computing and networking.
Isaacson has become the go-to biographer of the modern world, with biographies of Steve Jobs, Einstein, and Kissinger, and now a “biography” of the digital revolution. Isaacson’s comfortable reading style sets us out on the path of history, showcasing not individuals but the groups that helped create the world as we know it. He chooses to focus on the hackers, the inventors, and the entrepreneurs who helped create the world we know today, beginning with greats such as Ada Lovelace -- the first-ever computer programmer -- Alan Turing, Grace Hopper, John Mauchly, and more. He describes the ideas that inspired them and generations to come. This book should be required reading for anyone in tech, business, economics, history, education... well, pretty much anything, to be honest. There’s a little bit of something everyone can take from this book. I give it four out of five stars.
For those of us who lived through the birth of the Internet era, principally the 65-plus generation, this is a must-read book. Isaacson, in my view, has conceived and written an excellent historical overview of the development of the computer, from initial concept up to and including today’s public usage. The scope is grand, stretching from the concepts of Ada Lovelace and Charles Babbage in the 1840s through to the present advances of today’s search engines. In short: how did we get to where we are today?
You do not need to be an engineering major to enjoy this tale. It is a tale of great interest to me, however, as a solid-state physics major/electrical engineer of the early sixties and as an aerospace analyst/programmer. This book encapsulates the main story line of the recent history of our computer-driven technological economy. Much of what happened was behind the scenes, even for most of us technologists of the Cold War era. We were all mostly aware of Shockley and his team at Bell Labs inventing the solid-state transistor in 1947, and of the groundbreaking applications at Fairchild Semiconductor and Texas Instruments. But most of us were unaware of the patent-office disputes concerning microchips, and of the silicon-oxide coating and etching saga, which Isaacson investigates and records well.
Isaacson seeks to determine what drives innovation and postulates that cooperation and discussion lie at the heart of it. However, as Newton once said (and as co-opted by Einstein), “If I have seen further, it is by standing on the shoulders of giants.” In short, I believe therein lies the essence of innovation.
Software is given worthy coverage, and the development of machine-to-machine computer communication and other software advances are covered with equal deftness. Personal computers needed operating systems, Internet packet-switching technology, and the advances in communications across the World Wide Web. Personal devices would not be as useful as they are today without the decades of technical development that are now taken for granted. All of this was at the heart of our technological progress, which has advanced rapidly over the most recent 25 years. The developments leading to the Google search engine and Wikipedia are also well surveyed.
What next, one might well ask: where will all the innovation of the next 25 years lead us? The replacement of once solely human efforts by computers is only in its infancy. Only fools make projections, but you have to respect George Orwell’s. Watch out for Big Brother!
Chronology – as captured from Isaacson’s The Innovators (and enhanced)
1843 Analytical Engine, a mechanical computing device – Charles Babbage
1843 Concept of stored program instructions – Ada Lovelace
(1700–1850) Punched-tape controls developed for weaving looms
1847 Boolean algebra – George Boole
1890 Hollerith punch-card readers
1931 Incompleteness theorems – Kurt Gödel
1931 Analog computers (differential analyzers) – Vannevar Bush
1935 Vacuum-tube switching circuits – Tommy Flowers
1937 Universal computing machine defined (“On Computable Numbers”) – Alan Turing
1937 Boolean logic applied to switching circuits – Claude Shannon
1939 Bletchley Park code-breaking Bombe (building on the Polish bomba) – Alan Turing
1941 Electromechanical digital computer (Z3) – Konrad Zuse
1943 Colossus operational at Bletchley – Tommy Flowers
1943 ENIAC initiated at Penn – John Mauchly & Presper Eckert
1944 Harvard Mark I in operation – Howard Aiken
1945 Postwar funding for academic & industrial research proposed – Vannevar Bush
1945 ENIAC fully operational – Mauchly & Eckert team
1947 Transistor invented at Bell Labs – Shockley’s team (Bardeen & Brattain)
1950 Test for machine intelligence proposed – Alan Turing
1952 First compiler developed – Grace Hopper
1954 Texas Instruments transistors power the first transistor radios
1956 Shockley Semiconductor founded – William Shockley
1957 Fairchild Semiconductor formed – Robert Noyce & Gordon Moore
1957 Soviet Union launches Sputnik
1958 First integrated-circuit microchip demonstrated – Jack Kilby
1958 ARPA established
1959 Planar process for printable microchips – Fairchild Semiconductor
1960 Man-computer symbiosis described – J.C.R. Licklider
1960 Packet switching conceived – Paul Baran at RAND
1959 Soviet probe Luna 3 photographs the far side of the Moon for the first time
1961 US commits to a man on the Moon within the decade
1963 Computer networking proposed – Licklider
1963 Computer mouse invented – Doug Engelbart at SRI
1965 Hypertext concept proposed – Ted Nelson
1965 Moore’s Law postulated – Gordon Moore
1966 Packet-switched networking developed and named – Donald Davies
1967 ARPANET conceptualized and funding proposed – Larry Roberts
1967 Mike Hodges transferred to the USA to advance Gemini computer technology
1968 Intel formed – Noyce, Moore & Andy Grove
1969 First ARPANET transmission (UCLA to SRI)
1969 Neil Armstrong becomes the first man to walk on the Moon
1971 Intel 4004 microprocessor released
1971 Email invented – Ray Tomlinson
1972 Atari founded; Pong released – Nolan Bushnell
1972 Intel releases the 8008 microprocessor
1973 Alto personal computer created at Xerox PARC – Alan Kay
1973 Ethernet developed at Xerox PARC – Bob Metcalfe
1973 Internet TCP/IP protocols developed – Vint Cerf & Bob Kahn
1975 Altair 8800 personal computer released by MITS
1975 BASIC for the Altair coded – Bill Gates & Paul Allen
1976 Apple I launched – Steve Wozniak & Steve Jobs
1977 Apple II released
1978 Internet bulletin boards excite home-computer enthusiasts
1980 IBM commissions Microsoft to develop an OS for the IBM PC
1981 Hayes modems released for home PCs
1983 Microsoft announces Windows (version 1.0 shipped in 1985)
1983 GNU free operating system initiated – Richard Stallman
1984 Apple releases the Macintosh
1985 AOL established as an email/news service provider
1991 Linux kernel released – Linus Torvalds
1991 World Wide Web (WWW) released – Tim Berners-Lee
1993 Mosaic browser released – Marc Andreessen
1995 WikiWikiWeb goes online – Ward Cunningham
1997 IBM’s Deep Blue beats Garry Kasparov
1998 Google search engine launched – Larry Page & Sergey Brin
2001 Wikipedia launched – Jimmy Wales & Larry Sanger
This story of how we got from Babbage's Analytical Engine, which could slowly grind out numbers, to IBM's Watson, who could win a Jeopardy match, makes a pitch for collaboration within teams and symbiosis with the machine. For Isaacson, Watson could never be a "who" but he does discuss various aspects of AI.
Isaacson doesn't overwhelm the non-technical reader with technical terms but tells us enough to appreciate the insights that pushed the concept imagined in 1843 by Ada, Countess of Lovelace and daughter of George, Lord Byron: "to bring together things, facts, ideas, conceptions in new, original, endless, ever-varying combinations."
The book did drag a bit in places (though that may have been because I was listening to the audiobook, which is quite long), and elsewhere I would have liked more detail about some of the individual stories. I would also have liked more coverage of what's next in the digital revolution. But as a comprehensive history of the digital revolution thus far, the book works well.
If you ever programmed with punch cards, or learned assembler, or want to know more about ARPANET and Linux... or just want to wax nostalgic, this is a great book.
The story starts with the earliest computer, the Analytical Engine conceived by Charles Babbage, on which he worked with Byron’s daughter Ada Lovelace. It was a purely mechanical device, designed at the very limits of engineering capability at the time. It took another century for the next computers to surface. A man called Vannevar Bush was instrumental in developing a differential analyser for generating firing tables, followed in World War 2 by the machines at Bletchley Park used for attacking Nazi codes, from the Bombe against Enigma to Colossus against the Lorenz ciphers. These new room-sized contraptions used vacuum-tube valves, consumed vast amounts of energy, and took large numbers of people to maintain and use.
For computers to reach the point where you could fit more than one in a room, the technology needed to be miniaturised. The team in America that achieved this, using the semiconducting properties of silicon, would earn themselves a Nobel Prize. This was the point where the modern computer age started, especially once it was realised that a single piece of silicon could hold a variety of components, and therefore entire circuits. These new microchips were initially all taken by the US military for weapons, but as the price of manufacture fell, numerous commercial applications could be realised.
Some of the first microchip products the general public saw were calculators, but as engineers started to use their imaginations, almost anything became possible. The coming years saw the development of the first video games, personal computers that could fit on a desk, and the birth of the internet. Most of these innovations came out of one place in California that we now know as Silicon Valley. It formed a new way of working too, with unlikely collaborations, spin-offs, and the beginnings of software and hardware companies that have since become household names.
It didn’t take too long for people to start wanting to hook computers together. The original ARPANET was a military network, but it soon had links to academia, and not long after that the geeks found it. It was still a niche way of communicating until Tim Berners-Lee invented the World Wide Web with hypertext linking, and the world was never the same again.
Isaacson has written a reasonable book on the history of computing and the internet, and on the significant characters who discovered or made things, or who just happened to be in the right place at the right time. He covers all manner of noteworthy events right up to the present day. Written from a mostly American-centric point of view, it feels like a book celebrating America’s major achievements in computing. Whilst America has had a major part to play, it has not had the stage entirely to itself; there is a brief sojourn to Finland for Linux and to CERN for Berners-Lee, but very little mention of other European contributions.
There are some flaws, though. He doesn’t mention the dark net or any of the less salubrious activities that happen online; ignoring them doesn’t make them go away. There is very little mention of mobile technology either. It was a book worth reading, though, as he shows that some of the best innovations have come from unlikely collaborations, from those who don’t follow the herd, and from those whose quirky personalities and ways of seeing the world bring forth products we never knew we needed.