The Innovators : How a Group of Inventors, Hackers, Geniuses, and Geeks Created the Digital Revolution

by Walter Isaacson

Hardcover, 2014

New York : Simon & Schuster, 2014


"Following his blockbuster biography of Steve Jobs, The Innovators is Walter Isaacson's revealing story of the people who created the computer and the Internet. It is destined to be the standard history of the digital revolution and an indispensable guide to how innovation really happens. What were the talents that allowed certain inventors and entrepreneurs to turn their visionary ideas into disruptive realities? What led to their creative leaps? Why did some succeed and others fail? In his masterly saga, Isaacson begins with Ada Lovelace, Lord Byron's daughter, who pioneered computer programming in the 1840s. He explores the fascinating personalities that created our current digital revolution, such as Vannevar Bush, Alan Turing, John von Neumann, J.C.R. Licklider, Doug Engelbart, Robert Noyce, Bill Gates, Steve Wozniak, Steve Jobs, Tim Berners-Lee, and Larry Page. This is the story of how their minds worked and what made them so inventive. It's also a narrative of how their ability to collaborate and master the art of teamwork made them even more creative. For an era that seeks to foster innovation, creativity, and teamwork, The Innovators shows how they happen."

Media reviews

... even at its most rushed, the book evinces a genuine affection for its subjects that makes it tough to resist. Isaacson confesses early on that he was once “an electronics geek who loved Heathkits and ham radios,” and that background seems to have given him keen insight into how youthful passion transforms into professional obsession. His book is thus most memorable not for its intricate accounts of astounding breakthroughs and the business dramas that followed, but rather for the quieter moments in which we realize that the most primal drive for innovators is a need to feel childlike joy.

User reviews

LibraryThing member brianinbuffalo
Make no mistake, this is not a book for readers only casually interested in the Digital Revolution. I came to this realization when, after wading through the first third of the book, the narrative was barely into the mid-1970s. As I've commented about several "comprehensive" tomes on a variety of subjects, I believe the author could have made this admittedly informative book more accessible — and a bit more enjoyable — if he had trimmed many details that simply aren't needed for a good grasp of digital innovations. I also felt the book lacked cohesion in spots. I had to give myself a one-month break before returning to "The Innovators." That said, I learned a lot about this complex and ever-changing topic. I ended the book a bit weary, but glad I made the investment of time.
LibraryThing member rivkat
A history of the people who made computers and the internet what they are today. Ada Lovelace and Grace Hopper each get substantial attention. The main message Isaacson wants you to take away is that creativity and innovation are iterative; most advances, even if they have one very smart and imaginative person at their center, spread and succeed because groups of people coalesce (or already exist) to take advantage of them and to tweak them so that they work better. Various forms of digital computing, for example, were independently invented, but several simply died for want of support. Relatedly, innovation requires a time, a place, and social resources sufficient to support it—Bill Gates had the computers he played with growing up; Ada Lovelace could see the future, but she didn't have the resources to build it.
LibraryThing member Stbalbach
The Innovators is an excellent survey of the people and technology behind the invention of the computer and networking. I thought I knew a lot, but there was plenty here I didn't know. Each topic is by necessity brief, yet there is so much to include (and much left out) that the book is still long. I think Isaacson succeeded in giving a broad overview of the key people and technologies leading to the present; it's a valuable perspective and jumping-off point.
LibraryThing member Keith.G.Richie
It's certainly no surprise when speaking of a Walter Isaacson work - this is one of the best books of the year, and one I highly recommend.

It's a general-audience history of computers and networking, and it serves well in that role. But it's also an insightful look into the role of collaboration in technical (and non-technical) advancement. Isaacson spends a good deal of time looking at how innovation happens largely as a function of such collaboration - between individuals, between teams of individuals, in large groups, and between humans and the technology they use.

Isaacson also has some very useful observations and opinions on the roles of, and credit given to, the people and teams behind many of these advancements. He caused me to revise some long-held opinions about the importance of several contributors.
LibraryThing member fpagan
Competent accounts of the contributions of people so numerous that I can only hint at them by mentioning Ada Lovelace, Alan Turing, John von Neumann, Bill Gates, Steve Jobs, Tim Berners-Lee, Jimmy Wales, and Larry Page. Glaringly missing from the evaluative digressions and summations that are included is any mention of how the whole digital revolution has gone so tragically wrong in the relentless trashing of everyone's privacy by private-sector info-abusers and government-agency surveillers. In the wake of the revelations by Edward Snowden and many others, this omission seems inexcusable.
LibraryThing member wagner.sarah35
*I received this book through GoodReads First Reads.*

An excellent history of the information age up to (almost) the present. In a readable style, with plenty of background about the important figures and a few amusing anecdotes, the author traces the origins of the computer and information technology back to the nineteenth century with the work done by Charles Babbage and Ada Lovelace. Skipping forward, however, most of the book takes place between 1940 and 2000, as the inventions and innovations of the digital age, including the computer, the microprocessor, the transistor, programming languages, software, and later search engines, came about. A good overview of how we got to where we are today and a much better read than an information technology textbook!
LibraryThing member Kathy_Dyer
Having seen the term "serial biography" used to describe Walter Isaacson's latest book, The Innovators, I was intrigued by the idea. The multitude of stories Isaacson tells to relay the "birth" of the digital age - from Ada Lovelace, through the computer and its assorted pieces and parts, through the birth of Silicon Valley (both hardware and software), and even Al Gore's role in the Internet - is enlightening. While the history may seem a bit slow in the beginning, one statement - "Innovation requires articulation" - stuck with me as I read. Isaacson did a good job of simplifying the science and connecting the topics to show the progression of technology from Lovelace's "poetry in science" to tweeting today. As Isaacson summarizes the "lessons from the journey," he starts with, "First and foremost is that creativity is a collaborative process. Innovation comes from teams more often than from the lightbulb moments of lone geniuses." Although the serial biographies are stories of individuals, these individuals formed creative teams that have led us to a place where technology is enmeshed in our daily lives. If you have any interest in the evolution of computer technology and the people involved, this book will be one you find informative.
LibraryThing member PaperDollLady
Innovators is about men and their machines, but it all starts with a woman--Ada Lovelace. Living in the 19th century, no less, she's credited with the concept of a programming loop, expressed in the notes Ada (a mathematics whiz) added when translating a French article about her mentor Charles Babbage and his proposed Analytical Engine. From Babbage's early designs for machines he called "engines" up to Google's search engines, this book is an engaging narrative history. Walter Isaacson--known for his biographies--includes all the major players: some famous and familiar names, like Steve Jobs and Bill Gates, plus other contributors, like John Atanasoff or Donald Davies, who aren't well known. For technology novices like me, the opening pages have a well-annotated photo timeline. Both there and throughout, the women are also remembered, including Grace Hopper (whose work led to COBOL) and the six women programmers who worked on the ENIAC computer at the University of Pennsylvania. This is no dry account, because the author includes many biographical snippets of the pioneering players that help liven up the story. I found the confrontation between Jobs and Gates over who ripped off whom particularly amusing. It's loaded with all sorts of bits and bytes like that.

Besides showing how we got to the Digital Revolution, what makes Innovators so significant is its main premise: that as one thing leads to another, the spark of creativity comes from the interplay of ideas among those willing to share and collaborate. Though the book is longer than what I usually read, I thoroughly enjoyed getting to know all those creative and inventive folks. Surely it was time well spent.
LibraryThing member scottcholstad
This book was a fascinating and entertaining history of the progression of the computer and related things, such as the Internet. I learned a lot and I'm glad I did.

Isaacson starts out with Lord Byron's daughter, Ada Lovelace. That's right -- in the age of the Romantics! She's generally credited with starting the computer revolution: she envisioned a general-purpose computing device based upon Charles Babbage's Analytical Engine, and her writings on this "engine" contain what appears to be the first algorithm designed to be carried out by a machine. As a result, she's often credited with being the world's first computer programmer. Isn't that fascinating?
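As a playful aside: the algorithm in Lovelace's famous Note G was a procedure for computing Bernoulli numbers on the Analytical Engine. This sketch (the function name and the modern recurrence are my own illustration, not anything from the book or from her notes) shows how small that target sequence looks in today's terms:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0..B_n (convention B_1 = -1/2),
    using the standard recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))  # solve the recurrence for B_m
    return B

print(bernoulli(8))  # exact rationals: 1, -1/2, 1/6, 0, -1/30, ...
```

What took a few lines here required, in 1843, a meticulously worked table of engine operations -- which is exactly why her notes read as the first published program.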

The book tracks the progression of computing from the 19th century into the 20th and then the 21st. Up come Alan Turing; the ENIAC computer, which employed the first real programmers in history -- all of them women!; the invention of the transistor and the microchip; Ethernet; and the wonderful inventions at Xerox PARC, where they developed the graphical user interface (GUI) that did away with the command-line prompt, along with the mouse and networking -- all of which was essentially stolen by Steve Jobs for the creation of the Mac. Of course, Gates then stole from him, and Jobs was beside himself at the audacity. Ah, karma.

The book also introduces Gordon Moore, originator of Moore's Law, which states that the number of components on a chip will double roughly every two years (often paraphrased as computing power doubling every 18 months). In addition, the author covers Grace Hopper, Andy Grove, William Shockley, Gates, Jobs, Woz, Tim Berners-Lee (inventor of the World Wide Web), Linus Torvalds (creator of Linux), and the people who started Google. It's an inspiring lineup of inventors and -- key word here -- collaborators. The author believes strongly that collaboration was the key to computing's development, and he might be right. He provides plenty of examples of people toiling away by themselves, only to be forgotten by history for missing the boat on what would have been a great product.
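As a back-of-the-envelope illustration (the function and the starting figure are my own sketch, not the book's), the compounding behind Moore's Law fits in one line of Python:

```python
def moores_law(start_count, years, doubling_years=2.0):
    """Project a component count forward in time, assuming a doubling
    every `doubling_years` years (Moore's revised 1975 estimate)."""
    return start_count * 2 ** (years / doubling_years)

# The Intel 4004 (1971) had roughly 2,300 transistors. Projecting
# 30 years forward at a two-year doubling is 2^15 growth:
print(round(moores_law(2300, 30)))  # about 75 million
```

Fifteen doublings in thirty years is a factor of 32,768 -- which is why the law, while it held, dwarfed every other engineering trend in the book.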

The reviews of this book are pretty good. However, I recently read a stunning one claiming this was the worst history the reviewer had ever read and that the biographies are mediocre. He even criticized the author's treatment of Ada as insufficient. I thought Isaacson did her justice -- I've never seen her mentioned anywhere else, and he spends a lot of time on her here -- and I let that reviewer know what I thought of his lousy review. If you're remotely interested in how PCs came to be, or how the Internet was created and evolved, this is definitely a book for you. Recommended.
LibraryThing member knightlight777
An interesting history of the fascinating story of our digital age, from its origins in ideas about thinking machines to the Web we take for granted today. Isaacson's central theme is that it takes a wide collaborative effort to push the boundaries and deliver the complex tools and services of the electronic age. This conclusion seems obvious but is no less significant. He also shows how some took the available science and ingeniously used it to put forth products and services that became wildly successful, while others who stayed on the academic side of the breakthroughs became footnotes to this history.
LibraryThing member MrDickie
Author Walter Isaacson appeared on the weekend C-SPAN book program talking about this book. Afterward I placed a hold on it and waited several weeks for a copy to become available at the county library. I made my living in computer science for many years, and I found the book fascinating. There was a lot of computer history I wasn't aware of, even though part of the time I was actively involved in what was going on. If you are interested in what brought about the Digital Revolution, I highly recommend "The Innovators."
LibraryThing member pierthinker
Isaacson has done more than most to bring biographical and historical rigour to the modern technological world (see his books on Steve Jobs and Albert Einstein, plus many other articles and comments). Here he attempts to describe how our digitally-focused world came to be through the invention and innovation of the digital computer and electronic networks. He selects 10 key inventions/innovations and presents narrative histories of how they came to be, including potted biographies of key players. A key theme of the book is how all of these stories are intertwined to some greater or lesser degree; how inventions and innovations are always building on what went before. Clearly, this leads to controversy and argument over primacy and ownership, all of which Isaacson notes and makes central to his theme: invention is not enough and must be accompanied by execution, exploitation and application to be true innovation and have global impact.

This is not a general history of digital computing. Isaacson focuses on the technologies that brought us access to personal computing and global interconnectedness through the Internet and does this through the individuals he believes contributed most to these developments. This means that the whole area of mainframe computing is ignored and IBM, arguably the dominant engine of digital computer technology in the 1960s, '70s and '80s, barely gets a mention.

As always, Isaacson writes with enthusiasm and verve, wearing his research and sources lightly. An easy and informative introduction to the history of personal digital computing and networking.
LibraryThing member ohernaes
Stopped listening to the audiobook; it was badly read.
LibraryThing member zzshupinga
ARC provided by NetGalley

Isaacson has become the go-to biographer of the modern world, with biographies of Steve Jobs, Einstein, Kissinger, and now a "biography" of the digital revolution. His comfortable style sets us out on the path of history, showcasing not the individual but the groups that helped create the world as we know it. Isaacson chooses to focus on the hackers, the inventors, the entrepreneurs, beginning with greats such as Ada Lovelace--the first computer programmer--Alan Turing, Grace Hopper, John Mauchly, and more. He describes the ideas that inspired them and generations to come. This book should be required reading for anyone in tech, business, economics, history, education...well, pretty much anything, to be honest. There's a little something everyone can take from this book. I give it four out of five stars.
LibraryThing member dickmanikowski
Fairly comprehensive history of the technology which led to the development of computers and the Internet.
LibraryThing member MichaelHodges
The Innovators by Walter Isaacson (My Review 16 Jan 2015)
For those of us who lived through the birth of the internet era, principally the 65-plus generation, this is a must-read book. Isaacson, in my view, has conceived and written an excellent historical overview of the development of the computer from initial concept up to and including today's public usage. The scope is grand, stretching from the concepts of Ada Lovelace and Charles Babbage in the 1840s through to the present advances of today's search engines. In short: how did we get to where we are today?
You do not need to be an engineering major to enjoy this tale. It is of particular interest to me, however, as a solid-state physics major and electrical engineer of the early sixties, and as an aerospace analyst/programmer. This book encapsulates the main story line of the recent history of our computer-driven technological economy. Much of what happened was behind the scenes for most of us technologists of the Cold War era. We were mostly aware of Shockley and his team at Bell Labs inventing the solid-state transistor in 1947, and of the groundbreaking applications at Fairchild Semiconductor and Texas Instruments. But most of us were unaware of the Patent Office gaffes concerning microchips, and of the oxide coating and etching saga, which Isaacson investigates and records well.
Isaacson seeks to determine what drives innovation and postulates that cooperation and discussion lie at its heart. However, as Newton once said (and as Einstein co-opted), "If I have seen further it is by standing on the shoulders of giants." In short, I believe therein lies the essence of innovation.
Software is given worthy coverage, and machine-to-machine computer communication and other software advances are covered with equal deftness. Personal computers needed operating systems, Internet packet-switching technology, and the advances in communication across the World Wide Web. Personal devices would not be as useful as they are today without the decades of technical development now taken for granted. All of this lay at the heart of the technological progress that has advanced so rapidly over the most recent 25 years. The developments leading to the Google search engine and Wikipedia are also well surveyed.
What next, one might well ask: where will the innovation of the next 25 years lead us? The replacement by computers of once solely human efforts is only in its infancy. Only fools make projections, but you have to respect George Orwell's. Watch out for Big Brother!

Chronology - as captured from Isaacson's "The Innovators" & enhanced
1843 Analytical Engine designs - Charles Babbage
1843 Concept of stored program instructions - Ada Lovelace
(1700-1850) Punched-tape controls for weaving machines developed
1847 Boolean algebra - George Boole
1890 Hollerith punch-card tabulators
1931 Analog computers (Differential Analyzer) - Vannevar Bush
1931 Incompleteness theorems published - Kurt Gödel
1935 Vacuum-tube switching circuits - Tommy Flowers
1937 Definition of a universal stored-program machine ("On Computable Numbers") - Alan Turing
1937 Boolean analysis of switching circuits - Claude Shannon (information theory & entropy followed in 1948)
1939 Bletchley Park code-breaking machines (building on the Polish bomba) - Alan Turing
1941 Electromechanical digital computer (Z3) - Konrad Zuse
1943 Colossus operational at Bletchley - Tommy Flowers
1944 ENIAC under construction at Penn - John Mauchly & J. Presper Eckert, joined by John von Neumann
1944 Harvard Mark I in operation - Howard Aiken
1945 Postwar funding for academic & industrial research proposed - Vannevar Bush
1945 ENIAC fully operational - Mauchly & Eckert
1947 Transistor invented at Bell Labs - Shockley's team
1950 Test for machine intelligence proposed - Alan Turing
1952 First compiler developed - Grace Hopper
1954 Texas Instruments markets transistor radios
1956 Shockley Semiconductor founded - William Shockley
1957 Fairchild Semiconductor formed - Robert Noyce & Gordon Moore
1957 Soviet Union launches Sputnik
1958 First integrated-circuit microchip produced - Jack Kilby
1958 ARPA established
1959 Printable (planar) microchips invented at Fairchild Semiconductor
1960 Man-computer symbiosis described - J.C.R. Licklider
1960 Telecom packet switching - Paul Baran at RAND
1961 US plans man on Moon within the decade
1963 Computer networking proposed - Licklider
1963 Computer mouse & graphic displays pioneered - Douglas Engelbart at SRI
1965 Hypertext proposed - Ted Nelson
1965 Moore's Law postulated
1966 Packet switching independently developed - Donald Davies
1967 ARPANET conceptualized and funding proposed - Larry Roberts
1967 Mike Hodges transferred to USA to advance GEMINI Computer technology
1967 Soviet space probes photograph the far side of the Moon
1968 Intel formed - Robert Noyce, Gordon Moore & Andy Grove
1969 ARPANET transmissions initiated on the West Coast
1969 First man walks on the Moon - Neil Armstrong
1971 Intel 4004 microprocessor in operation
1971 Email invented - Ray Tomlinson
1972 Atari founded; Pong released - Nolan Bushnell
1972 Intel releases the 8008 microprocessor
1973 Alto personal computer created at Xerox PARC - Alan Kay
1973 Ethernet developed at Xerox PARC - Bob Metcalfe
1973 Internet TCP/IP protocols developed - Vint Cerf & Bob Kahn
1975 Altair personal computer released by MITS
1975 BASIC for the Altair coded - Bill Gates & Paul Allen
1976 Apple I launched - Steve Wozniak & Steve Jobs
1977 Apple II released
1978 Internet bulletin boards excite home computer enthusiasts
1980 IBM commissions Microsoft to develop an OS for the IBM PC
1981 Hayes modems released for home PCs
1983 Microsoft announces Windows (first released 1985)
1983 Free GNU OS initiated - Richard Stallman
1984 Apple releases the Macintosh
1985 AOL established as an email/news-service provider
1991 Linux OS released - Linus Torvalds
1991 World Wide Web released - Tim Berners-Lee
1993 Mosaic browser released - Marc Andreessen
1995 WikiWikiWeb goes online - Ward Cunningham
1997 IBM's Deep Blue beats Garry Kasparov
1998 Google search engine launched - Larry Page & Sergey Brin
2001 Wikipedia launched - Jimmy Wales & Larry Sanger
LibraryThing member neddludd
After the success of his biography of Steve Jobs, Isaacson returns with a fascinating study of the process by which technology evolves--and, in the case of computer technology, of the many previously unknown individuals responsible. For example, the first written description of a computer appeared in the early 1840s, from a fascinating woman named Ada Lovelace (whose father was Lord Byron). It took about a century--with many other contributions from fields such as physics, mathematics, materials science, and electrical engineering--to produce the first functional electronic computer. The author stresses that new inventions rarely pop up via a Eureka moment; there must be a set of conditions that make a particular time and place fertile for innovation. Isaacson also includes individuals who, because they lacked some key element, worked in isolation and, no matter how brilliant their insights, were relegated to the margins of history. The book has one major fault to my taste: the mini-biographies the author provides for dozens of previously "invisible" contributors. This device becomes repetitive and worthy of skimming. The author's essential point is worth remembering, but the book is filled with overkill and could have used an energetic editor.
LibraryThing member Katyefk
Very well written book on the development of computers and all the amazing people and teams that created our amazing tech world. Recommended at Diamond Heart Retreat March 2015.
LibraryThing member Jeannine504
Five stars because I really, really liked it.
This story of how we got from Babbage's Analytical Engine, which could slowly grind out numbers, to IBM's Watson, who could win a Jeopardy match, makes a pitch for collaboration within teams and symbiosis with the machine. For Isaacson, Watson could never be a "who" but he does discuss various aspects of AI.

Isaacson doesn't overwhelm the non-technical reader with technical terms but tells us enough to appreciate the insights that pushed the concept imagined in 1843 by Ada, Countess of Lovelace and daughter of George, Lord Byron: "to bring together things, facts, ideas, conceptions in new, original, endless, ever-varying combinations."
LibraryThing member porch_reader
This book covers a lot of ground - starting in the 19th century with Ada Lovelace and Charles Babbage, and extending to present-day entrepreneurs like Gates and Jobs. With this comprehensive look at the digital revolution, Isaacson is able to identify trends that led to technological advancements. He particularly focuses on the importance of collaboration, rather than the lone genius, in creating and commercializing innovations. He also discusses the role of human-computer interaction, noting that the next big innovation may not involve robots that act like humans, but robot-human collaborations.

The book did drag a bit in places (although this may have been because I was listening to the audiobook, which is quite long), and in other places I would have liked more detail about some of the individual stories. I would also have liked more coverage of what's next in the digital revolution. But as a comprehensive history of the digital revolution thus far, this book works well.
LibraryThing member PhilipJHunt
I was born just a month after the first transistor was demonstrated. How the world has changed since then! This very readable book tells the story of the digital revolution and the often weird and quirky characters who produced it -- or, in many cases, just missed it. And there is a woman on the book's cover for a reason.
LibraryThing member Razinha
Isaacson does a tremendous amount of research, and this book is no exception. Fantastic, very readable history. I knew a considerable amount of it already from other books, but Isaacson weaves an expansive story with appropriate breadth and depth.

If you ever programmed with punch cards, or learned assembler, or want to know more about ARPANET and Linux...or want to wax nostalgic, this is a great book.
LibraryThing member Brumby18
Exceptional - I always felt that I had missed something in the internet/WWW thingummy. This gave me some perspective. In conjunction with "The Long Tail" (Chris Anderson), I now get it - and how I missed it.
LibraryThing member PDCRead
Almost everything we do these days has some link to the world wide web, or involves interacting with some sort of computer, but how did these things become so pervasive and essential? In this book Isaacson writes about the people who made the companies that made the products we all now use.

It starts with the earliest design for a computer, the Analytical Engine conceived by Charles Babbage, who worked on it alongside Byron's daughter Ada Lovelace. It was a purely mechanical device, designed at the very limits of the engineering capability of the time. It took another century for the next computers to surface. A man called Vannevar Bush was instrumental in developing a differential analyser for generating firing tables, followed in World War 2 by the Colossus at Bletchley, used for attacking Nazi codes. These room-sized contraptions used vacuum-tube valves, consumed vast amounts of energy, and took large numbers of people to maintain and operate.

For computers to reach the point where you could fit more than one in a room, the technology needed to be miniaturised. The team in America that achieved this, using the semiconducting properties of silicon, would earn themselves a Nobel Prize. This was the point where the modern computer age started, especially once it was realised that a variety of components, and therefore circuits, could be placed on a single piece of silicon. These new microchips were initially all taken by the US military for weapons, but as the price of manufacture fell, numerous commercial applications could be realised.

Some of the first microchip products the general public saw were calculators, but as engineers started to use their imaginations almost anything was possible. The coming years saw the development of the first video games, personal computers that could fit on a desk, and the birth of the internet. Most of these innovations came out of one place in California that we now know as Silicon Valley. It formed a new way of working too, with unlikely collaborations, spin-offs, and the beginnings of software and hardware companies that have since become household names.

It didn't take long for people to want to hook computers together. The original ARPANET was a military network, but it soon had links to academia, and not long after that the geeks found it. It was still a niche way of communicating until Tim Berners-Lee invented the World Wide Web with hypertext linking, and the world was never the same again.

Isaacson has written a reasonable book on the history of computing and the internet, and on the significant characters who discovered or made things, or who just happened to be in the right place at the right time. He covers all manner of noteworthy events right up to the present day. Written mostly from an American-centric point of view, it feels like a book celebrating America's major achievements in computing. Whilst America has had a major part to play, it has not had the stage entirely to itself; apart from a brief sojourn to Finland for Linux and to CERN for Berners-Lee, there is very little mention of other European contributions.

There are some flaws, though. He doesn't mention the dark net or any of the less salubrious activities that happen online; ignoring them doesn't make them go away. There is also very little mention of mobile technology. It was a book worth reading nonetheless, as he shows that some of the best innovations have come from unlikely collaborations, from those who don't follow the herd, and from those whose quirky personalities and ways of seeing the world bring forth products we never knew we needed.
… (more)


