From the invention of scripts and alphabets to the long misunderstood "talking drums" of Africa, James Gleick tells the story of information technologies that changed the very nature of human consciousness. He also provides portraits of the key figures contributing to the inexorable development of our modern understanding of information, including Charles Babbage, Ada Byron, Samuel Morse, Alan Turing, and Claude Shannon.
I was already familiar with a lot of the scientific and mathematical ideas here, so the parts I found most interesting involved the ways in which our worldviews and our ways of thinking about knowledge and information have changed over the centuries. I might almost have preferred more of an emphasis on that, actually, but I'm not really complaining. Overall this was well-written, thought-provoking, and definitely worth reading.
The book includes an early, slightly off-kilter discussion of the origins of information in early writing systems. Some conclusions here seem questionable. The genealogy of alphabet → information seemed incomplete and perhaps a little Eurocentric.
Some parts seem to overreach. On page 16, for instance, Homer, Virgil, and Aeschylus are cited as sources who recount the use of fire beacons in the Trojan War. The Homeric epics are no guide to 12th-century BCE Mediterranean technologies, since they are considered to contain numerous 8th-century BCE anachronisms. Certainly, Virgil and Aeschylus are even further removed. If the best source for this information was poetry, several centuries removed, then the example (which was not vital to the chapter) should probably have been left out. One could make too much of this, but a misstep occurring so early in the book made me wary (probably unnecessarily so) of other conclusions and summaries offered by the author.
The section on the information deluge was appropriate in its length and coverage; so much has been written about this, elsewhere and recently, that this section could easily have been overdone. I would have appreciated attention to Jaron Lanier, here, but this is a personal preference and not a necessity.
Gleick's use of Marshall McLuhan was interesting, in that McLuhan merely popped in frequently, rather than having his own spot in the book. This was disappointing, as McLuhan is an interesting character to include when discussing the social impact of information, new media, etc. But at 426 pages (plus the seemingly exhaustive index and the extensive notes and bibliography), the book probably could not bear any additional close character studies.
This book would be most useful to a reader who wants to learn about information theory in order to understand other, denser work, or to critique other popular information/Internet-related texts. Gleick excels at humanizing abstract concepts.
He should have distinguished between data and information. Information has to be both new and relevant to the receiver; otherwise it just creates information (actually data) overload. A general introduction to some of the paradoxical results of information remains to be written. One example: you have to know a piece of information to assess its value, but why would you pay for something you then already know? This paradox lies at the heart of the media's struggle, and inability, to make users pay for information.
A good but incomplete read that hides its incompleteness from its readers.
I have read and enjoyed James Gleick's previous books, and also enjoyed this volume, a description of the information theory developed by Claude Shannon. The book starts with a history of information storage and transmission, from the redundancy in African drumming, needed to overcome problems in transmission, to the printing press, which made information more stable and less subject to errors in copying. Shannon developed his theory while working for Bell Labs, in order to improve the transmission of phone calls. Along the way he describes early telegraph codes and the use of private codes to reduce telegraph keystrokes; the first phone directories; the publication of vast compendiums of calculated answers that are now accessible by computer; and the explosion of information in digital storage.
The main hero of the book is Claude Shannon, the engineer who first applied the concept of 'entropy' to information, and showed how this could be used for cryptography as well as for compression - but the book also shows how Samuel Morse had similar ideas when he invented his famous code, and how even the 'talking drums of Africa'¹ employed the same mechanisms of a small code-set complemented by redundancy.
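Shannon's central measure, entropy, quantifies the idea the reviewer describes: a message drawn from a small, predictable code-set carries few bits per symbol, which is exactly why redundancy can be added or squeezed out. A minimal sketch (not from the book; the function name and examples are my own):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average bits of information per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fully redundant message carries no information per symbol...
print(shannon_entropy("aaaaaaaa"))   # 0.0 bits/symbol
# ...while a message using four symbols equally often carries two bits each.
print(shannon_entropy("abcdabcd"))   # 2.0 bits/symbol
```

The drummers' trick, in these terms, is deliberately lowering the entropy per beat (adding redundant phrases) so the message survives a noisy channel.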
There is also a chapter dedicated to the (at the time) unsung duo of Babbage and Lovelace, where he shows how their work - while impractical at the time - also presaged the same general development.
Another important subject is formal systems and their inevitable incompleteness. Here he echoes Douglas Hofstadter's Gödel, Escher, Bach, but also another book that deserves mentioning: Tor Nørretranders' Mærk verden (rendered in English as The User Illusion).
The book concludes with a chapter on 'information overdose', the case when too much information actually impedes decision-making. Very interesting, as I was listening to Malcolm Gladwell's book Blink in parallel, a book that champions the idea that, in order to make the right decision, it is more important to consider the right information than all the information.
I think this book well deserves its place next to Hofstadter's tome.
¹ The tone is occasionally rather chatty, such as when he comments that the 'talking drums' have recently been replaced by mobile phones in less than a generation.
Gleick covers it all here, but the book goes to places I had trouble following. It starts off with a general history of information, covering the African drum as a communication tool, then oral communication, written communication, and on up to the telegraph. It's at this point that I found a little too much detail. He gets into how a telegraph works and the use of electricity, then into code breaking, Babbage's analytical engine, entropy, quantum physics, and string theory. In some parts I had to recall my high school physics and a book I read earlier this year, The Grand Design (dealing with quantum physics and string theory). I couldn't piece together a cohesive point to this direction. I was under the impression the book was about information, how it is delivered, and how we are overwhelmed by it (which he does cover), but it takes such a big sidetrack that I wasn't sure if he could pull it all together at the end. He does, thankfully, and it does all make sense after finishing the book.
An example of the extreme points is when he examines A Million Random Digits, which is exactly what its title says: a reference table of random numbers published by the RAND Corporation for use in statistical work, its value lying precisely in the absence of any pattern. He also discusses genes and how those tiny bits of information may be controlling who you are more than you know.
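The simplest check one can run on a table like RAND's is a frequency count: in truly random digits, each of 0 through 9 should appear close to one tenth of the time. A quick sketch (my own illustration, not anything from the book, using Python's pseudo-random generator in place of RAND's table):

```python
from collections import Counter
import random

def digit_frequencies(digits: str) -> dict[str, int]:
    """Count occurrences of each digit -- the most basic randomness check."""
    return dict(Counter(digits))

random.seed(0)  # fixed seed so the run is reproducible
sample = "".join(random.choice("0123456789") for _ in range(100_000))
freqs = digit_frequencies(sample)

# Each digit should land near 10,000 out of 100,000.
print(min(freqs.values()), max(freqs.values()))
```

Passing this test is necessary but nowhere near sufficient: the string "0123456789" repeated forever also has perfectly flat frequencies, which is why deeper notions like Kolmogorov complexity are needed.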
Although difficult, it's fascinating. There is a part about how information can never be lost, and how it takes work to actually forget things. In fact, if information could truly be lost, it would upend quantum theory; even if you burn a book, you could in principle reconstruct the information from the ashes.
The book is a flood of information in and of itself, but worth the read. I will definitely re-read this one in the future to get many of his points, but his main one is clear. It’s not a problem of losing information, but wading through and filtering all of it to find meaning. How do we find meaning with this flood of information? (It’s a call to the librarian in all of us.)
“This is the challenge that remains, and not just for scientists: the establishment of meaning.” P. 372
“When information is cheap, attention becomes expensive.” P. 410
“It from Bit: Information gives rise to every It—every particle, every field of force, even the space-time continuum itself.” This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added, “All things physical are information-theoretic in origin, and this is a participatory universe.” P. 10
“The alphabet spread by contagion. The new technology was both the virus and the vector of transmission. It could not be monopolized and it could not be suppressed. Even children could learn these few, lightweight semantically empty letters.” P. 39
“The information has been detached from any person, detached from the person’s experience. Now it lives in the words, little life-support modules.” P. 39
“Language did not function as a storehouse of words, from which users could summon the correct items, preformed. On the contrary, words were fugitive, on the fly expected to vanish again thereafter.” P. 53
(There is an interesting perspective on how our view of the universe affects our language. The illiterate see geometric shapes, not as circle or rectangle, but as ball or door.)
“Telephone books soon represented the most comprehensive listings of, and directories to, human populations ever attempted. They became the thickest and densest of the world's books—four volumes for London; a 2,600-page tome for Chicago—and seemed a permanent, indispensable part of the world's information ecology until, suddenly, they were not. They went obsolete, effectively, at the turn of the 21st century. American telephone companies were officially phasing them out by 2010; in New York, the end of automatic delivery of telephone directories was estimated to save 5,000 tons of paper.”
“Information is not free. Maxwell, Thomson, and the rest had implicitly talked as though knowledge was there for the taking…they did not consider the cost of this information. They could not; for them, in a simpler time, it was as if the information belonged to a parallel universe, an astral plane, not linked to the universe of matter and energy, particles and forces whose behavior they were learning to calculate.” P. 279
“When a jingle lingers in our ears or a fad turns fashion upside down, or a hoax dominates the global chatter for months and vanishes as swiftly as it came, who is master and who is slave?” P. 327
“Now expectations have inverted. Everything may be recorded and preserved, at least potentially: every musical performance; every crime in a shop, elevator, or city street; every volcano or tsunami on the remotest shore; every card played or piece moved in an online game; every rugby scrum and cricket match. Having a camera at hand is normal, not exceptional; something like 500 billion images were captured in 2010…” P. 397
“Overloading of circuits was a fairly new metaphor to express a sensation—too much information—that felt new. It had always felt new. One hungers for books; rereads a cherished few; begs or borrows more; waits at the library door and perhaps, in the blink of an eye, finds oneself in a state of surfeit: too much to read.” P. 401
Major players cross the history-of-information stage: Babbage, Lovelace, Shannon, Turing, and Shockley (of transistor fame). Good illustrations and examples aid understanding of the theory; nothing is so prolix as to be inaccessible to the average mind.
The truly big idea I came away with is that information, randomness, and complexity are deeply intertwined, as proposed by Kolmogorov. Chaos theory is the analysis of dynamic systems in terms of entropy and information dimension. Chaitin and Kolmogorov independently invented algorithmic information theory, a foundation for the operation of computers. Kolmogorov complexity “is to mathematics what entropy is to thermodynamics: the antidote to perfection. Just as we have no perpetual motion machines, there can be no complete formal axiomatic systems.” And: “Some mathematical facts are true for no reason. They are accidental, lacking a cause or deeper meaning.”

Chaos is overburdened with information; that is why it is so interesting. The human animal is good at finding patterns (regularities) in chaos (irregular streams of information) and reducing those patterns to explanations or theories that compress information. We call this induction. Knowledge is not information. Knowledge is what Charles H. Bennett calls “logical depth,” which can be restated as the usefulness of a message in a particular domain, or as a recognition of its buried redundancy: parts a receiver could figure out without being told, but only at considerable cost in money, time, or computation. This idea seems to underlie the concept of self-organization that we see in living systems, how complex structures develop or bootstrap themselves into existence in Nature.

Ultimately, information is physical. As Seth Lloyd put it, “To do anything requires energy. To specify what is done requires information.” This is essentially what DNA is: the storage, replicator, and transmitter of information turned into being. A computer.
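The Kolmogorov idea has a crude but computable everyday proxy: how well a string compresses. Kolmogorov complexity itself is uncomputable, but a general-purpose compressor like zlib captures the intuition that a patterned string has a short description while a random one does not. A minimal sketch (my own, not from the book):

```python
import random
import zlib

def compressed_size(data: bytes) -> int:
    """zlib-compressed length: a rough, computable stand-in for
    Kolmogorov complexity, which is itself uncomputable."""
    return len(zlib.compress(data, level=9))

patterned = b"0123456789" * 1000                                  # highly regular
random.seed(42)
scrambled = bytes(random.randrange(256) for _ in range(10_000))   # pattern-free

print(compressed_size(patterned))   # tiny: the repetition compresses away
print(compressed_size(scrambled))   # near 10,000: randomness resists compression
```

Both inputs are 10,000 bytes long, yet their compressed sizes differ enormously; in this sense the random string really does "contain more information" than the patterned one, which is the paradox the chapter turns on.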
The book finishes at the dawn of quantum computing and points out that we live in an age so deluged with information that the crucial abilities are searching and filtering. Gleick has written another pop-sci book that everyone alive today should read, one as important to the lay person's understanding of the world as was his Chaos.