Turing's cathedral : the origins of the digital universe

by George Dyson

Paper Book, 2013

Status

Available

Call number

004/.09

Library's review

Contains "Preface", "Acknowledgements", "Principal Characters", "1. 1953", "2. Olden Farm", "3. Veblen's circle", "4. Neumann János", "5. MANIAC", "6. Fuld 219", "7. 6J6", "8. V-40", "9. Cyclogenesis", "10. Monte Carlo", "11. Ulam's demons", "12. Barricelli's universe", "13. Turing's cathedral", "14. Engineer's dreams", "15. Theory of self-reproducing automata", "16. Mach 9", "17. The tale of the big computer", "18. The thirty-ninth step", "Key to Archival Sources", "Notes", "Index".

"Preface" is about John von Neumann and Alan Turing, and how it was the computers, not the bombs, that exploded.
"Acknowledgements" thanks those who helped, and notes how rapidly the pool of surviving eyewitnesses is shrinking these years.
"Principal Characters" is a long list of the people who appear in the book.
"1. 1953" is about how, thanks to early computers, the hydrogen bombs Ivy Mike (1952) and Castle Bravo (1954) came into the world. And James Watson and Francis Crick cracked the DNA code, making ATGC interesting (Adenine, Thymine, Guanine and Cytosine).
"2. Olden Farm" is about the Institute for Advanced Study, in particular the large grounds it sits on and the history of that piece of land.
"3. Veblen's circle" is about the mathematician Oswald Veblen and the institute's early days as the "Institute for Advanced Salaries".
"4. Neumann János" is about Johnny von Neumann's upbringing, and with it his razor-sharp mind. His first dissertation was the axiomatization of set theory, which also put him right at the center of understanding Gödel and Turing. And he was more afraid of what the development of computers would mean than of the hydrogen bombs.
His work on shock waves in explosions fit hand in glove with implosion bombs (plutonium) and later fusion bombs. And by their nature, atomic and hydrogen bombs are not something you experiment your way to; they are something you calculate in advance.
"5. MANIAC" is about the Second World War and the computation of firing tables for newly developed guns. It goes far too slowly by hand, with calendar time measured in months, so the effort shifts to building electronic computing power instead. Progress is rapid: within the next couple of years they have a cpu running at electronic speeds. Memory, on the other hand, is not easy to build. Cheap memory is not fast (punched cards by the millions) and fast memory is not cheap (vacuum tubes). Delay-line memory, sound pulses in mercury tubes, offers a compromise, but is still far from ideal. ENIAC is quickly put to work on hydrogen-bomb calculations (as early as 1946) because von Neumann pushes for it. Von Neumann is altogether a central figure, because he sits on a sea of committees and knows both everyone and all the secrets. When it dawns on him that the firing-table people have a machine that can do hundreds of multiplications per second, he is instantly on fire.
During the war everyone agreed to set aside personal grudges and patent disputes, but they erupt the moment the war ends. The UNIVAC people feel thoroughly cheated. Many of the early computers are overtaken by newer machines that apply what was learned while planning them!
"6. Fuld 219" is about the office at IAS where the planning of the IAS computer begins. The engineers are not well regarded in the ivory tower.
"7. 6J6" is about the mass-produced twin triode 6J6, which they use as the main ingredient. Von Neumann does not care that the tubes fail to meet their specifications and says they will simply have to design something reliable out of unreliable parts. And so they do, by measuring the tubes themselves instead of trusting the nonsense in the data sheets. It actually works fine.
They also gradually become accepted among the humanists and theorists at IAS, for instance because they are good at building TV antennas and hi-fi radios.
"8. V-40" is about the shape of the machine: 40 cathode-ray tubes sticking out from the sides, so it looked like a 40-cylinder engine in a V. Each tube can store 32 x 32 = 1024 bits, and the tubes are accessed in parallel, so a 40-bit word can be fetched as fast as a single bit. Another group tries to build the perfect vacuum tube, but that turns out far too expensive. Bigelow's group uses standard tubes and lives with their imperfections. They amplify the signal 30,000 times before deciding whether it is a 0 or a 1, and one of Bigelow's rules for building computers is to filter noise from the signal as close to the source as possible. Smart man, that Bigelow.
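The word-parallel trick is easy to sketch: treat each of the 40 tubes as a 1024-bit plane and spread every word across all 40 planes, one bit per tube, so a single parallel access at one position yields a whole word. A toy model (my own illustration with made-up names, not code from the book):

```python
# Toy model of the V-40 memory: 40 Williams tubes, each a 32 x 32 =
# 1024-bit plane. Bit j of word i is stored at position i of tube j,
# so reading position i from all 40 tubes "in parallel" yields a whole
# 40-bit word in a single access.

WORD_BITS = 40
TUBE_POSITIONS = 32 * 32  # 1024 addressable spots per tube

tubes = [[0] * TUBE_POSITIONS for _ in range(WORD_BITS)]

def write_word(address, value):
    # Scatter the word: one bit into each of the 40 tubes.
    for j in range(WORD_BITS):
        tubes[j][address] = (value >> j) & 1

def read_word(address):
    # One parallel access: the same position in every tube at once.
    return sum(tubes[j][address] << j for j in range(WORD_BITS))

write_word(7, 0b1011)
assert read_word(7) == 0b1011
```

Fetching one bit and fetching forty costs the same number of accesses here, which is the whole point of the design.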
"9. Cyclogenesis" is about weather forecasting. The first time around it takes them 13 days to compute a 12-hour forecast, but never mind, because they know the machines will only get faster. They use punched cards to store intermediate results, so that part too can only improve. By 1960 the machines had outrun human meteorologists' forecasts, and every decade since has added roughly 24 hours of accurate forecasting. The forerunner of climate models ran for the first time in 1954, building on Lewis Fry Richardson's Weather Prediction by Numerical Process from 1922, conceived during the First World War.
"10. Monte Carlo" is about Ulam's idea of simulating complicated calculations instead of having the mathematicians find a closed-form expression (which may not even exist). It is enormously useful, and the method spreads. John von Neumann also comes up with storing the programs inside the computer instead of hand-wiring them with switches and cables, and suddenly everything goes much faster. Ulam simulates neutrons in chain reactions, and John's wife Klári had earlier simulated population flows: emigration and immigration, deaths and births. Seen through the right glasses, there is no difference between people and neutrons. They also begin computing whether hydrogen bombs are feasible at all. The Hippo code runs for weeks at a time.
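The core of the method fits in a few lines: instead of deriving a closed-form answer, draw random samples and count. A minimal sketch in modern terms (estimating pi, a textbook stand-in for the neutron problems described here):

```python
import random

# Monte Carlo estimate of pi: sample points uniformly in the unit
# square and count how many land inside the quarter circle
# x^2 + y^2 <= 1. The hit ratio approximates pi/4.

def monte_carlo_pi(samples=100_000, seed=1):
    rng = random.Random(seed)  # seeded for reproducibility
    hits = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4 * hits / samples

print(monte_carlo_pi())  # close to 3.14159
```

No formula for the answer is ever written down; accuracy simply improves with the number of samples, which is why the method rewards exactly the kind of raw computing speed the book describes.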
"11. Ulam's demons" is about Edward Teller wanting to build hydrogen bombs but being unable to make them work. In February 1951 Stanislaw Ulam gets a completely new idea for making it happen, and everyone, even opponents of the hydrogen bomb like Robert Oppenheimer, is won over, finding the idea so good that the bomb simply has to be built because the physics is so seductive. On 29 August 1949 the Russians' first bomb, First Lightning or Joe-1, had exploded, but even that had not been enough to give the American hydrogen bomb real momentum.
In between hydrogen-bomb runs they also compute slingshot trajectories for spacecraft. The first test device, Ivy Mike, is designed very conservatively so they can be sure it works, and work it does: it delivers 10 megatons, and one of the F-84 fighters that fly through the cloud an hour and a half after the explosion never comes home.
"12. Barricelli's universe" is about simulating evolution. They also play with computing X-ray diffraction patterns and thereby guessing at protein structures.
"13. Turing's cathedral" is about Turing, Gödel and Hilbert's three questions on the foundations of mathematics. In 1936 Turing's "On Computable Numbers" finally nails the coffin shut on Hilbert's program. Freeman Dyson read the paper in 1942 and never imagined it could be put to any practical use. Alonzo Church was the first to call the machines Turing machines. Turing and von Neumann had offices right next to each other at Princeton. Turing thinks deeply about machine intelligence and randomness.
"14. Engineer's dreams" is about Bigelow and the disappointment that IAS builds no more computers but lets the whole field pass to IBM. On 9 July 1955 von Neumann collapses while on the phone with Lewis Strauss. It turns out to be cancer near the collarbone, with metastases, and he is rushed into surgery. By January 1956 he is confined to a wheelchair, and in March he is admitted to Walter Reed, where he spends his last eleven months. He receives distinguished visitors, and several remark on how extraordinary it is that an immigrant is visited by the Secretary of Defense and the service chiefs. His daughter Marina von Neumann is 22 at the time and about to be married, but John is slipping away both physically and mentally, and knows it. He converts to Catholicism, which some of his friends find hard to come to terms with. He dies on 8 February 1957 and is buried in Princeton on the 12th.
Looking back on the development in 1965, Bigelow is irritated that superfast computers still only do one thing at a time, meaning most of the machine typically sits idle, waiting.
"15. Theory of self-reproducing automata" is about, among other things, Huxley's Ape and Essence. Von Neumann's ideas came before Franklin, Watson and Crick. Teller was the longest-lived of the five "Martians": John von Neumann, Edward Teller, Leó Szilárd, Eugene Wigner and Theodore von Kármán. In 1928 a streetcar in Munich took off most of one of his feet, and in 2000 he still limped because of it. Dyson interviewed him to hear about Fermi and about life outside the solar system.
"16. Mach 9" is about how, at one point in 1991, fiber-optic cable was being laid out at a rate corresponding to paying out a single cable at nine times the speed of sound.
"17. The tale of the big computer" is about Hannes Alfvén and The Tale of the Big Computer.
"18. The thirty-ninth step" is about the thirty-ninth step of a calculation, where you must decide how to round. (They used 40-bit words in one of the first computers, after all.)
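The problem is concrete: multiplying two 40-bit fixed-point fractions gives an 80-bit product that must be cut back to 40 bits, and how you treat the discarded half is exactly the rounding decision. A small sketch using round-to-nearest (my own illustration; it does not claim to reproduce the machine's actual rule):

```python
# Rounding at "the thirty-ninth step": two 40-bit fixed-point
# fractions (integers scaled by 2**-40) multiply to an exact 80-bit
# product, which must be cut back to a single 40-bit word.

WORD_BITS = 40

def multiply_and_round(a, b):
    product = a * b                          # exact, up to 80 bits
    low = product & ((1 << WORD_BITS) - 1)   # the 40 bits thrown away
    result = product >> WORD_BITS            # truncated 40-bit word
    if low >= 1 << (WORD_BITS - 1):          # discarded part >= 1/2 ulp
        result += 1                          # round up to nearest
    return result

half = 1 << (WORD_BITS - 1)   # represents 0.5 in this scheme
assert multiply_and_round(half, half) == 1 << (WORD_BITS - 2)  # 0.25
```

Truncating instead of rounding would bias every product slightly downward, which is why the choice made at this step mattered to the early machine builders.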
"Key to Archival Sources" is a short list of abbreviations for archival collections, e.g. GBD for the author's own collection.
"Notes" is the book's apparatus of endnotes, excellent for pointing to original sources and further reading.
"Index" is an excellent index.

An outstanding book that gives you goosebumps. For instance, that George Dyson as a three-year-old ran around the Institute for Advanced Study in Princeton knowing that the man in the big house over there (Robert Oppenheimer) had made the atomic bomb, and that their good friend Julian Bigelow had built a computer. Or when three-year-old George finds a fan belt from a car radiator and asks his father (Freeman Dyson) what it is, and gets the answer "A piece of the sun". I had not considered that even the first computers already ran at microsecond level and that storage was the bottleneck, a bottleneck that in a sense is still with us. We may no longer use punched cards, but there is not much point in, say, a 100-gigabit line either, if the data is exabyte-sized.

A decade in which the electronic computer, fission and fusion, the structure of DNA, and computerized weather forecasting are all born. Wow!

Publication

London [u.a.] Penguin Books 2013

Description

"Legendary historian and philosopher of science George Dyson vividly re-creates the scenes of focused experimentation, incredible mathematical insight, and pure creative genius that gave us computers, digital television, modern genetics, models of stellar evolution--in other words, computer code. In the 1940s and '50s, a group of eccentric geniuses--led by John von Neumann--gathered at the newly created Institute for Advanced Study in Princeton, New Jersey. Their joint project was the realization of the theoretical universal machine, an idea that had been put forth by mathematician Alan Turing. This group of brilliant engineers worked in isolation, almost entirely independent from industry and the traditional academic community. But because they relied exclusively on government funding, the government wanted its share of the results: the computer that they built also led directly to the hydrogen bomb. George Dyson has uncovered a wealth of new material about this project, and in bringing the story of these men and women and their ideas to life, he shows how the crucial advancements that dominated twentieth-century technology emerged from one computer in one laboratory, where the digital universe as we know it was born."

User reviews

LibraryThing member Widsith
A fascinating and illuminating book, but also a frustrating one because it should have been a lot better than it is.

The heart of the story is more or less on target – a collection of very interesting anecdotes and narratives about the personalities involved in building America's first computer,
at Princeton's Institute for Advanced Study after the Second World War. Leading the team was the quite extraordinary figure of John von Neumann, about whom I knew rather little before reading this. He comes across as by far the most brilliant mind in these pages (not excluding the presence of one A. Einstein), with a near-eidetic memory, an ability to understand new concepts instantly and make staggering leaps of reasoning to advance them further. Not a very endearing character, though – a refugee from 1930s Europe, he pushed the nuclear programme hard and argued to the end of his life that the best way to create world peace was to launch a full ‘preventive’ hydrogen bomb strike against the USSR, mass civilian deaths and all.

The nuclear project was central to the invention of the computer. The first incarnation of the machine (the ENIAC, followed at the IAS by the machine nicknamed ‘MANIAC’) was pressed into service specifically to model fission reactions, which involve some rather tricky maths. But von Neumann and other thinkers realised early on that a machine capable of doing that would also be able to fulfil Alan Turing's description of a ‘universal computer’: if it could do the arithmetic, it turned out, it could do practically anything else too, provided there was a way of feeding it instructions.

‘It is an irony of fate,’ observes Françoise Ulam, ‘that much of the hi-tech world we live in today, the conquest of space, the extraordinary advances in biology and medicine, were spurred on by one man's monomania and the need to develop electronic computers to calculate whether an H-bomb could be built or not.’

What is particularly fascinating is how these studies gradually led to a blurring of the distinction between life and technology. The development of computing coincided with the discovery of DNA, which showed that life is essentially built from a digital code (Nils Barricelli described strings of DNA as ‘molecule-shaped numbers’), and early programmers soon discovered that lines of code in replicating systems would display survival-of-the-fittest type phenomena. This is entering a new era with the advent of cloud-sourcing and other systems by which computing is, in effect, becoming analog and statistics-based again – search engines are a fair example.

How can this be intelligence, since we are just throwing statistical, probabilistic horsepower at the problem, and seeing what sticks, without any underlying understanding? There's no model. And how does a brain do it? With a model? These are not models of intelligent processes. They are intelligent processes.

All of this is very bracing and very interesting. Unfortunately the book is let down by a couple of major problems. The first is a compulsion to give what the author presumably thinks is ‘historical colour’ every two minutes – so the IAS is introduced by way of an entire chapter tracing the history of local Algonquin tribes, Dutch traders, William Penn and the Quakers, the War of Independence – all of which is at best totally irrelevant and at worst a fatal distraction.

The second, even more serious failing is that the technology involved remains extremely opaque. One of the most crucial parts of this story should be following the developments of cathode-ray storage, early transistors, the first moves from machine language to codes and programs. But the explanations in here are poor or non-existent. Terms like ‘shift register’, ‘stored-program computer’, ‘pulse-frequency-coded’ are thrown around as though we should all be familiar with them.

My favourite story to do with the invention of the digital world involves Claude Shannon and his remarkable realisation – one of the most civilisation-altering ideas of our species – that electrical switching circuits could work as logic gates. It's not even mentioned in this book. And so the crucial early building blocks of what a computer actually is – how, on a fundamental level, it really does what it does – that is all missing. And it's a pretty serious omission for someone who finds it necessary to go back to the Civil War every couple of chapters.

A lot of reviews here, especially from more technical experts, really hate this book, but on balance, I'd still recommend it. It's a very important story about a very important period, and the later chapters especially have a lot of great material. But reading around the text will probably be necessary, and this book should have offered a complete package.
LibraryThing member sirk.bronstad
Fell flat with me. Not what I was wishing the book to be.
LibraryThing member paulkeller
Fascinating account of the invention of modern stored-program computers during the 1940s. The story centers on John von Neumann and his team at the Institute for Advanced Study in Princeton, NJ and tracks how the development of nuclear weapons led to the development of modern computers and the other way around. The story makes a strong argument for open innovation, attributing much of the speed of the progress in these years to the fact that the war effort created a climate wherein patenting of individual inventions was not an issue. In addition, the hardcopy edition has one of the most beautiful book covers ever, paying homage to both the punch card and Alan Turing.
LibraryThing member pierthinker
Like many technologies and 'inventions', the modern digital computer has more than one origin and more than one set of roots. Dyson here sets out the development of the first digital computers in the USA by concentrating on the personalities and insights that drove these developments rather than on
the technologies themselves (although there is enough explanation of these to drive the excitement and tension in the story). Dyson shows how the convergence of two great waves pushed forwards the digital age. The first was the desire of Princeton University to attract the brightest and the best from all over the world; this brought more engineers than they would have liked but created an environment where ideas and theories and their application melded quickly. The second wave was World War II and the need to develop an atomic bomb before the Nazis did. This is an exciting story well told and the fact that it fizzled out after the War rather than leading a charge towards more and better computing power shows the shortsightedness of the establishment rather than any failing of the author. It was the crossover into the commercial world that finally drove the digital computer forwards to the ubiquitous technology it is today.
LibraryThing member ivanfranko
Tough going if you don't know much about computing. Most science frightens the hell out of me these days. This book has some interesting speculation about the march to mechanical intelligence. Completing the book induced a pervading depression for me.
LibraryThing member encephalical
Despite the title and the photograph on the cover, Turing makes limited appearances and has only one devoted chapter. Mainly about von Neumann and the IAS. Moves from biography and history to editorial and speculation. I often found myself losing the story thread. Would recommend only if
somewhat familiar with the early history of computing. If completely new would recommend looking elsewhere, perhaps Rhodes' The Making of the Atomic Bomb and Dark Sun.
LibraryThing member Paul_S
Title should read "Von Neumann machines: The human stories". You're welcome, dear publisher.

I hope you've already read and know the history of the creation of the computer, because this book dispenses with all that nonsense and instead concentrates on absolutely inconsequential trivia. Parties,
social interactions, immigration issues, divorces, building houses, fixing cars and all kinds of irrelevant waffle of day-to-day life, pushing the exploding nuclear bombs to the side.

It's nice to see the background to the revolution and I appreciate it but it's like a misfocused photo where the face is completely blurry but by god that concrete wall behind the subject has razor sharp detail. I cannot but feel frustrated every time the author wraps up the technical side with a glib and borderline misleading paragraph (as it's glossing over any and all details) only to waste the rest of the chapter recounting sleeping arrangements and letter writing. I care about those things too but not that much.
LibraryThing member PDCRead
Three quarters of a century ago a small number of men and women gathered in Princeton, New Jersey. Under the direction of John von Neumann they were to begin building one of the world's first computers, driven by the vision that Alan Turing had of a Universal machine. Using cutting-edge technology, valves and vacuum tubes, to store the data, the first computer was born. This unit took 19.5kW to work and had a memory size of five, yes five, kilobytes. It caused a number of revolutions: it was this machine that laid the foundations for every single computing device that exists on the planet today, it changed the way that we think about numbers and what they could do for us, and the calculations that it ran gave us the hydrogen bomb…

I had picked this up mostly because of the title, Turing's Cathedral, thinking that it would be about that great man, the way that he thought and the legacy that he left us with regards to computing and cryptography. There is some of that on Turing and his collaboration with the American computer scientists and engineers through the war, but the main focus is on the development of the computer in America and the characters involved in the foundation of today's technological society. Some parts were fascinating, but it could be quite tedious at times. There is a great deal of detail in the book about the characters and the political games that they were playing and subject to; I am not completely sure why we needed to go so far back in time on the origins of Princeton. Definitely one for the computer geek, not for the general reader.
LibraryThing member nillacat
This is a nicely written social and technical history of the computer project at the Institute for Advanced Study, starting with the founding of the IAS, treating the lives and personalities of the many fascinating people involved, the history of mathematics and logic in the early 20th century, the
engineering developments that led to the computer, the interwoven histories of the computer and of the atomic and hydrogen bombs, the effect of the computer on weather prediction and genetics, the inevitable politics and the quirks and foibles and human failures and human successes. The history is solid and well-documented, and this is above all a book about people and ideas and a great accomplishment.
LibraryThing member kaulsu
I am positive this is a 5 star book for those with the mathematics and science I lack. I, being more of a social scientist, was interested in the persons involved and the time period in question, but found the circularity of the unfolding of the drama (now it is 1937, then it becomes 1953, now it is back at 1946...) really difficult to hold onto.

This is a book that (imho) needs to be read in a print version. One needs to be able to flip back and forth to grasp it all. Or at least I would need to if I were to read it again!
LibraryThing member jeroenvandorp
George Dyson's book (2012) is about the origins of the modern computer, developed in a time when a computer wore skirts. A 'computer' was a woman with an adding machine. They worked in teams to create tables for, say, grenade launchers: to calculate the right trajectory for any given configuration.

The machine Dyson writes about was a different kind. The kind with vacuum tubes and wires. A Turing Machine like a walk-in closet, a cathedral of calculation. He recalls the history of computers with names like ENIAC and MANIAC. The brainchildren not only of Babbage and Turing, but most of all of John von Neumann. MANIAC made calculations about the weather, the evolution of species and for tables used by anti-aircraft gunners, but it was famous (or maybe notorious) because of the calculations it did on the H-Bomb: neutron behavior, shockwaves. All that in a computer which was special because it was the first to be equipped with Random Access Memory (RAM).

The book recalls the origin of the place where Von Neumann worked on his love child, the IAS (Institute for Advanced Study) in Princeton, and tells the story of the many persons involved. These people included Stanislaw Ulam, who developed the famous Monte Carlo statistics approach, and engineer Julian Bigelow, who could make a space rocket from two empty jerrycans and a wooden plank.
The story shows how intertwined the scientific effort was with the war effort, which regularly led to friction. It also gives a great account of the development, its ups as well as its downs, of the IAS.

The book is meticulously documented. No detail is left out. It can overwhelm you. Added to that, George Dyson jumps forwards and backwards through time; as soon as a new character is introduced he starts all over again: 'X was born in Y. His father was Z, a simple farmer from ...' It makes the account even harder to follow.
At the end of the book I had the impression that I knew a lot about everyone and everything, but not about the exact order in which it all happened. My guess is that George Dyson, son of Freeman Dyson, is too much of an insider. He knew every one of them and can place them exactly in their respective contexts. Not so for the average reader. In that respect it's not so much a book for specialists (as everything is well explained) but for people who are willing to take the time to learn the cast of characters, even if it means stopping to revisit the whole list. Some organizational charts of the IAS through the years might have been a good addition.

Overall the subject is interesting enough and its prose captivating enough to make it a 'four out of five stars' book. Even if George Dyson's story requires a lot of human brain RAM to process effectively.
LibraryThing member SChant
Couldn't get into it - the style of writing was very irritating. Gave up after 3 chapters.
LibraryThing member tlockney
This would have been better titled von Neumann's Kingdom, but I'm sure the question of whether or not to capitalize the V in 'von' would have driven them crazy. There's actually not all that much on Turing himself or his work, really. Still, it's a good book and should be of interest to those of
you who enjoy learning more about the history of computing and the circumstances in which much of the work around it began.
LibraryThing member TerriBooks
It was a bit long and tended to drag in the middle but finished up with a rousing end. As a person who has worked with computers since the early 70s, I found it overlapped a tiny bit with my experiences. The most interesting part was the reflections on just what the "digital universe" may be;
certainly a different perspective than my totally utilitarian approach to computers and programming.
LibraryThing member fpagan
In large part a nonlinear biography of John von Neumann and history of the Institute for Advanced Study in Princeton, where von Neumann in the late 1940s did his influential-ever-after "architecting" of the MANIAC computer. Woven in are discussions of many relevant topics such as Gödel/Turing
metamathematics, early ways of programming, Monte Carlo approximation, the theory of self-reproducing automata, and today's accelerating trend towards a compu-singularity. (Why not mention the obliteration of privacy, Mr Dyson?) Overwhelming everything, however, is the dreary -- nay, sick and ghastly -- fact that nuclear weaponry and other military evils were the main driving force behind the building of the first electronic digital computers with Turing universality. A powerful, discerning, penetrating book.
LibraryThing member Katong
Meandering and portentous but very much worth reading…
LibraryThing member neurodrew
Turing's Cathedral: The Origins of the Digital Universe
George Dyson
April 1, 2013

George Dyson is, interestingly, the son of Freeman Dyson, who was part of the events chronicled in the book. The author describes the creation of the first electronic computer, the MANIAC, at the Institute for Advanced Study at Princeton, starting in 1949 and completing in 1953. The impresario of the project was John von Neumann, who gathered engineers to build the computer and helped to design the first programming language. Much of the impetus for the computer was to complete calculations for the hydrogen bomb, then in development. The mathematicians and physicists involved were mostly also involved in the atomic bomb project in Los Alamos. The computer used vacuum tubes, and the filament heaters consumed several kilowatts of power, and the air conditioning to keep the apparatus cool also used gross amounts of power, often icing over in the humidity. The memory was about 5 kilobytes, stored in Williams' cathode-ray storage tubes (the persistent phosphor glow allowed bits to be retained, and read out by the electron beam). The engineers were the first to develop a command line, and read instructions into the machine with paper tape, later punch cards. The input and output followed the same patterns as earlier special purpose machines like those at Bletchley Park in England during WWII, and the ENIAC, created for calculating artillery tables. This is a fascinating time in engineering history, but the story is very liberally padded with irrelevant information, like the history of Princeton in Indian and Colonial times. Dyson at times speculates about the "digital universe" and its relationship to human thought: "With our cooperation, self-reproducing numbers are exercising increasingly detailed and far-reaching control over the conditions in our universe that make life more comfortable in theirs". "The paradox of artificial intelligence is that any system simple enough to understand is not complicated enough to behave intelligently, and any system complicated enough to behave intelligently is not simple enough to understand."
It is interesting that random searches may be more efficient on large machines than encoding a solution to a problem, and by looking through the number of solutions that have already been encoded in the digital universe, it may be easier to find answers. "In 2010 you could buy a computer with over a billion transistors for the inflation adjusted cost of a transistor radio in 1956."
LibraryThing member rakerman
Starts with a brief intro to one of the first digital computers but then goes into a long history of the founding of the Institute for Advanced Study.
LibraryThing member tuckerresearch
A very detailed and dragging history of the early computers, told through the lens of Alan Turing, John von Neumann (especially him), and the Institute for Advanced Study in Princeton. Less breezy and interesting than, say, Gleick's The Information, which covers some of the same ground. The capsule biographies are neat, but the sheer number of "characters" begins to confuse. The detail of early computer structure, programming, and usage is boggling. Still, as a historian I appreciated some of the impact these folks had and how it fit into the history of the time. If I were more oriented toward engineering and computer science, and had appreciable skills in such areas, maybe I would find it way more interesting and smooth. Still, an important book for understanding our computer-oriented world.
Show Less

Awards

LA Times Book Prize (Finalist — Science & Technology — 2012)
Notable Books List (Nonfiction — 2013)
Globe and Mail Top 100 Book (Nonfiction — 2012)

Language

Original language

English

Original publication date

2012

Physical description

XXII, 401 p.; 19.6 cm

ISBN

9780141015903

Local notes

Cover: Paul Catherall
The cover shows a stylized computer of an older model
Scanned cover - N650U - 150 dpi
Page v: It was not made for those who sell oil or sardines ... - G. W. Leibniz
Page ix: I am thinking about something much more important than bombs. I am thinking about computers. - John von Neumann, 1946
Page ix: There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
Page xiv: On our expeditions into the woods, we ignored birds and mammals, hunting for frogs and turtles that we could capture with our bare hands. It was still the age of reptiles to us. The dinosaurs of computing, in contrast, were warm-blooded, but the relays and vacuum tubes we extracted from their remains had already given up their vital warmth.
Page xi: On Computable Numbers, with an Application to the Entscheidungsproblem.
Page 3: If it's that easy to create living organisms, why don't you create a few yourself? - Nils Aall Barricelli, 1953
Page 3: Any difference that makes a difference.
Page 3: To a digital computer, the only difference that makes a difference is the difference between a zero and a one.
Page 4: In March of 1953 there were 53 kilobytes of high-speed random-access memory on planet Earth. Five kilobytes were at the end of Olden Lane, 32 kilobytes were divided among the eight completed clones of the Institute for Advanced Study's computer, and 16 kilobytes were unevenly distributed across a half dozen other machines. Data, and the few rudimentary programs that existed, were exchanged at the speed of punched cards and paper tape. Each island in the new archipelago constituted a universe unto itself.
Page 5: Von Neumann set out to build a Universal Turing Machine that would operate at electronic speeds.
Page 5: 'High speed' meant that the memory was accessible at the speed of light, not the speed of sound. It was the removal of this constraint that unleashed the powers of Turing's otherwise impractical Universal Machine.
Page 5: Electronic components were widely available in 1945, but digital behaviour was the exception to the rule. Images were televised by scanning them into lines, not breaking them into bits. Radar delivered an analog display of echoes returned by the continuous sweep of a microwave beam. Hi-fi systems filled postwar living rooms with the warmth of analog recordings pressed into vinyl without any losses to digital approximation being introduced. Digital technologies - Teletype, Morse code, punched card accounting machines - were perceived as antiquated, low fidelity, and slow. Analog ruled the world.
Page 40: We are Martians who have come to Earth to change everything - and we are afraid we will not be so well received. So we try to keep it a secret, try to appear as Americans ... but that we could not do, because of our accent. So we settled in a country nobody has ever heard about and now we are claiming to be Hungarians. -- Edward Teller
Page 48: He couldn't tell really very good people from less good people. I guess they all seemed so much slower.
Page 64: Let the whole outside world consist of a long paper tape. -- John von Neumann, 1948
Page 221: A thought that had first crossed Ulam's mind while staring out into the garden less than three years previously had now removed the entire island of Elugelab from the map.
Page 243: The history of digital computing can be divided into an Old Testament whose prophets, led by Leibniz, provided the logic, and a New Testament, whose prophets, led by von Neumann, built the machines. Alan Turing arrived in between.

Pages

XXII; 401

Library's rating

Rating

½ (149 ratings; 3.5)

DDC/MDS

004/.09