The Information: A History, a Theory, a Flood
James Gleick
Winner of the Royal Society Winton Prize for Science Books 2012, the world's leading prize for popular science writing.

We live in the information age. But every era of history has had its own information revolution: the invention of writing, the composition of dictionaries, the creation of the charts that made navigation possible, the discovery of the electronic signal, the cracking of the genetic code.

In ‘The Information’ James Gleick tells the story of how human beings use, transmit and keep what they know. From African talking drums to Wikipedia, from Morse code to the ‘bit’, it is a fascinating account of the modern age’s defining idea and a brilliant exploration of how information has revolutionised our lives.
THE
INFORMATION
A History
A Theory
A Flood
JAMES GLEICK
Dedication
FOR CYNTHIA
Epigraph
Anyway, those tickets, the old ones, they didn’t tell you where you were going, much less where you came from. He couldn’t remember seeing any dates on them, either, and there was certainly no mention of time. It was all different now, of course. All this information. Archie wondered why that was.
— Zadie Smith
What we call the past is built on bits.
— John Archibald Wheeler
Contents
Title Page
Dedication
Epigraph
Prologue
Chapter 1 – Drums That Talk
Chapter 2 – The Persistence of the Word
Chapter 3 – Two Wordbooks
Chapter 4 – To Throw the Powers of Thought into Wheel-Work
Chapter 5 – A Nervous System for the Earth
Chapter 6 – New Wires, New Logic
Chapter 7 – Information Theory
Chapter 8 – The Informational Turn
Chapter 9 – Entropy and Its Demons
Chapter 10 – Life’s Own Code
Chapter 11 – Into the Meme Pool
Chapter 12 – The Sense of Randomness
Chapter 13 – Information Is Physical
Chapter 14 – After the Flood
Chapter 15 – New News Every Day
Epilogue
Acknowledgments
Notes
Bibliography
Index
Illustration Credits
Also by James Gleick
Copyright
About the Publisher
Prologue
The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning.
— Claude Shannon (1948)
AFTER 1948, which was the crucial year, people thought they could see the clear purpose that inspired Claude Shannon’s work, but that was hindsight. He saw it differently: “My mind wanders around, and I conceive of different things day and night. Like a science-fiction writer, I’m thinking, ‘What if it were like this?’”
As it happened, 1948 was when the Bell Telephone Laboratories announced the invention of a tiny electronic semiconductor, “an amazingly simple device” that could do anything a vacuum tube could do and more efficiently. It was a crystalline sliver, so small that a hundred would fit in the palm of a hand. In May, scientists formed a committee to come up with a name, and the committee passed out paper ballots to senior engineers in Murray Hill, New Jersey, listing some choices: semiconductor triode . . . iotatron . . . transistor (a hybrid of varistor and transconductance). Transistor won out. “It may have far-reaching significance in electronics and electrical communication,” Bell Labs declared in a press release, and for once the reality surpassed the hype. The transistor sparked the revolution in electronics, setting the technology on its path of miniaturization and ubiquity, and soon won the Nobel Prize for its three chief inventors. For the laboratory it was the jewel in the crown. But it was only the second most significant development of that year. The transistor was only hardware.
An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release. It carried a title both simple and grand—“A Mathematical Theory of Communication”—and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon. The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure.