From alphabets to iPhones, humans have experimented with
data storage for millennia. In the modern age, though, information
is beginning to overwhelm the physical world. By James Gleick
The puzzle of information packing—how to cram knowledge into the tiniest space possible—has fueled technological development at least since
the emergence of Chinese characters some 3,000 years ago. In his new book,
The Information, journalist James Gleick argues that information is “the
blood and the fuel, the vital principle” of our lives. Delving deep into the
history behind today’s data-driven world, Gleick explores the mysterious drumming language of the African talking drum, whose irregular
rhythms carried messages through the jungles of the Congo. He considers
musical compositions like Johann Sebastian Bach’s 18th-century
“Well-Tempered Clavier” as data streams that could capture sounds as varied
as wind, cricket chirps, or the clatter of a horse-drawn cart. But for Gleick
the pivotal moment initiating our data-drenched era came in 1948, when
mathematician Claude Shannon conceived of the bit as a unit of information. Shannon’s work propelled us headlong into the flood of blogs,
emails, tweets, and news updates that shape our lives today.
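As a rough illustration of what Shannon's unit measures (my own sketch, not from the text): singling out one of N equally likely alternatives takes log₂ N bits, which is why a coin flip carries one bit and a choice among 256 symbols carries eight.

```python
import math

# Illustrative sketch: Shannon's bit counts binary choices.
# Specifying one of N equally likely messages takes log2(N) bits.
def bits_to_specify(n_alternatives: int) -> float:
    return math.log2(n_alternatives)

print(bits_to_specify(2))    # one coin flip: 1.0 bit
print(bits_to_specify(256))  # one of 256 symbols: 8.0 bits
```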
In 1948 the Bell Telephone Laboratories announced the invention
of a tiny electronic semiconductor, “an amazingly simple device”
that could do anything a vacuum tube could do and more efficiently.
It was a crystalline sliver, so small that 100 would fit in the palm of a
hand. In May scientists formed a committee to come up with a name.
Transistor won out. “It may have far-reaching significance in electronics and electrical communication,” Bell Labs declared in a press release,
and for once the reality surpassed the hype. The transistor sparked the
revolution in electronics, setting the technology on its path of miniaturization and ubiquity. But it was only the second-most significant
development of that year. The transistor was only hardware.
An invention even more profound and more fundamental came
in a monograph spread across 79 pages of The Bell System Technical
Journal in July and October. No one bothered with a press release. It
carried a title both simple and grand—“A Mathematical Theory of
Communication”—and the message was hard to summarize. But it
was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen
in this case not by a committee but by the lone author, a 32-year-old named Claude Shannon. The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure.

But measuring what? “A unit for measuring information,” Shannon wrote, as though there were such a thing, measurable and quantifiable information.

In 1949, when Claude Shannon took a sheet of paper and penciled his outline of the measures of information, the scale went from tens of bits to hundreds to thousands, millions, billions, and trillions. The transistor was one year old and Moore’s law yet to be conceived. At the top of his information pyramid was Shannon’s estimate for the Library of Congress—100 trillion bits, 10¹⁴. He was about right, but the pyramid was growing. After bits came kilobits, naturally enough. After all, engineers had coined the word kilobuck—“a scientist’s idea of a short way to say ‘a thousand dollars,’ ” The New York Times helpfully explained in 1951. The measures of information climbed up an exponential
scale, as the realization dawned in the 1960s that everything to do with
information would now grow exponentially. That idea was casually
expressed by Gordon Moore, who had been an undergraduate studying chemistry when Shannon jotted his note and found his way to
electronic engineering and the development of integrated circuits. In
1965, three years before he founded the Intel Corporation, Moore was
merely, modestly suggesting that within a decade, by 1975, we would
be able to combine as many as 65,000 transistors on a single wafer of
silicon. He predicted a doubling every year or two—a doubling of the
number of components that could be packed on a chip, but then also,
as it turned out, the doubling of all kinds of memory capacity and processing speed, a halving of size and cost, seemingly without end.
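Moore's arithmetic is easy to check in a few lines. The sketch below is my own illustration, not anything from the text: the starting count of 64 components in 1965 is an assumed base chosen to match the order of magnitude of his projection, compounded by a doubling every year.

```python
# Illustrative sketch of Moore's 1965 projection: start from an assumed
# base of roughly 64 components per chip in 1965 and double every year.
def projected_components(year: int, base_year: int = 1965,
                         base_count: int = 64) -> int:
    """Project the component count after (year - base_year) doublings."""
    return base_count * 2 ** (year - base_year)

print(projected_components(1975))  # 65,536 -- near the 65,000 Moore forecast
```

Ten doublings in a decade multiply the base by 2¹⁰ = 1,024, which is what turns tens of components into tens of thousands.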