THE FIRST PROGRAMMER WAS A LADY by Howard Rheingold
Over a hundred years before a monstrous array of vacuum tubes
surged into history in an overheated room in Pennsylvania, a properly attired
Victorian Gentleman demonstrated an elegant little mechanism of wood and brass
in a London drawing room. One of the ladies attending this demonstration
brought along the daughter of a friend. She was a teenager with long dark
hair, a talent for mathematics, and a weakness for wagering on horse races.
When she took a close look at the device and realized what this older
gentleman was trying to do, she surprised them all by joining him in an
enterprise that might have altered history, had they succeeded.
Charles Babbage and his
accomplice, Lady
Lovelace, came very close to inventing the computer more than a century
before American engineers produced ENIAC. The
story of the "Analytical Engine" is a tale of
two extraordinarily gifted and ill-fated British eccentrics whose
biographies might have been fabrications of Babbage's friend Charles
Dickens, if Dickens had been a science-fiction writer. Like many
contemporary software characters, these computer pioneers of the Victorian
age attracted as much attention with their unorthodox personal lives as they
did with their inventions.
One of Babbage's biographies is entitled Irascible Genius. He was
indeed a genius, to judge by what he planned to achieve as well as what he
did achieve. His irascibility was notorious. Babbage was thoroughly British,
stubbornly eccentric, tenaciously visionary, sometimes scatterbrained, and
quite wealthy until he sank his fortune into his dream of building a
calculating engine.
Babbage invented the cowcatcher--that metal device on the front of steam
locomotives that sweeps errant cattle out of the way. He also devised a
means of analyzing entire industries, a method for studying complex systems
that became the foundation of the field of operational research a
hundred years later. When he applied his new method of analysis to a study
of the printing trade, his publishers were so offended that they refused to
accept any more of his books.
Undaunted, he applied his new method to the analysis of the postal system
of his day, and proved that the cost of accepting and assigning a value to
every piece of mail according to the distance it had to travel far exceeded
the cost of transporting it. The British Post Office boosted
its capabilities instantly and economically by charging a flat rate,
independent of the distance each piece had to travel--the "penny post" that
persists around the world to this day.
Babbage devised the first speedometer for railroads, and he published the
first comprehensive treatise on actuarial theory (thus helping to create the
insurance industry). He invented and solved ciphers and made skeleton keys
for "unpickable locks"--an interest in cryptanalysis that he shared with
later computer builders. He was the first to propose that the weather of
past years could be discovered by observing cycles of tree rings. And he was
passionate about more than a few crackpot ideas that history has since
proved to be nothing more than crackpot ideas.
His human relationships were as erratic as his intellectual adventures,
to judge from the number of lifelong public feuds Babbage was known to have
engaged in. Along with his running battles with the Royal Societies, Babbage
carried on a long polemic against organ-grinders and street musicians.
Babbage would write letters to editors about street noise, and half the
organ-grinders in London took to serenading under Babbage's window when they
were in their cups. One biographer, B. V. Bowden, noted that "It was the
tragedy of the man that, although his imagination and vision were unbounded,
his judgment by no means matched them, and his impatience made him
intolerant of those who failed to sympathize with his projects."
Babbage dabbled in half a dozen sciences and
traveled with a portable laboratory. He was also a supreme
nit-picker, sharp-eyed and cranky, known to write outraged letters to
publishers of mathematical tables, upbraiding them for obscure inaccuracies
he had uncovered in the pursuit of his own calculations. A mistake in
a navigational table, after all, was a matter of life and death for a
seafarer. And a mistake in a table of logarithms could seriously impede the
work of a great mind such as his own.
His nit-picking indirectly led Babbage to invent the ancestor of today's
computers. As a mathematician and astronomer of no small repute, he resented
the time he had to spend poring over logarithm tables, culling all the
errors he knew were being perpetuated upon him by "elderly Cornish
Clergymen, who lived on seven figure logarithms, did all their work by hand,
and were only too apt to make mistakes."
Babbage left a cranky memoir entitled Passages from the Life of a
Philosopher--a work described by computer pioneer Herman Goldstine
as "a set of papers ranging from the sublime to the ridiculous, from
profundities to nonsense in plain bad taste. Indeed much of Babbage's career
is of this sort. It is a wonder that he had as many good and loyal friends
when his behavior was so peculiar."
In Passages, Babbage noted this about the original inspiration for
his computing machines:
The earliest idea that I can trace in my own mind of calculating
arithmetical tables by machinery arose in this manner: One evening I was
sitting in the rooms of the Analytical Society at Cambridge, my head
leaning forward on the table in a kind of dreamy mood, with a Table of
logarithms lying open before me. Another member, coming into the room, and
seeing me half asleep, called out, "Well, Babbage, what are you dreaming
about?" To which I replied, "I am thinking that all these Tables (pointing
to the logarithms) might be calculated by machinery."
In 1822, Babbage triumphantly demonstrated at the Royal Astronomical
Society a small working model of a machine, consisting of cogs and wheels
and shafts. The device was capable of evaluating polynomial equations by
calculating successive differences between sets of numbers. He was awarded
the society's first gold medal for the paper that accompanied the
presentation.
In that paper, Babbage described his plans for a much more ambitious
"Difference Engine." In 1823, the British government awarded him the first
of many grants that were to continue sporadically and controversially for
years to come. Babbage hired a master machinist, set up shop on his estate,
and began to learn at first hand how far ahead of his epoch's technological
capabilities his dreams were running.
The Difference Engine commissioned by the British government was quite a
bit larger and more complex than the model demonstrated before the Royal
Astronomical Society. But the toolmaking art of the time was not yet up to
the level of precision demanded by Babbage's design. Work continued for
years, unsuccessfully. The triumphal demonstration at the beginning of his
enterprise looked as if it had been the high point of Babbage's career,
followed by stubborn and prolonged decline. The British government finally
gave up financing the scheme.
Babbage, never one to shy away from conflict with unbelievers over one of
his cherished ideas, feuded over the Difference Engine with the government
and with his contemporaries, many of whom began to make sport of mad old
Charley Babbage. While he was struggling to prove them all wrong, he
conceived an idea for an even more ambitious invention. Babbage, already
ridiculously deep in one visionary development project, began to dream up
another one. In 1833 he came up with something far more complex than the
device he had failed to build in years of expensive effort.
If one could construct a machine for performing one kind of calculation,
Babbage reasoned, would it be possible to construct a machine capable of
performing any kind of calculation? Instead of building many small
machines to perform different kinds of calculation, would it be possible to
make the parts of one large machine perform different tasks at
different times, by changing the order in which the parts interact?
Babbage had stumbled upon the idea of a
universal calculating machine, an idea that was to have momentous
consequences when Alan Turing--another brilliant, eccentric British
mathematician who was tragically ahead of his time--considered it again in
the 1930s. Babbage called his hypothetical master calculator the "Analytical
Engine." The same internal parts were to be made to perform different
calculations, through the use of different "patterns of action" to
reconfigure the order in which the parts were to move for each calculation.
A detailed plan was made, and redrawn, and redrawn once again.
The central unit was the "mill," a calculating engine capable of adding
numbers to an accuracy of 50 decimal places, with speed and reliability
guaranteed to lay the Cornish clergymen calculators to rest. Up to one
thousand different 50-digit numbers could be stored for later reference in
the memory unit Babbage called the "store." To display the result, Babbage
designed the first automated typesetter.
Numbers could be put into the store from the mill or from the
punched-card input system Babbage adapted from French weaving machines. In
addition, cards could be used to enter numbers into the mill and specify the
calculations to be performed on the numbers as well. By using the cards
properly, the mill could be instructed to temporarily place the results in
the store, then return the stored numbers to the mill for later procedures.
The final component of the Analytical Engine was a card-reading device that
was, in effect, a control and decision-making unit.
A working model was eventually built by Babbage's son. Babbage himself
never lived to see the Analytical Engine. Toward the end of his life, a
visitor found that Babbage had filled nearly all the rooms of his large
house with abandoned models of his engine. As soon as it looked as if one
means of constructing his device might actually work--Babbage thought of a
new and better way of doing it.
The four subassemblies of the Analytical Engine functioned very much like
analogous units in modern computing machinery. The mill was the analog of
the central processing unit of a digital computer and the store was the
memory device. Twentieth-century programmers would recognize the printer as
a standard output device. It was the input device and the control unit,
however, that made it possible to move beyond calculation toward true
computation.
The input portion of the Analytical Engine
was an important milestone in the history of programming. Babbage
borrowed the idea of punched-card programming from the French inventor
Jacquard, who had triggered a revolution in the textile industry by
inventing a mechanical method of weaving patterns in cloth. The weaving
machines used arrays of metal rods to automatically pull threads into
position. To create patterns, Jacquard's device interposed a stiff card,
with holes punched in it, between the rods and the threads. The card was
designed to block some of the rods from reaching the thread on each pass;
the holes in the card allowed only certain rods to carry threads into the
loom. Each time the shuttle was thrown, a new card would appear in the path
of the rods. Thus, once the directions for specific woven patterns were
translated into patterns of holes punched into cards, and the cards were
arranged in the proper order to present to the card reading device, the
cloth patterns could be preprogrammed and the entire weaving process could
be automated.
These cards struck Babbage as the key to automated calculation. Here was
a tangible means of controlling those frustratingly abstract "patterns of
action": Babbage put the step-by-step instructions for complicated
calculations into a coded series of holes punched into the sets of cards
that would change the way the mill worked at each step. Arrange the
correctly coded cards in the right way, and you've replaced a platoon of
elderly Cornish gentlemen. Change the cards, and you replace an entire army
of them.
During his crusade to build the devices that he saw in his mind's eye but
was somehow never able to materialize in wood and brass, Babbage met a woman
who was to become his companion, colleague, conspirator, and defender. She
saw immediately what Babbage intended to do with his Analytical Engine, and
she helped him construct the software for it. Her work with Babbage and the
essays she wrote about the possibilities of the engine established Augusta
Ada Byron, Countess of Lovelace, as a patron saint if not a founding parent
of the art and science of programming.
Ada's father was none other than Lord
Byron, the most scandalous character of his day. His separation from
Ada's mother was one of the most widely reported domestic episodes of the
era, and Ada never saw her father after she was one month old. Byron wrote
poignant passages about Ada in some of his poetry, and she asked to be
buried next to him--probably to spite her mother, who outlived her. Ada's
mother, portrayed by biographers as a vain and overbearing Victorian figure,
thought a daily dose of a laudanum-laced "tonic" would be the perfect cure
for her beautiful, outspoken daughter's nonconforming behavior, and thus
forced an addiction on her!
Ada exhibited her mathematical talents early in life. One of her family's
closest friends was Augustus De
Morgan, the famous British logician. She was well tutored, but always
seemed to thirst for more knowledge than her tutors could provide. Ada
actively sought the perfect mentor, whom she thought she found in a
contemporary of her mother's--Charles Babbage.
Mrs. De Morgan was present at the historic occasion when the young Ada
Byron was first shown a working model of the Difference Engine, during a
demonstration Babbage held for Lady Byron's friends. In her memoirs, Mrs. De
Morgan remembered the effect the contraption had on Augusta Ada: "While the
rest of the party gazed at this beautiful invention with the same sort of
expression and feeling that some savages are said to have shown on first
seeing a looking glass or hearing a gun, Miss Byron, young as she was,
understood its working and saw the great beauty of the invention."
Such parlor demonstrations of mechanical devices were in vogue among
the British upper classes during the Industrial Revolution. While her elders
tittered and gossiped and failed to understand the difference between this
calculator and the various water pumps they had observed at other
demonstrations, young Ada began to knowledgeably poke and probe various
parts of the mechanism, thus becoming the first computer whiz kid.
Ada was one of the few to recognize that the Difference Engine was
altogether a different sort of device than the mechanical calculators of the
past. Whereas previous devices were
analog (performing calculation by means of measurement),
Babbage's was digital (performing calculation by means of
counting). More importantly, Babbage's design combined
arithmetic and logical functions. (Babbage eventually discovered the new
work on the "algebra of Logic" by De Morgan's friend George Boole--but, by
then, it was too late for Ada.)
Ada, who had been tutored by De Morgan, the foremost logician of his
time, had ideas of her own about the possibilities of what one might do with
such devices. Of Ada's gift for this new type of partially mathematical,
partially logical exercise, Babbage himself noted: "She seems to understand
it better than I do, and is far, far better at explaining it."
At the age of nineteen, Ada married Lord King, Baron of Lovelace. Her
husband was also something of a mathematician, although his talents were far
inferior to those of his wife. The young countess Lovelace continued her
mathematical and computational partnership with Babbage, resolutely
supporting what she knew to be a solid idea, at a time when less-foresighted
members of the British establishment dismissed Babbage as a crank.
Babbage toured the Continent in 1840, lecturing on the subject of the
device he never succeeded in building. In Italy, a certain Count Menabrea
took extensive notes at one of the lectures and published them in
Paris. Ada translated the notes from French to English and composed an
addendum which was more than twice as long as the text she had translated.
When Babbage read the material, he urged Ada to publish her notes in their
entirety.
Lady Lovelace's published notes are still understandable today and are
particularly meaningful to programmers, who can see how truly far ahead of
their contemporaries were the Analytical Engineers. Professor B. H. Newman
in the Mathematical Gazette has written that
her observations "show her to have fully understood the principles
of a programmed computer a century before its time."
Ada was especially intrigued by the mathematical implications of the
punched pasteboard cards that were to be used to feed data and equations to
Babbage's devices. Ada's essay, entitled "Observations on Mr. Babbage's
Analytical Engine," includes more than one prophetic passage, unheeded
by most of her contemporaries, but which have grown in significance with the
passage of a century:
The distinctive characteristic of the Analytical Engine, and that which
has rendered it possible to endow mechanism with such extensive faculties
as bid fair to make this engine the executive right-hand of abstract algebra, is
the introduction into it of the principle which Jacquard devised for
regulating, by means of punched cards, the most complicated patterns in the
fabrication of brocaded stuffs. It is in this that the distinction between
the two engines lies. Nothing of the sort exists in the Difference Engine.
We may say most aptly that the Analytical Engine weaves algebraical
patterns just as the Jacquard loom weaves flowers and leaves. . . .
The bounds of arithmetic were, however, outstepped the moment
the idea of applying cards had occurred; and the Analytical Engine does
not occupy common ground with mere "calculating machines." It holds a
position wholly its own; and the considerations it suggests are most
interesting in their nature. In enabling mechanism to combine together
general symbols, in successions of unlimited variety and extent, a
uniting link is established between the operations of matter and the
abstract mental processes of the most abstract branch of
mathematical science. A new, a vast and a powerful language is developed
for the future use of analysis, in which to wield its truths so that these
may become of more speedy and accurate practical application for the
purposes of mankind than the means hitherto in our possession have
rendered possible. Thus not only the mental and the material, but the
theoretical and the practical in the mathematical world, are brought into
intimate connexion with each other. We are not aware of its being on
record that anything partaking of the nature of what is so well designated
the Analytical Engine has been hitherto proposed, or even thought
of, as a practical possibility, any more than the idea of a thinking or a
reasoning machine.
As a mathematician, Ada was excited about the possibility of automating
laborious calculations. But she was far more interested in the principles
underlying the programming of these devices. Had she not died so young, it
is possible that Ada could have advanced the nineteenth-century state of the
art to the threshold of true computation.
Even though the Engine was yet to be built, Ada experimented with
writing sequences of instructions. She noted the value of several particular
tricks in this new art, tricks that are still essential to modern computer
languages--subroutines, loops and jumps. If your object is to
weave a complex calculation out of subcalculations, some of which may be
repeated many times, it is tedious to rewrite a sequence of a dozen or a
hundred instructions over and over. Why not just store copies of often-used
calculations, or subroutines, in a "library" of procedures for later use?
Then your program can "call" for the subroutine from the library
automatically, when your calculation requires it. Such libraries of
subprocedures are now a part of virtually every high-level programming
language.
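In modern terms, the idea looks something like this (a hypothetical sketch, not Ada's notation): a frequently needed calculation is written once, stored under a name, and "called" wherever it is required.

```python
# A sketch of Ada's "library of subroutines" idea in modern terms:
# an often-used calculation is stored once and called by name.

def mean(xs):
    """A stored subroutine: the arithmetic mean of a list."""
    return sum(xs) / len(xs)

def variance(xs):
    """A larger calculation that 'calls' the stored subroutine."""
    m = mean(xs)  # call the library routine instead of rewriting it
    return sum((x - m) ** 2 for x in xs) / len(xs)

print(variance([2, 4, 4, 4, 5, 5, 7, 9]))  # 4.0
```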
Analytical Engines and digital computers are very good at doing things
over and over many times, very quickly. By inventing an instruction that
backs up the card-reading device to a specified previous card, so that the
sequence of instructions can be executed a number of times,
Ada created the loop--perhaps the most fundamental
procedure in every contemporary programming language.
It was the conditional jump that brought Ada's gifts as a logician into
play. She came up with yet another instruction for manipulating the
card-reader, but instead of backing up and repeating a sequence of cards,
this instruction enabled the card-reader to jump to another card in any part
of the sequence, if a specific
condition was satisfied. The addition of that little "if" to the formerly
purely arithmetic list of operations meant that the program could do more
than calculate. In a primitive but potentially meaningful way, the Engine
could now make decisions.
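Both devices can be sketched together in a toy "card deck" interpreter (a modern illustration with invented operation names, not Babbage's design): a BACK card re-reads earlier cards, producing a loop, while a JUMPIF card sends the reader elsewhere only when a condition on the result holds.

```python
# A toy card-deck interpreter. BACK backs the reader up to an earlier
# card (Ada's loop); JUMPIF skips to another card only when a
# condition is satisfied (Ada's conditional jump).

def run(cards):
    acc, pc = 0, 0  # accumulator and the number of the current card
    while pc < len(cards):
        op, arg = cards[pc]
        if op == "ADD":            # an ordinary arithmetic card
            acc += arg
            pc += 1
        elif op == "JUMPIF":       # jump to `target` when acc >= threshold
            threshold, target = arg
            pc = target if acc >= threshold else pc + 1
        elif op == "BACK":         # back up and re-read from card `arg`
            pc = arg
        elif op == "HALT":
            break
    return acc

deck = [
    ("ADD", 2),           # card 0: add 2 to the accumulator
    ("JUMPIF", (10, 3)),  # card 1: if acc >= 10, jump to card 3
    ("BACK", 0),          # card 2: otherwise loop back to card 0
    ("HALT", None),       # card 3: stop
]
print(run(deck))  # 10
```

Without the little "if" on card 1, the BACK card would send the reader around forever; the conditional jump is what lets the machine decide when the repetition is done.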
She also noted that machines might someday be built with capabilities far
beyond those possible with Victorian era technology, and speculated about
the possibility of whether such machines could ever achieve intelligence.
Her argument against artificial intelligence, set forth in her
"Observations," was immortalized almost a century later by another software
prophet, Alan Turing, who dubbed her line of argument "Lady Lovelace's
Objection." It is an opinion that is still frequently heard in debates about
machine intelligence: "The Analytical Engine," Ada wrote, "has no
pretensions whatever to originate anything. It can do whatever we know how
to order it to perform."
It is not known how and when Ada became
involved in her clandestine and disastrous gambling ventures. No
evidence has ever been produced that Babbage had anything to do with
introducing Ada to what was to be her lifelong secret vice. For a time, Lord
Lovelace shared Ada's obsession, but after incurring significant losses he
stopped. She continued, clandestinely.
Babbage became deeply involved in Ada's gambling toward the end of her
life. For her part, Ada helped Babbage in more than one scheme to raise
money to construct the Analytical Engine. It was a curious mixture of vice,
high intellectual adventure, and bizarre entrepreneurship. They built a
tic-tac-toe machine, but gave up on it as a moneymaking venture when an
adviser assured them that P. T. Barnum's General Tom Thumb had sewn up the
market for traveling novelties. Ironically, although Babbage's game-playing
machines were commercial failures, his theoretical approach created a
foundation for the future science of game theory, scooping even that
twentieth-century genius John von Neumann by about a hundred years.
It was Charley and Ada's attempt to develop an infallible system for
betting on the ponies that brought Ada to the sorry pass of twice pawning
her husband's family jewels, without his knowledge, to pay off blackmailing
bookies. At one point, Ada and Babbage--never one to turn down a crazy
scheme--used the existing small scale working model of the Difference Engine
to perform the calculations required by their complex handicapping scheme.
The calculations were based on sound approaches to the theory of
handicapping, but as the artificial intelligentsia were to learn over a
century later, even the best modeling programs have trouble handling truly
complex systems. They lost big. To make matters worse, when she compounded
her losses Ada had to turn to her mother, who was not the most forgiving
soul, to borrow the money to redeem the Lovelace jewels before her husband
could learn of their absence.
Ada died of cancer at the age of thirty-six. Babbage outlived her by
decades, but without Ada's advice, support, and sometimes stern guidance, he
was not able to complete his long-dreamed-of Analytical Engine. Because the
toolmaking art of his day was not up to the tolerance demanded by his
designs, Babbage pioneered the use of diamond-tipped tools in
precision-lathing. In order to systematize the production of components for
his Engine, he devised methods to mass-manufacture interchangeable parts and
wrote a classic treatise on what has since become known as "mass
production."
Babbage wrote books of varying degrees of coherence, made breakthroughs
in some sciences and failed in others, gave brilliant and renowned dinner
parties with guests like Charles Darwin, and seems to have ended up totally
embittered. Bowden noted that "Shortly before Babbage died he told a friend
that he could not remember a single completely happy day in his life: 'He
spoke as if he hated mankind in general, Englishmen in particular, and the
English Government and Organ Grinders most of all.'"
While Ada Lovelace has been unofficially known to the inner circles of
programmers since the 1950s, when card-punched batch-processing was not
altogether different from Ada's kind of programming, she has been relatively
unknown outside those circles until recently. In the 1970s, the U.S.
Department of Defense officially named its "superlanguage" after her.
George Boole
Although it came too late to assist in the original design of
the Analytical Engine, yet another discovery that was to later become
essential to the construction of computers was made by a contemporary of
Babbage and Lovelace. The creation of an algebra of symbolic logic was the
work of another mathematical prodigy and British individualist, but one who
worked and lived in a different world, far away from the parlors of
upper-class London.
A seventeen-year-old Englishman by the name of George Boole was struck by an
astonishing revelation while walking across a meadow one day in 1832. The
idea came so suddenly, and made such a deep impact on his life, that it led
Boole to make pioneering if obscure speculations about a heretofore
unsuspected human facility that he called "the unconscious." Boole's
contribution to human knowledge was not to be in the field of psychology,
however, but in a field of his own devising. As Bertrand Russell remarked
seventy years later, Boole invented pure
mathematics.
Although he had only recently begun to study mathematics, the teenage
George Boole suddenly saw a way to capture some of the power of human reason
in the form of an algebra. And Boole's equations actually worked when they
were applied to logical problems. But there was a problem, and it wasn't in
Boole's concept. The problem, at the time, was that nobody cared. Partly
because he was from the wrong social class, and partly because most
mathematicians of his time knew very little about logic, Boole's eventual
articulation of this insight didn't cause much commotion when he published
it. His revelation was largely ignored for generations after his death.
When the different parts of computer technology converged unexpectedly a
hundred years later, electrical engineers needed mathematical tools to make
sense of the complicated machinery they were inventing. The networks of
switches they created were electrical circuits whose behavior could be
described and predicted by precise equations. Because patterns of electrical
pulses were now used to encode logical operations like "and," "or," and the
all-important "if," as well as the calculator's usual fare of "plus,"
"minus," "multiply," and "divide," there arose a need for equations to
describe the logical properties of computer circuits.
Ideally, the same set of mathematical tools would work for both
electrical and logical operations. The problem of the late 1930s was that
nobody knew of any mathematical operations that had the power to describe
both logical and electrical networks. Then the right kind of mind looked in
the right place. An exceptionally astute graduate student at MIT named
Claude Shannon, who later invented information theory, found Boole's algebra
to be exactly what the engineers were looking for.
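The fit Shannon noticed can be shown in a few lines (a modern sketch, not the notation of his thesis): switches wired in series compute "and," switches wired in parallel compute "or," so an identity of Boole's algebra is also a statement that two differently wired switch networks behave identically.

```python
# A network of switches is a Boolean function, so circuit identities
# can be checked as algebra. Series wiring = AND; parallel wiring = OR.
from itertools import product

series = lambda a, b: a and b    # current flows only if both switches close
parallel = lambda a, b: a or b   # current flows if either switch closes

# Claim: switch a in series with the parallel pair (b, c) behaves like
# the parallel pair of (a in series with b) and (a in series with c):
#     a(b + c) = ab + ac   -- a Boolean distributive law.
for a, b, c in product([False, True], repeat=3):
    assert series(a, parallel(b, c)) == parallel(series(a, b), series(a, c))
print("the two networks agree on every input")
```

The right-hand network uses one more switch, so the identity also tells an engineer which wiring is cheaper, which is exactly the kind of question Shannon's application of Boole answered.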
Without Boole, a poverty-stricken, self-taught mathematics teacher who
was born the same year as Ada, the critical link between logic and
mathematics might never have been accomplished. While the Analytical Engine
was an inspiring attempt, it had remarkably little effect on the later
thinkers who created modern computers. Without Boolean algebra, however,
computer technology might never have progressed to the electronic
speeds where truly interesting computation becomes possible.
Boole was right about the importance of his vision, although he wouldn't
have known what to do with a vacuum tube or a switching circuit if he saw
one. Unlike Babbage, Boole was not an engineer. What Boole discovered in
that meadow and worked out on paper two decades later was destined to become
the mathematical linchpin that coupled the
logical abstractions of software with the physical operations of electronic
machines.
Between them, Babbage's and Boole's inspirations can be said to
characterize the two different kinds of
motivation that led imaginative minds over the centuries to try and
eventually to succeed in building a computer. On the one side are scientists
and engineers, who would always yearn for a device to take care of tedious
computations for them, freeing their thoughts for the pursuit of more
interesting questions. On the other side is the more abstract desire of the
mathematical mind to capture the essence of human reason in a set of
symbols.
Ada, who immediately understood Babbage's models when she saw them, and
who was tutored by De Morgan, the one man in the world best equipped to
understand Boole, was the first person to speculate at any length about the
operations of machines capable of performing logical as well as numerical
operations. Boole's work was not published until after Lady Lovelace died.
Had Ada lived but a few years longer, her powerful intuitive grasp of the
principles of programming would have been immeasurably enhanced by the use
of Boolean algebra.
Babbage and Lovelace were British aristocrats during the height of the
Empire. Despite the derision heaped on Babbage in some quarters for his
often-peculiar public behavior, he counted the Duke of Wellington, Charles
Dickens, and Prince Albert among his friends. Ada had access to the best
tutors, the finest laboratory equipment, and the latest books. They were
both granted the leisure to develop their ideas and the privilege of making
fools of themselves before the Royal Society, if they desired.
Boole was the son of a petty shopkeeper, which wasn't the best route to a
good scientific education. At the age of sixteen, his family's precarious
financial situation obliged Boole to secure modest employment as a
schoolteacher. Faced with the task of teaching his students something about
mathematics, and by now thoroughly Lincolnesque in his self-educating
skills, Boole set out to learn mathematics. He soon learned that it was the
most cost-effective intellectual endeavor for a man of his means, requiring
no laboratory equipment and a fairly small number of basic books. At
seventeen he experienced the inspiration that was to result in his later
work, but he had much to learn about both mathematics and logic before he
was capable of presenting his discovery to the world.
At the age of twenty he discovered something that the greatest
mathematicians of his time had missed--an algebraic theory of invariance
that was to become an indispensable tool for Einstein when he formulated the
theory of relativity. In 1849, after his long years as an elementary-school
teacher, Boole's mathematical publications brought him an appointment as
professor of mathematics at Queen's College,
Cork, Ireland. Five years later, he published An Investigation of the
Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and
Probabilities.
Formal logic had been around since the time of the Greeks, most widely
known in the syllogistic form perfected by Aristotle, of which most people
learn no more than the simplified example: "All men are mortal. Socrates is a
man. Therefore Socrates is mortal." After thousands of years in the same
form, Aristotelian logic seemed doomed to remain on the outer boundaries of
the metaphysical, never to break through into the more concretely specified
realm of the mathematical, because it was still just a matter of
words. The next level of symbolic precision was missing.
For over a thousand years, the only logic-based system that was
expressible in symbols rigorous and precise enough to be called
"mathematical" had been the geometry set down by Euclid. Just as Euclid set
down the basic statements and rules of geometry in axioms and theorems about
spatial figures, Boole set down the basics of logic in algebraic symbols.
This was no minor ambition. While knowledge of geometry is a widely useful
tool for getting around the world, Boole was
convinced that logic was the key to human reason itself. He knew that
he had found what every metaphysician from Aristotle to Descartes had
overlooked. In his first chapter, Boole wrote:
1. The design of the following treatise is to investigate the
fundamental laws of those operations of the mind by which reasoning is
performed; to give expression to them in the symbolic language of a
Calculus, and upon this foundation to establish a science of Logic and
construct its method . . . to collect from the various elements of truth
brought to view in the course of these inquiries some probable intimations
concerning the nature and constitution of the human mind. . . .
2. . . . To enable us to deduce correct inferences from given premises
is not the only object of logic . . . these studies have also an interest
of another kind, derived from the light which they shed on the
intellectual powers. They instruct us concerning the mode in which
language and number serve as instrumental aids to the process of
reasoning; they reveal to some degree the connexion between different
powers of our common intellect; they set before us . . . the essential
standards of truth and correctness--standards not derived from without,
but deeply founded in the constitution of the human faculties . . . To
unfold the secret laws and relations of those high faculties of thought by
which all beyond the merely perceptive knowledge of the world and of
ourselves is attained or matured, is an object which does not stand in
need of commendation to a rational mind.
Although his discovery had profound consequences in both pure mathematics
and electrical engineering, the most important elements of Boole's algebra
of logic were simple in principle. He used the algebra everybody learns in
school as a starting point, made several small but significant exceptions to
the standard rules of algebraic combination, and used his special version to
precisely express the syllogisms of classical logic.
The concept Boole used to connect the two heretofore different thinking
tools of logic and calculation was the idea of a mathematical system in
which there were only two quantities, which he called "the Universe" and
"Nothing" and denoted by the signs 1 and 0.
Although he didn't know it at the time, Boole had
invented a two-state system for quantifying logic that also happened to be a
perfect method for analyzing the logic of two-state physical devices like
electrical relays or vacuum tubes.
By using the symbols and operations specified, logical propositions could
be reduced to equations, and the syllogistic conclusions could be computed
according to ordinary algebraic rules. By applying purely mathematical
operations, anyone who knew Boolean algebra could discover any conclusion
that was logically contained in any set of specified premises.
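The reduction of a syllogism to pure computation can be sketched in a few lines of Python. This is a modern illustration, not Boole's own notation: conjunction becomes multiplication on {0, 1}, negation becomes subtraction from 1, and the Socrates syllogism is checked by enumerating every 0/1 assignment, just as the text describes conclusions being "computed according to ordinary algebraic rules."

```python
from itertools import product

# Boole's two quantities: 1 ("the Universe") and 0 ("Nothing").
def AND(x, y): return x * y                     # conjunction as multiplication
def NOT(x):    return 1 - x                     # negation as subtraction from 1
def IMPLIES(x, y): return NOT(AND(x, NOT(y)))   # fails only when x = 1 and y = 0

# "All men are mortal. Socrates is a man. Therefore Socrates is mortal."
# Instantiated at Socrates: premise 1 is man -> mortal, premise 2 is man = 1.
valid = all(
    mortal == 1                                   # the conclusion must hold...
    for man, mortal in product((0, 1), repeat=2)  # ...in every 0/1 assignment
    if IMPLIES(man, mortal) == 1 and man == 1     # ...satisfying both premises
)
print(valid)  # True: the conclusion follows by arithmetic alone
```

The point of the sketch is that no step involves judgment about meaning: once the premises are written as equations over 0 and 1, the conclusion falls out mechanically.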
Because syllogistic logic so closely resembles the thought processes of
human reasoning, Boole was convinced that his algebra not only demonstrated
a valid equivalence between mathematics and logic, but also represented a
mathematical systemization of human thought. Since Boole's time, science has
learned that the human instrument of reason is far more complicated,
ambiguous, unpredictable, and powerful than the tools of formal logic. But
mathematicians have found that Boole's mathematical logic is much more
important to the foundation of their enterprise than they first suspected.
And the inventors of the first computers learned that a simple system with
only two values can weave very sophisticated computations indeed.
The construction of a theoretical bridge between mathematics and logic
had been gloriously begun, but was far from completed by Boole's work. It
remained for later minds to discover that although it is probably not true
that the human mind resembles a machine, there is still great power to be
gained by thinking about machines that resemble the operations of the mind.
Nineteenth-century technology simply wasn't precise enough, fast enough,
or powerful enough for ideas like those of Babbage, Lovelace, and Boole to
become practicalities. The basic science and the industrial capabilities
needed for making several of the most important components of modern
computers simply didn't exist. There were still important problems that
would have to be solved by the inventors rather than the theorists.
The next important development in the history of computation, and the
last important contribution of the nineteenth century, had nothing to do
with calculating tables of logarithms or devising laws of thought. The next
thinker to advance the state of the art was
Herman Hollerith, a nineteen-year-old employee of the United
States Census Office. His role would have no effect on the important
theoretical foundations of computing. Ultimately, his invention became
obsolete. But his small innovation eventually grew into the industry that
later came to dominate the commercial use of computer technology.
Hollerith made the first important American contribution to the evolution
of computation when his superior at the Census Office set him on a scheme
for automating the collection and tabulation of data. At his superior's
suggestion, he worked out a system that used
cards with holes punched in them to feed information into an electrical
counting system.
The 1890 census was the point in history where the processing of
data as well as the calculation of mathematical equations became the
object of automation. As it turned out, Hollerith was neither a
mathematician nor a logician, but a data processor. He was grappling, not
with numerical calculation, but with the complexity of collecting, sorting,
storing, and retrieving a large number of small items in a collection of
information. Hollerith and his colleagues were unwitting forerunners of
twentieth-century information workers, because their task had to do with
finding a mechanical method to keep track of what their organization knew.
Hollerith was introduced to the task by his superior, John Shaw Billings,
who had been worrying about the rising tide of information since 1870, when
he was hired by the Census Office to develop new ways to handle large
amounts of information. Since he was in charge of the collection and
tabulation of data for the 1880 and 1890 censuses, Billings was acutely aware
that the growing population of the nation was straining the ability of the
government to conduct the constitutionally mandated survey every ten years.
In the foreseeable future, the flood of information to be counted and sorted
would take fifteen or twenty years to tabulate!
Like the stories about the origins of other components of computers,
there is some controversy over who deserves credit for the invention of
the punched-card system. One account by a man named Willcox, who worked with
both Billings and Hollerith in the Census Office, stated:
While the returns of the Tenth (1881) Census were being tabulated at
Washington, Billings was walking with a companion through the office in
which hundreds of clerks were engaged in laboriously transferring items of
information from the schedules to the record sheets by the slow and
heartbreaking method of hand tallying. As they were watching the clerks he
said to his companion, 'There ought to be some mechanical way of doing
this job, on the principle of the Jacquard loom, whereby holes in a card
can regulate the pattern to be woven.' The seed fell on good ground. His
companion was a talented young engineer in the office who first convinced
himself that the idea was practicable and then that Billings had no desire
to claim or use it.
The "talented young engineer," of course, was Hollerith, who wrote this
version in 1919:
One Sunday evening at Dr. Billings' tea table, he had said to me that
there ought to be a machine for doing the purely mechanical work of
tabulating population and similar statistics. We talked the matter over
and I remember . . . he thought of using cards with the description of the
individual shown by notches punched in the edge of the card. . . . After
studying the problem I went back to Dr. Billings and said that I thought I
could work out a solution for the problem and asked him if he would go in
with me. The Doctor said that he was not interested any further than to
see some solution of the problem worked out.
The system Hollerith put together used holes punched in designated
locations on cardboard cards to represent the demographic characteristics of
each person interviewed. Like Jacquard's and Babbage's cards, and the
"player pianos" then in vogue, the holes in Hollerith's cards were meant to
allow the passage of mechanical components. Hollerith used an
electromechanical counter in which copper brushes closed an electrical
circuit wherever a hole was present and left the circuit open wherever
there was none.
An electrically activated mechanism increased the running count in each
category by one unit every time the circuit for that category was closed. By
adding sorting devices that distributed cards into various bins, according
to the patterns of holes and the kind of tabulation desired,
Hollerith not only created the ability to keep up with
large amounts of data, but created the ability to ask new and more
complicated questions about the data. The new system was in place in
time for the 1890 census.
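The counting-and-sorting cycle described above can be sketched briefly in Python. The category names and column positions here are invented for illustration; Hollerith's actual card layout was different. Each card is modeled as the set of positions where holes are punched, a closed circuit becomes a membership test, and the sorter routes cards into bins keyed by their hole patterns.

```python
from collections import Counter, defaultdict

# Hypothetical column assignments for this sketch only.
COLUMNS = {"male": 0, "female": 1, "farmer": 2, "foreign_born": 3}

# Each card is the set of punched hole positions.
cards = [
    {0, 2},   # male farmer
    {1},      # female
    {0, 3},   # male, foreign-born
]

tallies = Counter()
bins = defaultdict(list)

for card in cards:
    for category, position in COLUMNS.items():
        if position in card:        # the brush passes through the hole,
            tallies[category] += 1  # closing the circuit: count goes up by one
    # the sorter drops the card into a bin matching its hole pattern
    bins[frozenset(card)].append(card)

print(tallies["male"])  # 2
```

Sorting into bins is what made new questions possible: once cards sharing a hole pattern sit together, they can be run through the counters again against a different set of categories.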
Hollerith obtained a patent on the system that he had invented just in
time to save the nation from drowning in its own statistics. In 1882-83, he
was an instructor in mechanical engineering at the Massachusetts Institute
of Technology, establishing the earliest link between that institution and
the development of computer science and technology. In 1896, Hollerith set
up the "Tabulating Machine Company" to manufacture both the cards and the
card-reading machines. In 1900, Hollerith rented his equipment to the Census
Bureau for the Twelfth Census.
Some years later, Hollerith's Tabulating
Machine had become an institution known as "International Business
Machines," run by a fellow named Thomas Watson, Senior. But there
were two World Wars ahead, and several more thinkers--the most extraordinary
of them all--still to come before a manufacturer of tabulating machines and
punch cards would have anything to do with true computers. The modern-day
concerns of this company--selling machines to keep track of the information
that goes along with doing business--would have to wait for some deadly
serious business to be transacted.
The War Department, not the Census Office or a business machine company,
was the mother of the digital computer, and the midwives were many--from
Alan Turing's British team who needed a special kind of computing device to
crack the German code, to John von Neumann's mathematicians at Los Alamos
who were faced with the almost insurmountable calculations involved in
making the atomic bomb, to Norbert Wiener's researchers who were inventing
better and faster ways to aim antiaircraft fire, to the project of the Army
Ballistic Research Laboratory that produced the Electronic Numerical
Integrator and Calculator (ENIAC).
It would be foolish to speculate about what computers might become in the
near future without realizing where they originated in the recent past. The
historical record is clear and indisputable on this point: ballistics begat
cybernetics. ENIAC, the first electronic digital computer, was originally
built in order to calculate ballistic firing tables. When ENIAC's inventors
later designed the first miniature computer, it was the BINAC, a device
small enough to fit in the nose cone of an ICBM and smart enough to navigate
by the position of the stars.
Although the first electronic digital computer was constructed in order
to produce more accurate weapons, the technology would not have been
possible without at least one important theoretical breakthrough that had
nothing to do with ballistics or bombs. The
theoretical origins of computation are to be found, not in the search for
more efficient weaponry, but in the quest for more powerful and elegant
symbol systems.
The first modern computer was not a machine. It wasn't even a blueprint.
The digital computer was conceived as a symbol
system--the first automatic symbol system --not as a tool or a
weapon. And the person who invented it was not concerned with ballistics or
calculation, but with the nature of thought and the nature of machines.