When Computers Stopped Being Human

The Information Age began inauspiciously in 1812 in England. Charles Babbage, tired of the tedious logarithms he was calculating, had drifted off into daydreams. In the next moment, he would have the idea that would launch the Information Age. This is his story.

Sitting at his desk, he began to wonder: what could happen if you used a machine to do math instead of humans? The spark of insight would send him on a lifelong journey that led to the first mechanical computers, a century before electronic ones. Moreover, his work would be picked up by later pioneers, still within the nineteenth century, who would for the first time combine a computing device with the newly discovered Boolean logic, a momentous step that laid the foundation of modern computing.

This is the sequel to part one, which covers the age before Babbage, when Computer was a job title held by humans.

Automatons and Almanacs Inspire Babbage

Computation wasn’t just useful for ship captains, artillerymen, and bankers. Charles Babbage, born in 1791, would find himself caught up in calculating everything from postal rates to life expectancy, and even an accounting of broken glass windows. The insight to apply machines to computation may have taken root early in Babbage’s life. As a 10-year-old, he was exposed to the burgeoning automaton industry and was struck by the mechanical automaton toys exhibited by John Joseph Merlin, at the time a famous curator of automatons. The memory left such a mark on Babbage that he bought one of Merlin’s automatons at auction as an adult. Babbage also knew of the Jacquard loom, an instrument that would stir the imagination of not just him but also later pioneers like John von Neumann.

One evening circa 1812, Babbage found himself lost in thought while looking at a table of logarithms (exactly the kind of table used to construct The Nautical Almanac from part 1). A colleague walked in and snapped him out of his daze by asking, “Well, Babbage, what are you dreaming about?” In his autobiography, Babbage recalls pointing to the logarithms and replying, “I am thinking that all these tables might be calculated by machinery.”

This singular moment is arguably the beginning of the Information Age. Babbage, perhaps sensing this, wrote to a colleague in 1822:

“I will yet venture to predict, that a time will arrive, when the accumulating labour which arises from the arithmetical application of mathematical formulae, acting as a constantly retarding force, shall ultimately impede the useful progress of the science, unless this or some equivalent method is devised for relieving it from the overwhelming incumbrance of numerical detail.”

By this point, Babbage was well aware of the work de Prony had done earlier in France, and the great cost de Prony had incurred with his roughly 90 human computers.

So what motivated Babbage and, more importantly, his supporters in the government? On the surface, one would think it was the pure cost of computing. Isn’t that why we automate things, to save time and money? That might be only part of the answer, though. In many cases there was an even higher, hidden cost: the cost of an error. Think about it: a minor mistake in calculating latitude might be imperceptible on paper, but deadly in treacherous waters during a storm. Removing the “human element” from the loop cuts this Gordian knot.

The First Mechanical Computer

Still in 1822, Babbage would make Difference Engine 0, a sort of “minimum viable product” that could calculate two orders of differences (e.g. x² + x + 41). Author David Alan Grier writes, “Babbage started with a geared adding mechanism, originally developed by Blaise Pascal in 1643, improved the design, and cascaded the devices so that the results of one addition would be fed to the next.”
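To make the cascading concrete, here is a minimal sketch in Python (my illustration, not anything from Babbage’s notes) of how a table of x² + x + 41 is built up from two orders of differences using nothing but additions:

```python
# A modern sketch (not Babbage's notation) of two orders of differences for
# f(x) = x^2 + x + 41. The second difference of a 2nd-degree polynomial is a
# constant 2, so every new table entry needs only two cascaded additions.
value, first_diff, SECOND_DIFF = 41, 2, 2   # f(0), f(1)-f(0), constant
table = [value]
for _ in range(5):
    value += first_diff          # one geared addition produces the next entry
    first_diff += SECOND_DIFF    # the result of one addition feeds the next
    table.append(value)
print(table)   # [41, 43, 47, 53, 61, 71] == [x*x + x + 41 for x in range(6)]
```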

In 1823, Babbage secured funding for a larger mechanical computer intended to operate on 20-digit numbers and 6th-order differences. Ada Lovelace, a colleague of Babbage and arguably the first software engineer, recalled, “We both went to see the thinking machine last Monday. It raised several numbers to the 2nd and 3rd powers, and extracted the root of a Quadratic equation.” Lovelace had so many important realizations about early software design that we’ll cover her in a forthcoming edition of Buried Reads Engineering dedicated to her contributions.

Babbage would never successfully build a full-scale working Difference Engine 1, and so his grant from the government went unfulfilled. He blamed the failure on his machinist, and it leaves us wondering how history would be different if he had had a nineteenth-century hardware savant like Steve Wozniak as a collaborator.

Coming Back to the Problem

Still dreaming of a successful, generalized machine for computation, he designed the Difference Engine 2 by 1849. It was intended to operate on numbers of up to 31 digits and differences of up to 7th order. Babbage was also beginning to think about an approach that went beyond tabulating polynomials. In his book “The Information”, James Gleick credits Lovelace as being an even better spokesperson than Babbage for the emerging field of computer science:

The science of operations […] is a science of itself and has its own abstract truth and value; just as logic has its own peculiar truth and value, independently of the subjects to which we may apply its reasonings and processes.

Babbage would write, “It is the science of calculation—which becomes continually more necessary at each step of our progress, and which must ultimately govern the whole of the application of science to the arts of life.”

How It Worked

The engine was operated by a hand crank on one side, while eight human-height cylinders called registers turned against smaller gears to carry out addition. The first seven registers held the seven orders of differences the engine could handle, and the eighth stored the result. Viewed from the back, rotating hooks that look like spinning DNA helices govern the carrying of digits across columns. The final step used a stereotype printing apparatus to record the result, avoiding any errors from transcription.
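Here is how the same idea scales up to the full register layout described above. This is a software analogy under my own assumptions, not a description of the actual mechanism: the difference registers are loaded once from the opening values of the table, and every crank afterwards is a cascade of additions rippling down the columns.

```python
# A software analogy (my assumptions, not Babbage's mechanism) for the register
# model described above: seven difference registers plus a result register.
# Setting up the registers happens once; after that, each crank is nothing but
# a cascade of additions.

def initial_differences(f, order):
    """Build the opening row of f's difference table from f(0)..f(order)."""
    row = [f(x) for x in range(order + 1)]
    leading = []
    while len(row) > 1:
        leading.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    leading.append(row[0])
    return leading[0], leading[1:]        # result register, difference registers

def crank(result, diffs):
    """One turn of the crank: ripple the additions down the columns."""
    result += diffs[0]                    # first difference feeds the result
    for i in range(len(diffs) - 1):
        diffs[i] += diffs[i + 1]          # each column feeds the one below it
    return result, diffs

f = lambda x: x**5 - 3 * x**2 + 7         # a made-up 5th-order polynomial
result, diffs = initial_differences(f, 7) # seven difference registers, as in the engine
for x in range(1, 6):
    result, diffs = crank(result, diffs)
    assert result == f(x)                 # the cascade reproduces f exactly
print("tabulated f(1)..f(5) by additions alone")
```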

The Difference Engine was never fully built in Babbage’s lifetime. Only in the late 1980s and early 1990s were his designs resurrected and built. You can see a working Difference Engine 2 at the Computer History Museum in Mountain View, California. There is also an excellent video demonstrating its use, in which the operator evaluates a 5th-order polynomial in about four crank revolutions.

Babbage’s Legacy

During Babbage’s life the Difference Engine took physical form in a few prototypes, but remained mostly a set of designs on paper. To hear it from historians like James Gleick, “Babbage’s engine was forgotten. It vanished from the lineage of invention.” From their accounting, we are left to believe that Babbage’s ideas were ignored for 100 years, until well into the dawn of electronic computers. The respected mathematician and computer scientist Richard Hamming taught a similar line of thought: “The machine was never completed. It died. It was pretty well lost until we built quite a few machines and found out they were anticipated fifty, almost a hundred years, before.”

In fact, there was no “Dark Ages of computing” after Babbage. His work was almost immediately picked up by Georg and Edvard Scheutz, who, inspired by Babbage, created a working difference engine in 1843. In yet another example of mechanical computers blossoming after Babbage, Charles Xavier Thomas de Colmar’s Arithmometer was popular and manufactured continuously from 1851 to 1915.

William Jevons, an English economist and logician, would fawn over Babbage in 1869: “It was reserved for the profound genius of Mr. Babbage to make the greatest advance in mechanical calculation, by embodying in a machine the principles of the calculus of differences.” Jevons proceeded to create a logic piano, combining Boolean logic with mechanical computing for the first time, a crucial leap. Jevons’s work would in turn be discovered and carried on by Allan Marquand and Charles Peirce, who designed an electrically driven logic machine around 1890.

Babbage’s work would also guide twentieth-century computing pioneers. He influenced computer designer Howard Aiken in the late 1930s; Aiken in turn was a crucial influence on von Neumann and his computing architecture, which is largely still the design by which computers are built today. We also know that Babbage’s work came up in conversation at Bletchley Park, where the Colossus computer, the machine that helped break the German Lorenz cipher, was built during World War II.

Far from tragic failures, Babbage and Ada Lovelace are heroic figures in the history of computing and software.

What’s Next

At the same time mechanical computer hardware was taking off, another transformation was underway. The invention of the telegraph was connecting the continental United States and an effort was afoot to link Europe and the U.S. with an ambitious undersea cable. A worldwide telegraphy network would get built before the close of the nineteenth century.

Credits and More Reading

As was the case in our last issue, James Gleick’s The Information was an invaluable resource. Steven Johnson is a fantastic source for understanding How We Got to Now, as well as the twists and turns in Wonderland: How Play Made the Modern World.

Special thanks to Jason Rowley for editing this issue. Besides regularly writing for Crunchbase News, he also writes a newsletter, the Rowley Report, sampling great reads he comes across. Max Grigorev also helped with both the research and writing of this issue.

How Computing Came About, part 1

This week, I am kicking off by sharing a story I didn’t learn until I had worked in the software industry for 20 years: how computing came about. Many of us know the story of Bill Gates and Paul Allen creating Microsoft, or of Steve Jobs licensing Xerox PARC’s technology to bring the GUI to the masses. Too few of us know about their distant forerunners: Nevil Maskelyne, Charles Babbage, Claude Shannon, Alan Turing, or John von Neumann.

The Original Computers

Before electronic computers, Computer was a job title held by humans.

There is a rich history of humans as computers dating back to at least Mesopotamia in 3000 BC. Donald Knuth, who is famous for The Art of Computer Programming, chronicles the story in his research paper Ancient Babylonian Algorithms. We know the Babylonians were working with algorithms, because they wrote them down on tablets. Knuth writes, “they represented each formula by a step-by-step list of rules for its evaluation. In effect, they worked with a ‘machine language’ representation of formulas instead of a symbolic language. The calculations described in Babylonian tablets are not merely the solutions to specific individual problems: they actually are general procedures for solving a whole class of problems.”

The Babylonians gave example numbers in their calculations, a lineage we can thank anytime we use a REPL. They did not explore all possible avenues, though; Knuth found no use of control flow or iteration. It would take time for algorithms and computation to develop more fully.
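To get a feel for the “machine language” style Knuth describes, here is a hypothetical illustration in Python (an invented example, not a transcription of any tablet): a fixed, step-by-step recipe worked through with example numbers, with no branches and no loops.

```python
# A hypothetical straight-line procedure in the Babylonian style (invented example,
# not from any tablet): every step applies one operation to previously computed
# values, with concrete example numbers and no control flow.
length, width, depth = 4.0, 3.0, 2.5   # the givens of a rectangular cistern
floor_area = length * width            # step 1: multiply length by width
volume = floor_area * depth            # step 2: multiply the result by the depth
perimeter = 2 * (length + width)       # step 3: add length and width, then double
wall_area = perimeter * depth          # step 4: multiply by the depth
print(volume, wall_area)               # 30.0 and 35.0; "this is the procedure"
```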

Organizing Human Computers

Computing would have to become more crucial and more expensive before humans would be motivated enough to outsource computation as a job. The seeds were planted centuries ago.

As far back as 1497, sailors were calculating their latitude from the North Star, but calculating longitude required delicate timekeeping devices that would be damaged on a ship rocking at sea. The British government passed the Longitude Act of 1714, offering rewards to encourage entrepreneurs and scientists to devise a means of calculating longitude.

Nevil Maskelyne, an astronomer and an appointee to England’s Board of Longitude, wrote The British Mariner’s Guide, which laid out how you could calculate longitude by lunar distance. The problem then became knowing where the moon would be on any given day. Thanks to Kepler and Newton, this was largely a matter of math.

Tabulating the Heavens tells the story: “The Nautical Almanac and Astronomical Ephemeris, was to be published annually giving data relating to a particular year. It contained tables of lunar distances tabulated for every three hours of every day. It also contained other astronomical data of use to both the navigator and the astronomer.”

Maskelyne organized a network of at least 35 human computers throughout England, each working from home on their own timetable. Computing started as a literal cottage industry. Maskelyne did not send each computer all the equations that underpinned the almanac; rather, he sent them a list of steps for extracting data from several tables. A typical calculation involved 12 table lookups and about 14 arithmetic operations on eight-digit numbers. Each month’s table involved up to 1,365 cells to be calculated.

The almanac was used by several famous explorers, including Captain Cook during his voyage to New Zealand in 1768 and again in 1772.

The work of Isaac Newton, Edmond Halley (not discussed here, but also a pioneering computer), and later Nevil Maskelyne was not without its critics. The famous book Gulliver’s Travels is, in part, a satire of the perceived arrogance of these mathematical astronomers.

The French Angle

The story of distributing computation across humans doesn’t stop with Maskelyne’s almanac. He would encounter and influence another crucial computing pioneer. That story picks up on the other side of the English Channel, in revolutionary France. In 1791, the new regime wanted to shed all memory of the “old” way of doing things, and the Académie des Sciences decided to create a new system of weights and measures. The new units were all based on the metric principle: they would be related by factors of ten, with 10 centimeters in a decimeter, 10 decimeters in a meter, and so on. One of the units that needed addressing was angle measurement. Under the new system, a right angle would no longer have 90 degrees. Instead, it would be split into exactly one hundred new units, called grades.

If the new angle units were to be successful, someone had to publish trigonometric tables in book form. The Académie des Sciences wanted the work done, and the job fell on the shoulders of Gaspard de Prony, director of the Bureau du Cadastre, the French land registry.

De Prony quickly realized that a single person wouldn’t even come close to completing the work before the deadline. Fortunately, he had met Maskelyne and was aware of his work.

Moreover, as was common in his social circle at the time, de Prony was a big fan of Adam Smith and The Wealth of Nations. De Prony bragged that he “could manufacture logarithms as easily as one manufactures pins,” a nod to Adam Smith’s theory of the division of labor.

The computation was organized in three levels:

The first level consisted of five or six high-ranking mathematicians, including such paragons as Adrien-Marie Legendre (of Legendre polynomial and Legendre transform fame) and Lazare Carnot. This group had nothing to do with the actual computations; their job was to oversee the process and to choose the analytical approximation formulas. The resulting equations contained only additions, subtractions, and multiplications, which meant that even people with rudimentary arithmetic skills could master them.

The second level had six to eight lesser mathematicians, whom de Prony called “planners”. Planners had two jobs. First, they used the formulas devised by the high-ranking mathematicians to create worksheets. The sheets had exact instructions for a simple computation on one side and a table of input values on the other, followed by the blanks to be computed and filled in. The second job, called “differencing”, involved looking at the differences between adjacent values of a given function. Because trigonometric functions are smooth, successive values in the table change gradually, and so do their differences; an isolated error stands out. This allowed planners to check the results of a computation without redoing it (see the sketch below).

The third group, the actual computers, up to 90 in number, were the ones filling in the worksheets prepared for them by the planners. As the computations were trivial, anyone with a basic knowledge of arithmetic and decent mechanical skills could be a computer. In fact, most of de Prony’s computers were… out-of-work hairdressers! As a result of the revolution, the demand for powdered wigs had collapsed, and many well-educated coiffeurs were looking for something new to do.
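The differencing check is easy to see in code. Here is a small sketch in Python, an illustration of the idea rather than the planners’ exact procedure; it uses second differences, which make an isolated error stand out even more sharply:

```python
# An illustration of the planners' "differencing" check (my own sketch, not their
# exact procedure). For a smooth function tabulated at regular steps, successive
# differences vary gradually, so one miscomputed entry stands out as a spike.
import math

step = math.radians(0.1)                              # a finely spaced sine table
table = [math.sin(k * step) for k in range(200)]      # correctly computed values
table[120] += 0.001                                   # one computer slips a digit

# Second differences of a smooth table are tiny; around the bad entry they jump.
second_diffs = [table[k] - 2 * table[k + 1] + table[k + 2] for k in range(len(table) - 2)]
suspects = [k for k, d in enumerate(second_diffs) if abs(d) > 10 * step ** 2]
print(suspects)   # indices cluster around 120, flagging the bad worksheet entry
```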

There is an interesting parallel here: de Prony’s planners were essentially translating mathematical equations into instructions for the “computers”. In a way, this mirrors the work of a modern programmer. One coder friend of mine jokingly remarked, ‘my job is translating research papers into computer code.’ Not unlike a planner of 1792, indeed.

De Prony kept the computing enterprise operating until 1801. By then, saddled with high inflation, the revolutionary government did not have the funds to publish the tables. However, the work would catch the eye of Charles Babbage, most likely when he visited Paris in 1819, but also because de Prony had reached out to several English scientists to whom Babbage was connected.


Read issue 2, which picks up where we left off: the story of Charles Babbage and how computation went from a job for humans to a job for mechanical computers. As time goes on, human computers fade in number, but the intertwined nature of humans and computing lives on to this day. For example, every time you fill out a CAPTCHA, you are a human computer providing gold-standard labels back to electronic computers.



Credits and More Reading

My friend Max Grigorev was instrumental in encouraging me to work on engineering topics, and helped with the writing of this issue.

Our research was greatly aided by James Gleick’s The Information. If you want to learn more about the history of computation and information theory, I cannot think of a better place to start. When Computers Were Human was also heavily cited for this issue; we only covered a fraction of human computer history, and the book is an incredible read if you want to explore deeper.

Special thanks to Jimmy Soni, author of A Mind at Play: How Claude Shannon Invented the Information Age, for encouraging my choice of format and pacing. I am also grateful to Maran Nelson, Fawaz Al-Matrouk, and several others for providing early feedback on drafts. Any mistakes or omissions are my fault, not theirs.

Also, if you love it, please let your friends know. In later weeks, I’ll be working on original research outside the history of computation (problem solving, debugging, static analysis, and more). If you have suggestions, write us at editor@buriedreads.com.