This post is part of an ongoing series annotating my book They Create Worlds: The Story of the People and Companies That Shaped the Video Game Industry, Vol. I. It expands on material found in Chapter 1 on pages 1-7. It is not necessary to have read the book to comprehend and appreciate the post.
One of the goals of They Create Worlds is to place advancements in video games in the proper context of the technological and economic situations of their times. To help meet this goal, the book occasionally delves into general computer history. This is especially true in the early chapters of the book, for I felt it important to demonstrate why Spacewar!, the game that started us down the path towards a new pastime and entertainment industry, could not have been born and spread until the early 1960s. Still, the book is a history of video games, not a history of computers, so I had to be careful not to get too sidetracked by this more general history, fascinating though it may be. Therefore, this annotation will give a little more context to early computer history.
What we call a “computer” today bears only a small resemblance to what the pioneers in this field would have considered a computing device. As with most modern appliances, the computer was created to automate a task already being performed through manual labor, in this case the work performed by people known as computers, a moniker that has existed since at least 1646. These days, those aware of human computers most likely associate them with the book and movie Hidden Figures, which traced the exploits of several African American female computers who contributed to NASA’s mission to put a man on the moon. While a great story, Hidden Figures perhaps unduly glamorizes the profession and places too much emphasis on individual ability, when generally a human computer was just a small cog in a much larger machine.
The computer job developed in tandem with the scientific and industrial revolutions that spread throughout Western Europe between the sixteenth and nineteenth centuries. Its rise is most closely associated with the logarithm, a mathematical concept first described by Scottish mathematician John Napier in 1614. The introduction of logarithms essentially allowed complex multiplication and division of incredibly large numbers to be accomplished through the addition and subtraction of small numbers instead, thus greatly reducing the time needed to complete these calculations and allowing mathematical modeling of a whole range of phenomena that had previously been too complex to handle with existing algebraic or trigonometric techniques. This led to advances in numerous fields, including not just pure mathematics, but also navigation, astronomy, and surveying.
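To illustrate the principle with figures of my own choosing (this particular example does not appear in the book): because log(a × b) = log(a) + log(b), a computer asked to multiply 3,456 by 7,891 could look up log 3456 ≈ 3.5386 and log 7891 ≈ 3.8971 in a table, add them to get 7.4357, and then look up the antilogarithm of that sum to arrive at roughly 27,270,000, correct to four significant figures (the exact product is 27,271,296), all without performing a single long multiplication.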
Divining the log of a number quickly in a time before calculating machines required the use of mathematical look-up tables. Mathematical tables had existed for centuries before the introduction of the logarithm, with the first trigonometric tables dating back to ancient Greece, but the use of logarithms allowed for more precise tables than previous efforts and turned the production of tables into a literal cottage industry. One example is the Nautical Almanac, a book of navigational tables commissioned by the British government in 1766 and updated annually ever since. The creator of the Almanac, Nevil Maskelyne, employed retired clerks and clergymen across the country to complete the tables in their own homes.
Because creating tables involved performing computations, the individuals doing these calculations were called computers. These were not mathematicians boldly coming up with new formulas or proofs, scientists discovering how the universe works, or even engineers using mathematical and scientific principles to solve real-world problems. They were more akin to bank clerks hunched over a ledger entering figures while performing basic arithmetic. They had to be somewhat quick of mind, but required only a small amount of specialized training. Working in computing was really little different from, say, spinning cloth, save that it required mental rather than manual dexterity. Therefore, just as cottage production in the textile industry was displaced by factories during the Industrial Revolution, so too was cottage computing ultimately displaced.

The first noteworthy computer factory was created in France in 1791 by Gaspard de Prony, who was engaged by the French National Assembly to create an extensive set of mathematical charts to aid in a new national survey of all the land in the country. A mathematician who believed the division of labor principles of Adam Smith’s Wealth of Nations could be applied to mathematical table creation, de Prony gathered his computers in a single building and divided the labor into three parts. At the top were a handful of brilliant mathematicians who specified the formulas and basic parameters of the tables. Next came a group of lesser mathematicians who performed the most important calculations, started each table, and instructed the third tier of employees, the computers, how to proceed. These computers then completed the tables using nothing more than rudimentary addition and subtraction.
While de Prony’s computers were drawn from all walks of life, many were either unskilled laborers or professionals in fields that did not involve math. For example, many were hairdressers left unemployed as the aristocratic heads upon which they practiced their trade were lopped off by Madame Guillotine at an ever-increasing rate. As the work progressed, de Prony discovered there was no correlation between a computer’s level of intelligence or education and the accuracy of the work, and that computing had therefore truly been reduced to a basic unskilled task. This does not mean all computers henceforth were unskilled laborers. As the problems tackled by computers became more complex in the 20th Century, more and more education was required to do the job well. Indeed, Katherine Johnson, one of the principal protagonists of the aforementioned Hidden Figures, was a college-educated math prodigy whose early career arc would probably have been very different had she been a Caucasian man instead of an African American woman. Still, most computers were not Katherine Johnson, and the job was generally considered akin to low-level clerical or secretarial work at best.
The first machines that we would today call computers were conceived as part of an effort to automate table making in the same way the spinning mule automated thread production. Indeed, Charles Babbage, the mathematician and gentleman scholar who envisioned the first such machine, was well acquainted with de Prony’s factory. Babbage felt that England had fallen behind mainland Europe in mathematical sophistication and visited Paris multiple times beginning in 1819 to consult with esteemed members of the French Academy of Sciences. He also oversaw an astronomical table project in his native country beginning in 1820 that operated in much the same manner as Maskelyne’s Nautical Almanac through the use of freelance computers in a cottage industry model. Frustrated with the process, he resolved to build a machine based on the principles of de Prony’s factory that would automate the making and printing of tables. Once he completed a design on paper for this “Difference Engine,” he started work on a more general-purpose mechanical calculating machine called the “Analytical Engine” that could perform all four basic arithmetic functions and store numbers in a form of memory during the process. Though never built, the Analytical Engine would influence early computer designs in the mid twentieth century.
Charles Babbage was unable to build any of his computing devices due both to a lack of funding and to a lack of mechanical parts with sufficient precision to perform the processes he required of them. The latter problem ceased to exist by the late 19th Century thanks to advances in manufacturing technology. This led to a new category of machines that we would call analog computers today, though at the time the term “computer” was usually applied only to a person rather than a machine. These analog computers evolved in parallel with human computers rather than replacing them and were used to simulate real-world phenomena when mathematical equations and models were not sufficient for the task at hand. As stated in my book, perhaps the most celebrated analog computer of the nineteenth century was Lord Kelvin’s Tide Predictor, which used a system of levers, pulleys, and gears to simulate tidal forces and allowed for the creation of far more accurate tide tables for seaports around the world than previous methods.

By the beginning of the 20th Century, the computer profession was expanding as differential equations were increasingly used to generate mathematical models of phenomena that could not be easily observed with the naked eye, such as electromagnetic wave transmission and atomic structure. As these equations became longer and more complex, armies of human computers ran numbers through these formulas to aid in a variety of scientific and engineering fields. Analog computing continued to develop in tandem, for when an equation proved too difficult or too complex to solve through computing, a physical model of the phenomenon being studied would be created instead. That way, even if the math behind a physical process was not fully understood, it could still be simulated and measured to solve a variety of practical problems.
As touched on briefly in the book, by the early 20th century, analog computing had become an indispensable tool in building power grids. Power transmission was a particular area where the differential equations were so complex that it was easier to simulate processes physically over a small area and scale up the results than to solve the equations mathematically. This is how, before the establishment of computer science as a separate discipline, computing machines in the United States became inextricably linked with university electrical engineering departments rather than with math or physics departments. The vast geographic distances in the American West required solving a plethora of problems in transmitting electricity that were not being encountered anywhere else in the world at the time, and analog computers provided many of the solutions to these problems.
Electrical engineer Vannevar Bush was perhaps the first person to steer analog computing more directly into the realm of the human computer when he completed the first practical differential analyser, so called because it used mechanical parts to solve differential equations via integration. The theory behind the machine was developed in 1876 by Lord Kelvin’s elder brother, James Thomson, but he proved unable to build a working model. Many of his principles were incorporated into his brother’s Tide Predictor as well as several other analog computing devices, but these remained special-purpose machines tuned for specific tasks. Bush, who was working on power transmission problems at MIT, built a device called a product integraph in 1924 that simplified the solving and plotting of first-order differential equations and then expanded it between 1928 and 1931 in conjunction with Harold Hazen, who suggested the device could be improved to solve second-order equations as well. The resulting differential analyser was a general-purpose device that could solve a wide variety of differential equations. Before long, engineers in other parts of the world had constructed their own differential analysers, setting in motion the eventual replacement of human computers with machines.
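For readers curious what “solving differential equations via integration” means in practice, here is a minimal sketch in Python of the principle the analyser mechanized: a second-order equation is solved by feeding the output of one integrator into another. Everything specific here, the example equation x'' = -x, the step size, and the use of simple numerical steps in place of the machine’s wheel-and-disc integrators, is my own illustration rather than anything drawn from Bush’s actual design.

import math

# A toy "differential analyser" run: solve x'' = -x by chaining two integrators.
# Each pass through the loop plays the role of one small turn of the machine:
# the equation supplies the acceleration, the first integrator accumulates it
# into a velocity, and the second integrator accumulates velocity into position.
dt = 0.001          # size of each integration step (an arbitrary illustrative choice)
x, v = 1.0, 0.0     # initial position and velocity

for _ in range(int(10 / dt)):   # simulate 10 units of time
    a = -x          # the equation under study: x'' = -x
    v += a * dt     # first integrator: acceleration -> velocity
    x += v * dt     # second integrator: velocity -> position

print(x, math.cos(10))   # the two values should nearly agree (about -0.84)

The real machine, of course, performed these integrations continuously and mechanically rather than in discrete software steps, but the chaining of integrators to work down from a higher-order equation is the same idea.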

Most of the early digital computer projects, in which mathematical modelling completely displaced physical simulation, were started to solve differential equations. These included Howard Aiken’s Harvard Mark I, John Atanasoff and Clifford Berry’s unfinished prototype later dubbed the Atanasoff-Berry Computer, or ABC, and the ENIAC at the University of Pennsylvania. Indeed, ENIAC, short for Electronic Numerical Integrator and Computer, is the device that best brings together the disparate threads of this article. The co-creator of ENIAC, John Mauchly, worked in weather prediction and yearned for a device that would allow him to solve complex equations. After viewing an electric calculator displayed by IBM at the 1939 World’s Fair, he believed he could build a similar device for his purposes and immersed himself in the study of electronics. After giving a lecture on his ambitions at the American Association for the Advancement of Science in December 1940, he connected with John Atanasoff, who invited him back to his home base at Iowa State College and showed him the work he had done on the ABC. Soon after, Mauchly took a position at the University of Pennsylvania and bonded with a graduate student named J. Presper Eckert, who was interested in developing high-speed calculating devices using vacuum tubes.
Meanwhile, Mauchly’s wife was running a training program for human computers who were being funneled to the U.S. Army’s Ballistic Research Laboratory, which was harnessing a differential analyser and an army of human computers to create artillery tables that would allow artillery commanders at the front to compute the proper trajectory for their guns to hit targets miles away without the need to solve complex trigonometric equations on the fly. Just as Charles Babbage saw a need to automate the creation of astronomical tables in the 1820s, Mauchly envisioned speeding up the complicated and time-consuming process of producing artillery tables using an electronic computer that worked in a similar manner to the differential analyser, only much faster. Though completed too late to be of much use in World War II, ENIAC reduced the time needed to calculate a single trajectory to just 30 seconds, as compared to 15 minutes for a differential analyser or 20 hours for a team of human computers. While ENIAC did not exert much influence on future computer design, it was the first completed general-purpose electronic computer to be publicly revealed, thereby playing an outsized role in stimulating further computer research at other institutions.
Human computers reached their apex in the 1940s as scientific advances played a key role in ending World War II. Because so many able-bodied men were serving as soldiers, the field came to be dominated by women. Furthermore, since educated women had few career options at the time, this also meant that a greater percentage of computers held advanced degrees in mathematics than in the time of Gaspard de Prony. Computers continued to play an important role in performing calculations into the 1960s, as Hidden Figures plainly attests, but as electronic computers became both more sophisticated and cheaper, the occupation eventually faded away. Analog computers continued to play their part into the 1960s as well, and even into the 1980s in specific fields, but they too were eventually rendered obsolete by their digital brethren. Electronic computers, meanwhile, continued to expand beyond mere calculating devices to encompass a wide variety of tasks. Much of this was due to the work of artificial intelligence pioneers like Claude Shannon and Alan Turing, which is where early computer history and early video game history intersect through the tic-tac-toe, nim, checkers, and chess programs of the 1940s and 1950s that are the subject of the first two chapters of the book.
They Create Worlds: The Story of the People and Companies That Shaped the Video Game Industry, Vol. I 1971-1982 is available in print or electronically direct from the publisher, CRC Press, as well as through Amazon and other major online retailers.