The 2,132nd Meeting of the Society

May 11, 2001

The 70th Joseph Henry Lecture

The Human Computer and the Birth of the Information Age

David A. Grier

Associate Professor, George Washington University

About the Lecture

Before Palm Pilots, PCs and mainframes, human computers did the calculations that are now done by electronic machines. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other times and other circumstances, would have become scientists in their own right. They first appeared with the return of Halley's comet in 1758 and represent the impact of commercial methods on the scientific laboratory.

About the Speaker

David Alan Grier is the director of the University Honors Program at the George Washington University and an associate professor of computer science. He graduated from Middlebury College as a mathematics major and completed his PhD at the University of Washington in statistical computation. He became interested in human computers when he discovered that his grandmother had been trained as one during the First World War. She was a mathematics major at the University of Michigan, class of 1921.

Minutes

[This text was prepared from the published version of the talk, which appeared as "Human Computers: the first pioneers of the information age," in ENDEAVOUR, vol. 25, no. 1 (March 2001), pp. 28–32.]

Before computers were machines, they were people. They were men and women, young and old, well educated and common. They were the workers who convinced scientists that large-scale calculation had value. Long before Presper Eckert and John Mauchly built the ENIAC at the Moore School of Electrical Engineering or Maurice Wilkes designed the EDSAC for Cambridge University, human computers had created the discipline of computation. They developed numerical methodologies and proved them on practical problems. These human computers were not savants or calculating geniuses. Some knew little more than basic arithmetic. A few were near equals of the scientists they served and, in a different time or place, might have become practicing scientists themselves had they not been barred from a scientific career by their class, their education, their gender or their ethnicity.

The era of the human computer begins with the invention of calculus in the late seventeenth century and reaches its apogee during the Second World War. It was an era shaped by new developments in mathematics, especially by the methods of analysis and linear algebra. Yet it was also shaped by the methods of manufacture and commerce. Human computers borrowed the methods of the office and the factory in order to attack ever-larger problems. A single human computer, no matter how talented, could not easily compute the complete orbit of a comet, the adjustments to a large survey or the full trajectory of an anti-aircraft shell. However, a room of computers could do such work, provided the task was appropriately prepared. Over two centuries, human computers learned how to divide their labors, how to work with hierarchical management and how to devise standard computing procedures. Ultimately, the history of the human computer traces not only the rise of mathematics in modern science but also the introduction of the methods of manufacture into the laboratory.

Dividing the Labor: Computing the Return of Halley's Comet

The story of the human computer begins with Halley's comet. Edmund Halley (1656–1742) was the editor of Newton's PRINCIPIA and an early champion of calculus, but his cometary research led him to problems that exceeded his ability to solve. When he attempted to compute the orbit of the comet that would eventually bear his name, he realized that this orbit was influenced by the mutual interaction of the Sun, Saturn and Jupiter. He struggled for many years to find a simple mathematical expression for this interaction but ultimately failed. Having only a crude approximation to the comet's orbit, he consigned the problem to the next generation of scientists. "Having touched upon these things," he wrote in the final edition of his SYNOPSIS OF THE ASTRONOMY OF COMETS, "I shall leave them to be discussed by the care of posterity, after the truth is found out by the event."

Halley's posterity proved to be Alexis-Claude Clairaut (1713–1765). Clairaut created a new mathematical model for the orbit of Comet Halley, but it was a model that could only be solved numerically. In the summer of 1757, he recruited two friends to undertake the calculations: the astronomer Joseph Jerome Lalande (1732–1807) and Nicole-Reine Lepaute (1723–1788), the wife of a clockmaker.
The three computed together at a table in the Luxembourg Palace for nearly five months, mathematically tracing the comet in its orbit. They completed their work in early November 1757, and Clairaut announced that the comet would reach its perihelion, the point of its orbit closest to the sun, on April 13, 1759.

Not all scientists were comfortable with complex calculations, and at least one, the mathematician Jean le Rond d'Alembert (1717–1783), decried the "spirit of calculation". He argued that computation was not a proper substitute for careful analysis and that Clairaut's work was "more laborious than deep". When Clairaut's prediction missed the true perihelion by thirty-one days, d'Alembert was quick to claim that calculation added nothing to the understanding of comets. Ultimately, few shared d'Alembert's doubts, and soon others were organizing computing groups.

Human computers soon discovered the benefits of dividing the task and specializing their skills. The greatest advocate of specialization was the Scottish economist Adam Smith (1723–1790). Smith argued in THE WEALTH OF NATIONS that the division of labor produced the "greatest improvement in the productive powers of labor." A French civil engineer, Gaspard de Prony (1755–1839), borrowed Smith's ideas to prepare nineteen volumes of trigonometric and logarithm tables for the revolutionary French government. With the assistance of a small group of mathematicians, Prony divided the computations into a series of additions and subtractions. He then hired about eighty computers to do the arithmetic. Most of these computers had served the former aristocracy as personal servants and knew only the basic rules of arithmetic.

The Computing Factories of the Nineteenth Century

Even with the division of labor, Prony's computers required nearly six years to complete their calculations, and few scientists could afford to support such a computation. Scientific computation had little commercial value, as the British computer Peter Barlow (1776–1862) discovered. When he published a volume of his computations, Barlow complained that "the time employed in the computation, the expense of publication, and the limited number of purchasers, which from the nature of the subject is to be apprehended, preclude every idea of adequate remuneration."

Human computers could reduce the cost of computation either by increasing the speed of calculation or by reducing errors in calculation. The more visionary attempted to impose mechanical control upon computation. The more mundane simply used a traditional hierarchy to direct the work.

Charles Babbage (1791–1871) built directly upon Prony's work and proposed using mechanical control to direct computation. Starting with a geared mechanism that could add and subtract, he designed a machine that combined additions and subtractions in order to interpolate a function. As the machine used a form of interpolation called the method of finite differences, Babbage called his design the Difference Engine; a brief sketch of the method appears below. With this machine, he argued, computations "could be produced at a much cheaper rate and of their superior accuracy there could be no doubt." He claimed that the Difference Engine would have reduced Prony's computing force from ninety-six to twelve.

Babbage never completed a full version of his Difference Engine, and most computing groups attempted to improve their work by applying rigorous hierarchical management to human computers. Those that worked for the military simply extended the military hierarchy and the military chain of command.
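Both Prony's plan and Babbage's engine rest on the same arithmetic fact: the nth differences of a degree-n polynomial are constant, so once a short difference table is built, every further table entry follows by addition alone. The sketch below is a minimal modern illustration of the method of finite differences; the cubic polynomial and the table sizes are illustrative choices, not values from Prony's or Babbage's work.

```python
# Method of finite differences: the nth differences of a degree-n polynomial
# are constant, so a table can be extended using only addition -- the
# principle behind Prony's division of labor and Babbage's Difference Engine.

def difference_table(values, order):
    """Return [f(x0), delta f(x0), delta^2 f(x0), ...] up to the given order."""
    diffs, row = [values[0]], list(values)
    for _ in range(order):
        row = [b - a for a, b in zip(row, row[1:])]
        diffs.append(row[0])
    return diffs

def extend(diffs, count):
    """Produce further table entries, one addition per difference per entry."""
    out = []
    for _ in range(count):
        out.append(diffs[0])
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]   # each value absorbs the difference below it
    return out

f = lambda x: x**3 - 2 * x                   # illustrative cubic polynomial
seed = [f(x) for x in range(4)]              # four values fix the difference table
print(extend(difference_table(seed, 3), 8))  # built by addition alone...
print([f(x) for x in range(8)])              # ...and it matches direct evaluation
```

Each pass of the inner loop is the analogue of one turn of the engine's crank: the constant bottom difference ripples upward and a new tabular value emerges without a single multiplication.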
In England, the Astronomer Royal George Airy (1801–1892) was the first to exercise such tight control over his computing staff. When Airy assumed control of the Royal Observatory in 1835, it already had a hierarchy in place for procuring, testing, adjusting and distributing chronometers for the Royal Navy. Airy extended that hierarchy in order to process a backlog of astronomical observations. Two assistants directed a force of "boy computers", young men between the ages of 13 and 20. The assistants used computing sheets to guide the calculations and imposed a rigorous schedule on the boys. The structure reflected the class prejudice of the age. Only a "man of respectable rank in society," Airy wrote, could properly direct the observatory and not face the possibility "of losing authority over the subordinate assistants..."

The United States was a more egalitarian society, and hence its computers were slower to adopt the kind of hierarchy that Airy had implemented at the Royal Observatory. However, the first large American computing group, the Nautical Almanac, was an office of the navy, and it borrowed the military command structure to control computation. The director of the almanac, Captain Charles Henry Davis (1807–1877), kept the computers under tight control. With the assistance of the Harvard professor Benjamin Peirce (1809–1880), he prepared the computing plans and gave each computer detailed instructions about the way in which the calculations should be done. Unlike the boy computers of England, these computers were skilled mathematicians and included Maria Mitchell (1818–1889), John D. Runkle (1822–1902) and Sears Walker (1805–1853). Walker directed an observatory in Philadelphia and was considered one of the top American astronomers. Runkle was a Harvard graduate with a future: he became the second president of the Massachusetts Institute of Technology. Mitchell would become the first professor of astronomy at Vassar College. Runkle, Walker and Mitchell worked out of their homes and corresponded with Davis through the mails.

Of the American computing institutions, only the Coast Survey at first maintained a centralized computing office. The other American computing groups, including the Naval Observatory, the Harvard Observatory and the Nautical Almanac, established centralized computing offices during the 1870s. These computing offices appeared during an era of rapid American industrialization and came to resemble American business offices. Though Charles Henry Davis had hired Maria Mitchell as a computer, most computing laboratories did not hire female computers until business began to hire women as clerks, stenographers and bookkeepers. The Harvard Observatory, under the direction of Edward Pickering (1846–1919), began hiring young women as computers in 1876.

Professionalizing Human Computation

The First World War required large numbers of human computers. Computers on both sides of the war produced map grids, surveying aids, navigation tables and artillery tables. With the men at war, most of these new computers were women, and many were college educated. The British Army established a computing office for women at Girton College. Karl Pearson (1857–1936), a statistician at University College London, volunteered the services of his computing group, which included five women. In the United States, the army operated two large computing groups. About sixty computers worked at the Aberdeen Proving Ground in Maryland to gather and process ballistics data. In Washington, D.C., a group of ten produced range tables for the army.
The first female computer hired by the army, in 1918, was Elizabeth Webb Wilson (1898–1975). Wilson was a graduate of George Washington University and had won her school's mathematics prize. She had watched the suffrage protests from her parents' house, which was little more than a block from the U. S. Capitol, and patiently sought a war job that would make use of her mathematical skill.

The First World War initiated a period of rapid growth for American mathematics and for computation. One of the Aberdeen computers later observed, "For many years after the first world war, the overwhelming majority of significant American mathematicians was to be found among those who had gone through the discipline of the proving ground." While the Aberdeen computers became the leaders of American mathematics, other computers who had worked during the war found new opportunities for their skills. They formed a computing bureau at the U. S. Department of Agriculture to process farm data, a staff of eleven "computing girls" to do scientific calculation for the scientists of the Bell Telephone Company and a cooperative of computers at Indiana University to prepare tables of mathematical functions.

During this period, a handful of human computers began to professionalize the field by promoting computing, organizing its literature, founding scholarly journals and forming professional organizations. Perhaps the three most influential leaders of the era were L. J. Comrie (1893–1950), Gertrude Blanch (1898–1996) and R. C. Archibald (1875–1955).

Comrie was a tireless advocate of computing machinery. He had learned about machine computation from Karl Pearson while recuperating from war wounds. Once he had recovered, he enrolled in the graduate astronomy program at Cambridge University, where he soon became an active critic of the older computing methods used by such groups as the Greenwich Observatory. After completing his studies, he was appointed deputy superintendent of the British Nautical Almanac Office, where he adapted commercial accounting machines to prepare almanac computations. He left the Almanac Office in 1936 and formed an independent computing laboratory, the Scientific Computing Service.

Unlike Comrie, Blanch was more interested in the mathematics of computation than in computing machines. She had to prepare computing plans for a large computing group that operated much as Prony's had. This group, the Mathematical Tables Project, was a work relief project funded by the Works Progress Administration (WPA). The WPA required the project to use labor-intensive methods in order to employ the greatest number of workers. Most of the computers knew little about arithmetic. A few were mentally disabled. Some thwarted the computations, and a handful were more interested in unionization than in scientific computation. Blanch developed ways of organizing the group and devised mathematical methods that were self-checking, much in the same way that accounting procedures are self-checking; a small illustration of one such check appears below. The group ultimately produced twenty-eight volumes of tables and attracted the attention of research scientists such as Hans Bethe (1906–) and Philip Morse (1903–1985).

Blanch started assembling the methods of computation, and R. C. Archibald completed the process. Archibald edited the journal MATHEMATICAL TABLES AND OTHER AIDS TO COMPUTATION. The journal attracted a wide readership during the Second World War, when the demand for computers expanded.
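Grier does not describe Blanch's checks in detail, but one standard self-check of the table-making trade works by differencing the finished table: a smooth function has smooth differences, so an isolated arithmetic or copying error announces itself as a spike in the higher-order differences. The sketch below is a minimal illustration of that idea, not Blanch's exact procedure; the sine table, step size and planted error are all invented for the example.

```python
# Self-checking a finished table by differencing: a smooth function has
# smooth higher-order differences, so one bad entry stands out sharply.
# (An illustration of the table-makers' technique, not Blanch's exact plan.)

import math

table = [math.sin(0.01 * k) for k in range(60)]   # illustrative sine table
table[37] += 0.001                                # plant a single small error

diffs = table
for _ in range(4):                                # take fourth differences
    diffs = [b - a for a, b in zip(diffs, diffs[1:])]

# The planted error leaves the signature (1, -4, 6, -4, 1) in the fourth
# differences; its largest term sits where the difference window is centered.
suspect = max(range(len(diffs)), key=lambda i: abs(diffs[i]))
print(f"spike in 4th differences centered at table entry {suspect + 2}")
```

A checker who differenced each finished sheet could locate a bad entry without recomputing the whole table, which made such checks cheap enough for a large, loosely skilled workforce.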
Computers calculated ballistics trajectories, shock wave propagation, stresses on airframes, navigation tables, efficient bombing plans, radar reflections, optimal production strategies and likely cipher keys. In Archibald's journal, the computers found lists of tables, errata and articles on computational methods. Comrie was on the journal's editorial board, and Blanch was a regular contributor.

Near the end of the Second World War, MATHEMATICAL TABLES AND OTHER AIDS TO COMPUTATION began publishing reports on the new electronic computing machines. It published the first reports on the ENIAC and the early Bell Labs computers. Archibald and his editors organized the first conference on computing in November 1945. He also offered the services of his journal to the fledgling professional organization of computing, the Association for Computing Machinery (ACM). For six years, the leaders of the ACM used MATHEMATICAL TABLES AND OTHER AIDS TO COMPUTATION as their journal of record.

The End of the Era of Human Computers

When the ACM started its own journal in 1952, the era of the human computer was waning fast. Most of the computing groups from the war had long been disbanded, and the few that remained were steadily being replaced by electronic computers. Two of the largest human computing groups completed computing machines that summer. The computing laboratory at the Bureau of Standards completed the SEAC computer, and the Institute for Numerical Analysis, which also employed a large computing staff, completed the SWAC. Once these machines were operational, the human computers took other jobs, such as operating the computing machines or checking computer programs by recalculating the results.

A few scientists, such as MIT's Philip Morse, argued that human computers still had plenty to do in the era of the electronic computer, and in 1954 he organized a large conference for human computers. This conference produced the final legacy of the human computers, THE HANDBOOK OF MATHEMATICAL FUNCTIONS. The HANDBOOK showed how to calculate most of the higher mathematical functions in common use. It was a top-selling scientific book for nearly thirty-five years, yet it was a product of human computers and was deeply rooted in the mathematics of the 1930s. Its editors, Milton Abramowitz (1915–1958) and Irene Stegun (1919–), had served in Gertrude Blanch's Mathematical Tables Project, and virtually all of the contributors to the volume had worked as human computers.

The HANDBOOK summarizes only the mathematical methods of the human computers. It speaks little of their organizational ideas, the ideas that they used to coordinate and manage their collected labor. We see those ideas reflected in the organizations that manage computers and maintain digital networks. These ideas, the division of labor, the specialization of tasks, the imposition of hierarchical management and the replacement of humans by machines, were not unique to human computers but were borrowed from Adam Smith, from the military, from commercial offices and even from a work relief project.

Created by David Alan Grier