Notes From the Field

Another First Computer

Spring 1999 | Volume 14 |  Issue 4

WHO BUILT THE FIRST COMPUTER? As I. Bernard Cohen explains elsewhere in this issue, answering the question is difficult and ultimately arbitrary. Under a loose definition of what constitutes a computer, Charles Babbage’s 1820s Difference Engine can claim the title. For today’s teenagers, to whom a computer with no mouse might as well be a typewriter, Apple’s Lisa model (predecessor of the more successful Macintosh) makes a much better candidate.

One strong contender is the ABC (Atanasoff-Berry Computer), built by John V. Atanasoff and Clifford Berry between 1939 and 1942. The ABC’s supporters assert that it contained all the basic features that were used in the room-sized computers of the late 1940s. And unlike those wire-sprouting behemoths, the ABC was simple enough and small enough—about the size of a refrigerator laid on its side—that a team of engineers at Iowa State University (ISU), where the ABC was born, has recently been able to build a replica and exhibit it to the public.

Like most early computers, the ABC was built with a specific task in mind. At the time, ISU was still called Iowa State College of Agriculture and Mechanic Arts, and its interest in machine computation united these areas of specialization. Agricultural research requires laborious manipulation of experimental data, and this, like many other types of farm work, was fertile ground for automation. In 1923 Henry Wallace—an Iowa State graduate and a researcher in crop genetics who went on to become Vice President of the United States—got the college involved by teaching a course on the use of calculating machines to derive correlation coefficients.

The study of automated computing took root at ISU, and around 1933 Atanasoff, a professor of mathematics and physics, became interested. To free his graduate students from laborious manual calculation, he envisioned a special-purpose machine to solve systems of simultaneous linear equations with up to 29 unknowns, accurate to 15 decimal places. It would do so by successively eliminating variables, in a manner familiar to any high school algebra student.
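The elimination method Atanasoff set out to mechanize can be sketched in a few lines of modern code. This is an illustration of the algebra only, not of the ABC's actual binary hardware, and the function name and floating-point arithmetic are my own choices:

```python
# A minimal sketch of solving simultaneous linear equations by
# successively eliminating variables, the method the ABC mechanized.
# (Illustrative only: the ABC processed equation pairs in binary
# hardware; this floating-point version just shows the algebra.)

def solve(a, b):
    """Solve a*x = b by forward elimination and back-substitution."""
    n = len(b)
    # Forward elimination: remove one unknown at a time from the
    # equations below it, just as in high school algebra.
    for k in range(n):
        for i in range(k + 1, n):
            factor = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= factor * a[k][j]
            b[i] -= factor * b[k]
    # Back-substitution recovers the unknowns in reverse order.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(a[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / a[i][i]
    return x

# Two equations, two unknowns: x + y = 3 and x - y = 1.
print(solve([[1.0, 1.0], [1.0, -1.0]], [3.0, 1.0]))  # -> [2.0, 1.0]
```

At the ABC's full capacity of 29 unknowns at 15-place accuracy, carrying out these same steps by hand would have taken a human computer many weeks, which is precisely the labor Atanasoff wanted to spare his students.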

On a single night in 1937, over two bourbon-and-waters at an Illinois roadhouse, Atanasoff came up with a series of striking insights. He decided to have his machine perform calculations electronically, with vacuum tubes instead of relays; to use binary arithmetic; and to store data on capacitors that would be recharged frequently to keep the data from leaking away. All these features became, and remain, indispensable elements of computer design (except the tubes, which have been replaced by solid-state devices).

In June 1941 John W. Mauchly, a physicist who had met Atanasoff at a convention, spent a week visiting Atanasoff’s laboratory and talking computers with him and Berry, a graduate student. Later in the decade Mauchly co-designed and built ENIAC, which is often called the world’s first electronic computer. His friendly visit with Atanasoff, like Glenn Curtiss’s 1906 call on the Wright brothers, subsequently became the subject of great acrimony. In a controversial 1973 decision, a judge invalidated Mauchly and J. Presper Eckert’s patent on the digital computer, ruling that Atanasoff (who never applied for a patent) had anticipated their main ideas. Mauchly bitterly disagreed, asserting that Atanasoff had told him nothing he didn’t already know.

By the spring of 1942 the individual pieces of the ABC’s calculating apparatus were performing flawlessly in tests. Then World War II, which would spur the development of so many other technologies, ended Atanasoff’s days as a computer pioneer. In September he joined the U.S. Naval Ordnance Laboratory to perform acoustical-testing research. He never returned to computers or to Iowa State, and at the time of his departure, he had not yet succeeded in making his computer do what it was designed for. A device that recorded intermediate results by burning holes in sheets of paper failed less than 1 time in 10,000, but that was enough to make the ABC’s output unreliable for systems of more than a few variables. The abandoned ABC was dismantled in 1947, and today all that remain are a memory drum and a few tubes and other small parts.

With so little to go on, the team that built the ABC replica had to piece together its design from old photographs, scattered documents, and fading recollections. Atanasoff himself was a useful source of advice and diagrams until his death in 1995. A 1988 book by Alice R. and Arthur W. Burks, based on extensive interviews with Atanasoff, was also quite helpful.

Old tubes and parts turned out to be readily available from specialist dealers. A harder task was figuring out exactly how the computer’s circuits and mechanical components were put together. The rebuilders borrowed the surviving memory drum—a plastic cylinder studded with 1,600 capacitors—from the Smithsonian and peeked inside it with X-ray and MRI equipment at a local hospital. But the simple question of where the ABC displayed its output stumped the investigators until someone who had worked with the machine answered an advertisement and told them about the odometerlike dial next to the card reader.

Atanasoff’s supporters say his machine contained all the basic features that were used in the room-sized computers of the late 1940s.

The replica has been touring Iowa and neighboring states while the project leaders try to find a permanent home for it at ISU. Meanwhile, in England, several other historical computers have recently been replicated: a 1948 machine called Baby from the University of Manchester, World War II’s code-breaking Colossus, and even Babbage’s Difference Engine, which will soon print out its first mathematical tables a century and three-quarters after its invention. While they may not be as beautiful to look at or as much fun to operate as an antique car or locomotive, these replicas demonstrate how much effort and ingenuity lay behind the first halting steps into the information age—and how unimaginable today’s palmtops and smart cards would be to the men who built their predecessors with pliers and soldering irons.

More Work No More

IN HER 1983 BOOK MORE WORK FOR MOTHER (which was adapted for a 1987 Invention & Technology article), Ruth Schwartz Cowan addressed an apparent paradox associated with technology: Despite the proliferation of laborsaving inventions, women in the 1960s spent just as much time on housework as had their counterparts in previous eras. Washing machines, refrigerators, frozen food, permanent-press fabrics, electric mixers, and many other advances had promised to turn the housewife’s lot into an endless parade of tennis games and coffee klatches interrupted only by the occasional brief pause to push a few buttons. Instead American wives found themselves more harried than ever. How could this be?

As Cowan showed, the explanations were numerous. Technology had indeed made housework less physically taxing and reduced the time needed to perform most chores. But this change had allowed many women to take jobs, leaving them with a “second shift” when they got home and thus increasing their total work time. Suburbanization, with its shift to larger living quarters, also increased the homemaker’s burden, especially since appliances had eliminated the need for most households to employ servants.

Furthermore, when technology made a given task less time-consuming, women responded by doing more of it—cleaning and laundering more often, for example, or cooking a wider variety of foods. And until recently it occurred to very few husbands that they might share their wives’ workloads. (This finding, at least, came as no surprise.)

One of Cowan’s sources was a 1965 study in which Americans recorded their use of time in diaries. The leader of that study, John P. Robinson of the University of Maryland, has continued gathering data ever since, and his latest compilation of results (written with Geoffrey Godbey), Time for Life: The Surprising Ways Americans Use Their Time (Penn State University Press, 1997), reports a reversal in the trend that Cowan wrote about. Between 1965 and 1985 the average time spent by American women on housework, shopping, and child care declined from 40 hours to 31 hours per week. Almost all the decline has come among women not employed outside the home, whose domestic workload has dropped from 52 to 39 hours while that of employed women has held fairly steady at around 26 hours. Preliminary figures from 1995 appear to support these earlier trends.

Have the timesaving features of technology finally taken hold? Not exactly. Robinson and Godbey, like Cowan, found no significant correlation between owning a piece of technology and saving time with it. Dishwasher owners, for example, spent the same number of minutes cleaning up after meals as nonowners; those who had vacuums spent as much time cleaning floors as those who did not. Social rather than technological factors seem to be most responsible for the change: Americans are marrying later and having fewer children, and (mirabile dictu) husbands seem to be doing a greater share of the work around the house, though still only about half as much as their wives.

Fifteen years after Cowan’s book, then, her conclusions remain true. Technology did not by itself save time; instead it increased choices. By allowing women to take paid jobs, and by making household chores acceptable to men, the domestic industrial revolution showed that a woman’s entire life did not have to be tied up in homemaking. The accompanying liberation has paid off for everyone—men as well as women, and those who bake their own bread as well as those who heat frozen dinners in the microwave.
