
Historical Interlude: The Birth of the Computer Part 3, the Commercialization of the Computer

In the 1940s, the electronic digital computer was a new, largely unproven machine developed in response to specific needs like the code-breaking requirements of Bletchley Park or the ballistics calculations of the Aberdeen Proving Grounds.  Once these early computers proved their worth, projects like the Manchester Mark 1, EDVAC, and EDSAC implemented a stored program concept that allowed digital computers to become useful for a wide variety of scientific and business tasks.  In the early 1950s, several for-profit corporations built on this work to introduce mass-produced computers and offered them to businesses, universities, and government organizations around the world.  As previously discussed, Ferranti in the United Kingdom introduced the first such computer by taking the Manchester Mark 1 design, increasing the speed and storage capacity of the machine, and releasing it as the Ferranti Mark 1 in February 1952.  This would be one of the few times that the United Kingdom led the way in computing over the next several decades, however, as demand remained muted among the country’s conservative businesses, allowing companies in the larger U.S. market to grow rapidly and achieve world dominance in computing.

Note: This is the third of four posts in a series of “historical interludes” summarizing the evolution of computer technology between 1830 and 1960.   The information in this post is largely drawn from Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray, The Maverick and His Machine: Thomas Watson, Sr. and the Making of IBM by Kevin Maney, A History of Modern Computing by Paul Ceruzzi, Computers and Commerce: A Study of Technology and Management at Eckert-Mauchly Computer Company, Engineering Research Associates, and Remington Rand, 1946-1957 by Arthur Norberg, and IBM’s Early Computers by Charles Bashe, Lyle Johnson, John Palmer, and Emerson Pugh.

UNIVAC


The UNIVAC I, the first commercially available computer in the United States

For a brief period from 1943 to 1946, the Moore School in Philadelphia was the center of the computer world as John Mauchly and J. Presper Eckert developed ENIAC and initiated the EDVAC project.  Unlike the more accommodating MIT and Stanford, however, which nurtured the Route 128 tech corridor and Silicon Valley respectively by encouraging professors and students to apply technologies developed in academia to the private sector, the Moore School believed commercial interests had no place in an academic institution and decided to quash them entirely.  In early 1946 the entire staff of the school was ordered to sign release forms giving up the rights to all patent royalties from inventions pioneered at the school.  This was intolerable to both Eckert and Mauchly, who formally resigned on March 31, 1946 to pursue commercial opportunities.

While still at the Moore School, Mauchly met with several organizations that might be interested in the new EDVAC computer.  One of these was the Census Bureau, which once again needed to migrate to new technologies as tabulating machines were no longer sufficient to count the U.S. population in a timely manner.  After leaving the school, Eckert and Mauchly attended a series of meetings with the Census Bureau and the National Bureau of Standards (NBS) between March and May devoted to the possibility of replacing tabulating machines with computers.  After further study, the NBS entered into an agreement with Eckert and Mauchly on September 25, 1946, for them to develop a computer for the Census Bureau in return for $300,000, which Eckert and Mauchly naively believed would cover a large portion of their R&D cost.

Census contract aside, Eckert and Mauchly experienced great difficulty attempting to fund the world’s first for-profit electronic computer company.  Efforts to raise capital commenced in the summer of 1946, but Philadelphia-area investors were focused on the older industries of steel and electric power that had driven the region for decades.  In New York, funding was available for established electronics concerns, but the concept of venture capital did not yet exist and no investment houses were willing to take a chance on a startup.  The duo were finally forced to turn to friends and family, who provided enough capital in combination with the Census contract for Eckert and Mauchly to establish a partnership called the Electric Control Company in October 1946, which later incorporated as the Eckert-Mauchly Computer Corporation (EMCC) in December 1948.

As work began on the EDVAC II computer at the new Philadelphia offices of the Electric Control Company, the founders continued to seek new contracts to alleviate chronic undercapitalization.  In early 1947 Prudential, a forward-thinking company that had a reputation as an early adopter of new technology, agreed to pay the duo $20,000 to serve as consultants, but refused to commit to ordering a computer until it was completed.  Market research firm A.C. Nielsen placed an order in spring 1948 and Prudential changed its mind and followed suit late in the year, but both deals were for $150,000 as Eckert and Mauchly continued to underestimate the cost of building their computers.  To keep the company solvent, the duo completed a $100,000 deal with Northrop Aircraft in October 1947 for a smaller scientific computer called the Binary Automatic Computer (BINAC) for use in developing a new unmanned bomber.  Meanwhile, with contracts coming in Eckert and Mauchly realized that they needed a new name for their computer to avoid confusion with the EDVAC project at the Moore School and settled on UNIVAC, which stood for Universal Automatic Computer.

EMCC appeared to finally turn a corner in August 1948 when it received a $500,000 investment from the American Totalisator Company.  The automatic totalisator was a specialized counting machine originally invented by New Zealander George Julius in the early twentieth century to tally election votes and divide them properly among the candidates.  When the government rejected the device, he adapted it for use at the race track, where it could run a pari-mutuel betting system by totaling all bets and assigning odds to each horse.  American Totalisator came to dominate this market after one of its founders, Henry Strauss, invented and patented an electro-mechanical totalisator first used in 1933.  Strauss realized that electronic computing was the logical next step in the totalisator field, so he convinced the company board to invest $500,000 in EMCC in return for a 40% stake in the company.  With the funding from American Totalisator, EMCC completed BINAC and delivered it to Northrop in September 1949.  Although it never worked properly, BINAC was the first commercially sold computer in the world.  Work continued on UNIVAC as well, but disaster struck on October 25, 1949, when Henry Strauss died in a plane crash.  With EMCC’s chief backer at American Totalisator gone, the company withdrew its support and demanded that its loans be repaid.  Eckert and Mauchly therefore began looking for a buyer for their company.

On February 15, 1950, office equipment giant Remington Rand purchased EMCC for $100,000 while also paying off the $438,000 owed to American Totalisator.  James Rand, Jr., the president of the company, had become enamored with the scientific advances achieved during World War II and was in the midst of a post-war expansion plan centered on high technology and electronic products.  In 1946, Rand constructed a new high-tech R&D lab in Norwalk, Connecticut, to explore products as varied as microfilm readers, xerographic copiers, and industrial television systems.  In late 1947, he hired Leslie Groves, the general who oversaw the Manhattan Project, to run the operation.  EMCC therefore fit perfectly into Rand’s plans.  Though Eckert and Mauchly were required to give up their ownership stakes and take salaries as regular employees of Remington Rand, Groves allowed them to remain in Philadelphia and generally let them run their own affairs without interference.

With Remington Rand sorting out its financial problems, EMCC was finally able to complete its computer.  First accepted by the U.S. Census Bureau on March 31, 1951, the UNIVAC I contained 5,200 vacuum tubes and could perform 1,905 operations a second at a clock speed of 2.25 MHz.  Like the EDVAC and EDSAC, the UNIVAC I used delay line memory as its primary method of storing information, but it also pioneered the use of magnetic tape storage as a secondary memory, which was capable of storing up to a million characters.  The Census Bureau resisted attempts by Remington Rand to renegotiate the purchase price of the computer and spent only the $300,000 previously agreed upon, while both A.C. Nielsen and Prudential ultimately cancelled their orders when Remington Rand threatened to tie up delivery through a lawsuit rather than sell the computers for $150,000; future customers were forced to pay a million dollars or more for a complete UNIVAC I.

By 1954, nineteen UNIVAC computers had been purchased and installed at such diverse organizations as the Pentagon, U.S. Steel, and General Electric.  Most of these organizations took advantage of the computer’s large tape storage capacity to employ the computer for data processing rather than calculations, where it competed with the tabulating machines that had brought IBM to prominence.


The UNIVAC 1101, Remington Rand’s first scientific computer

To serve the scientific community, Remington Rand turned to another early computer startup, Engineering Research Associates (ERA).  ERA grew out of the code-breaking activities of the United States Navy during World War II, which were carried out primarily through an organization called the Communications Supplementary Activity – Washington (CSAW).  Like Bletchley Park in the United Kingdom, CSAW constructed a number of sophisticated electronic devices to aid in codebreaking, and the Navy wanted to maintain this technological capability after the war.  Military budget cuts made this impractical, however, so to avoid losing the assembly of talent at CSAW, the Navy helped establish ERA in St. Paul, Minnesota, in January 1946 as a private corporation.  The company was led by John Parker, a former Navy lieutenant who had become intimately involved in the airline industry in the late 1930s and 1940s while working for the D.C. investment firm Auchincloss, Parker, and Redpath, and drew most of its important technical personnel from CSAW.

Unlike EMCC, which focused on building a machine for corporate data processing, ERA devoted its activities to intelligence analysis work for the United States Navy.  Like Eckert and Mauchly, the founders of ERA realized the greatest impediment to building a useful electronic computing device was the lack of suitable storage technology, so in its first two years of existence, the company concentrated on solving this problem, ultimately settling on magnetic drum memory, a technology invented by Austrian Gustav Tauschek in 1932 in which a large metal cylinder is coated with a ferromagnetic material.  As the drum rotates, stationary write heads can generate an electrical pulse to change the magnetic orientation on any part of the surface of the drum, while a read head can detect the orientation and recognize it in binary as either a “1” or a “0,” therefore making it suitable for computer memory.  A series of specialized cryptanalytic machines followed with names like Goldberg and Demon, but these machines tended to become obsolete quickly since they were targeted at specific codes and were not programmable to take on new tasks.  Meanwhile, as both ERA and the Navy learned more about developments at the Moore School, they decided a general purpose computer would be a better method of addressing the Navy’s needs than specialized equipment and therefore initiated Task 13 in 1947 to build a stored program computer called Atlas.  Completed in December 1950, the Atlas contained 2,700 vacuum tubes and a drum memory that could hold just over 16,000 24-bit words.  The computer was delivered to the National Security Agency (NSA) for code-breaking operations, and the agency was so pleased with the computer that it accepted a second unit in 1953.  In December 1951, a modified version was made available as the ERA 1101 — a play on the original project name, as “1101” is “13” in binary — but ERA did not furnish any manuals, so no businesses purchased the machine.
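The drum mechanism and the binary pun behind the "1101" name can both be illustrated with a short Python sketch.  This is a toy model with invented names like `DrumMemory`, not ERA's actual engineering: a ring of cells stands in for the drum surface, with one bit of magnetic orientation per cell passing under stationary read and write heads.

```python
# Toy model of magnetic drum memory: each cell on the rotating surface
# holds one magnetic orientation, read back as a 1 or a 0.

class DrumMemory:
    def __init__(self, cells):
        self.surface = [0] * cells  # orientation of each cell on the drum
        self.position = 0           # cell currently under the heads

    def rotate(self):
        # The motor advances the drum one cell past the stationary heads.
        self.position = (self.position + 1) % len(self.surface)

    def write(self, bit):
        # The write head pulses, setting the orientation beneath it.
        self.surface[self.position] = bit

    def read(self):
        # The read head senses the orientation beneath it.
        return self.surface[self.position]


# Store the bits of thirteen -- binary 1101, the source of the "1101"
# product name -- then let the drum come back around and read them off.
drum = DrumMemory(cells=8)
for bit in [1, 1, 0, 1]:
    drum.write(bit)
    drum.rotate()

drum.position = 0  # the drum has rotated back to the first cell
word = []
for _ in range(4):
    word.append(drum.read())
    drum.rotate()

assert word == [1, 1, 0, 1]
assert format(13, "b") == "1101"
```

The sketch also hints at drum memory's chief drawback: a program had to wait for the desired cell to rotate back under the head before reading it, which is why drum machines traded speed for reliability and low cost compared to other memory technologies of the era.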

The same month ERA announced the 1101, it was purchased by Remington Rand.  ERA president John Parker realized that fully entering the commercial world would require a significant influx of capital that the company would be unlikely to raise.  Furthermore, the close relationship between ERA and the Navy had piqued the interest of government auditors and threatened the company’s ability to secure future government contracts.  Therefore, Parker saw the Remington Rand purchase as essential to ERA’s continued survival.  Remington Rand, meanwhile, gained a foothold in a new segment of the computer market.  The company began marketing an improved version of ERA’s first computer as the UNIVAC 1103 in October 1953 and ultimately installed roughly twenty of them, mostly within the military-industrial complex.

In 1952, the American public was introduced to the UNIVAC in dramatic fashion when Mauchly developed a program to predict the results of the general election between Dwight Eisenhower and Adlai Stevenson based on the returns from the previous two elections.  The results were to be aired publicly on CBS, but UNIVAC predicted a massive landslide for Eisenhower in opposition to Gallup polls that indicated a close race.  CBS refused to deliver the results, opting instead to state that the computer predicted a close victory for Eisenhower.  When it became clear that Eisenhower would actually win in a landslide, the network owned up to its deception and aired the true results, which were within just a few electoral votes of the actual total.  Before long, the term “UNIVAC” became a generic word for all computers in the same way “Kleenex” has become synonymous with facial tissue and “Xerox” with photocopying.  For a time, it appeared that Remington Rand would be the clear winner in the new field of electronic computers, but only until IBM finally hit its stride.

IBM Enters the Computer Industry


Tom Watson, Sr. sits at the console of an IBM 701, the company’s first commercial computer

There is a story, oft-repeated, about Tom Watson, Sr. that claims he saw no value in computers.  According to this story, the aging president of IBM scoffed that there would never be a market for more than five computers and neglected to bring IBM into the new field.  Only after the debut of the UNIVAC I did IBM realize its mistake and hastily enter the computer market.  While there are elements of truth to this version of events, there is no truth to the claim that IBM was completely ignoring the computer market in the late 1940s.  Indeed, the company developed several electronic calculators and had no fewer than three computer projects underway when the UNIVAC I hit the market.

As previously discussed, IBM’s involvement with computers began when the company joined with Howard Aiken to develop the Automatic Sequence Controlled Calculator (ASCC).  That machine was first unveiled publicly on August 6, 1944, and Tom Watson traveled to Cambridge, Massachusetts, to speak at the dedication.  At the Boston train station, Watson was irked that no one from Harvard was there to welcome him.  Irritation turned to rage when he perused the Boston Post and saw that Harvard had not only issued a press release about the ASCC without consulting him, but also gave sole credit to Howard Aiken for inventing the machine.  When an angry and humiliated Watson returned to IBM, he ordered James Bryce and Clair Lake to develop a new machine that would make Aiken’s ASCC look like a toy.  Watson wanted to show the world that IBM could build computers without help from anyone else and to get revenge on the men he felt wronged him.

With IBM seriously engaged in war work, Bryce and Lake felt they would be unable to achieve the breakthroughs in the lab necessary to best Aiken in a reasonable time frame, so they instead argued for the simpler goal of creating the world’s first electronic calculator.  To that end, an electronics enthusiast in the company named Haley Dickinson was ordered to convert the company’s electro-mechanical Model 601 Multiplying Punch into a tube-based machine.  Unveiled in September 1946 as the IBM 603 Electronic Multiplier, the machine contained only 300 vacuum tubes and no storage, but it could multiply ten times faster than existing tabulating machines and soon became a sensation.  Embarrassed by the limitations of the machine, however, Watson halted production at 100 units and ordered his engineers to develop an improved model.  Ralph Palmer, an electronics expert who joined IBM in 1932 and had recently returned from a stint in the Navy, was asked to form a new laboratory in Poughkeepsie, New York, dedicated solely to electronics.  Palmer’s group delivered the IBM 604 Electronic Calculating Punch in 1948, which contained 1,400 tubes and could be programmed to solve simple equations.  Over the next ten years, the company leased 5,600 604s to customers, and Watson came to realize that the future of IBM’s business lay in electronics.

Meanwhile, as World War II neared its conclusion, Watson’s mandate to best Aiken’s ASCC gained momentum.  The man responsible for this project was Wallace Eckert (no relation to the ENIAC co-inventor), who as an astronomy professor at Columbia in the 1920s and 1930s had been one of the main beneficiaries of Watson’s relationship with the university in those years.  After directing the Nautical Almanac of the United States Naval Observatory during much of World War II, Eckert accepted an invitation from Watson in March 1945 to head a new division within IBM specifically concerned with the computational needs of the scientific community called the Pure Science Department.

Eckert remained at headquarters in New York while Frank Hamilton, who had been a project leader on the ASCC, took charge of defining the Aiken-beating machine’s capabilities in Endicott.  In summer 1945, Eckert made new hire Rex Seeber his personal representative to the project.  A Harvard graduate, Seeber had worked with Aiken, but fell out with him when Aiken refused to implement the stored program concept in his forthcoming update of the ASCC.  Seeber’s knowledge of computer theory and electronics perfectly complemented Hamilton’s electrical engineering skills and resulted in the completion of the Selective Sequence Electronic Calculator (SSEC) in 1947.  The SSEC was the first machine in the world to successfully implement the stored program concept, although it is often classified as a calculator rather than a stored program computer due to its limited memory and reliance on paper tape for program control.  The majority of the calculator remained electromechanical, but the arithmetic unit, adapted from the 603, operated at electronic speeds.  Built with 21,400 relays and 12,500 vacuum tubes and assembled at a cost of $950,000, the SSEC was a strange hybrid that exerted no influence over the future of computing, but it did accomplish IBM’s objectives: it operated 250 times faster than the Harvard ASCC and garnered significant publicity for IBM’s computing endeavors while on display to the public on the ground floor of the company’s corporate headquarters from 1948 to 1952.


Tom Watson, Jr., son and successor of Tom Watson, Sr.

The success of the IBM 603 and 604 showed Watson that IBM needed to embrace electronics, but he remained cautious regarding electronic computing.  Indeed, when given the chance to bring Eckert and Mauchly into the IBM fold in mid-1946 after they left the Moore School, Watson ultimately turned them down not because he saw no value in their work but because he did not want to meet the price they demanded to buy out their business.  When he learned that the duo’s computer company was garnering interest from the National Bureau of Standards and Prudential in 1947, he told his engineers they should explore a competing design, but he was thinking in terms of a machine tailored to the needs of specific clients rather than a general-purpose computing device.  By now Watson was in his seventies and set in his ways, and while there is no evidence that he ever uttered the famous line about world demand reaching only five computers, he could simply not envision a world in which electronic computers replaced tabulating machines entirely.  As a result, the push for computing within the company came instead from his son and heir apparent, Tom Watson, Jr.

Thomas J. Watson, Jr. was born in Dayton, Ohio, in 1914, the same year his father accepted the general manager position at C-T-R.  His relationship with his father was strained for most of his life, as the elder Watson was prone to both controlling behavior and ferocious bursts of temper.  While incredibly bright, Watson suffered from anxiety and crippling depression as a child and felt incapable of living up to his father’s standards or of succeeding him at IBM one day, which he sensed was his father’s wish.  As a result, he rebelled and performed poorly in school, only gaining admittance to Brown University as a favor to his father.  After graduating with a degree in business in 1937, he became a salesman at IBM, but grew to hate working there due to the special treatment he received as the CEO’s son and the cult of personality that had grown up around his father.  Desperate for a way out, he joined the Air National Guard shortly before the United States entered World War II and became aide-de-camp to First Air Force Commander Major General Follett Bradley in 1942.  He had no intention of ever returning to IBM.

Working for General Bradley, Watson finally realized his own potential.  He became the general’s most trusted subordinate and gained experience managing teams undertaking difficult tasks.  With the encouragement of Bradley, his inner charisma surfaced for the first time, as did a remarkable ability to focus on and explain complex problems.  Near the end of the war, Bradley asked Watson about his plans for the future and was shocked when Watson said he might become a commercial pilot and would certainly never rejoin IBM.  Bradley stated that he always assumed Watson would return to run the company.  In that moment, Watson realized he was avoiding the company because he feared he would fail, but that his war experiences had prepared him to succeed his father.  On the first business day of 1946, he returned to the fold.

Tom Jr. was not promoted to a leadership position right away.  Instead, Tom Sr. appointed him personal assistant to Charley Kirk, the executive vice president of the company and Tom Sr.’s most trusted subordinate.  Kirk generously took Tom Jr. under his wing, but he also appeared to be first in line to take over the company upon Tom Sr.’s retirement, which Tom Jr. resented.  A potential power struggle was avoided when Kirk suffered a massive heart attack and died in 1947.  Tom Sr. did not feel his son was quite ready to assume the executive vice president position, but Tom Jr. did assume many of Kirk’s responsibilities while an older loyal Watson supporter named George Phillips took on the executive VP role on a short-term basis.  In 1952, Tom Sr. finally named Tom Jr. president of IBM.


The IBM 650, IBM’s most successful early computer

Tom Jr. first learned of the advances being made in computing in 1946 when he and Kirk traveled to the Moore School to see the ENIAC.  He became a staunch supporter of electronics and computing from that day forward.  While there was no formal division of responsibilities drawn up between father and son, it was understood from the late forties until Tom Jr. succeeded his father as IBM CEO in 1956 that Tom Jr. would be given free rein to develop IBM’s electronics and computing businesses, while Tom Sr. concentrated on the traditional tabulating machine business.  In this capacity, Tom Jr. played a significant role in overcoming bias against new technologies within IBM’s engineering, sales, and future demands divisions and brought IBM fully into the computer age.

By 1950, IBM had two computer projects in progress.  The first had been started in 1948 when Tom Watson, Sr. ordered his engineers to adapt the SSEC into something cheaper that could be mass produced and sold to IBM’s business customers.  With James Bryce incapacitated — he would die the next year — the responsibility of shaping the new machine fell to Wallace Eckert, Frank Hamilton, and John McPherson, an IBM vice president who had been instrumental in constructing two powerful relay calculators for the Aberdeen Proving Grounds during World War II.  The trio decided to create a machine focused on scientific and engineering applications, both because this was their primary area of expertise and because with the dawn of the Cold War the United States government was funding over a dozen scientific computing projects to maintain the technological edge it had built during World War II.  There was a real fear that if IBM did not stay relevant in this area, one of these projects could birth a company capable of challenging IBM’s dominant position in business machines.

Hamilton acted as the chief engineer on the project and chose to increase the system’s memory capacity by incorporating magnetic drum storage, thus leading to the machine’s designation as the Magnetic Drum Calculator (MDC). While the MDC began life as a calculator essentially pairing an IBM 603 with a magnetic drum, the realization that drum memory was expansive enough that a paper tape reader could be discarded entirely and instructions could be read and modified directly from the drum itself caused the project to morph into a full-fledged computer.  By early 1950, engineering work had commenced on the MDC, but development soon stalled as it became the focus of fights between multiple engineering teams as well as the sales and future demands departments over its specifications, target audience, and potential commercial performance.

While work continued on the MDC in Endicott, several IBM engineers in the electronics laboratory in Poughkeepsie initiated their own experiments related to computer technology.  In 1948, an engineer named Philip Fox began studying alternatives to vacuum tube memory that would allow for a stored-program computer.  Learning of the Williams Tube that same year, he decided to focus his attention on CRT memory.  Fox created a machine called the Test Assembly on which he worked to improve the reliability of existing CRT memory solutions.  Meanwhile, in early 1949, a new employee named Nathaniel Rochester, who was dismayed that IBM did not already have a stored-program computer in production, began researching the capabilities of magnetic tape as a storage medium.  These disparate threads came together in October 1949 when a decision was made to focus on the development of a tape machine to challenge the UNIVAC, which appeared poised to grab a share of IBM’s data processing business.  By March 1950, Rochester and Werner Buchholz had completed a technical outline of the Tape Processing Machine (TPM), which would incorporate both CRT and tape memory.  As with the MDC, however, sales and future demands’ inability to clearly define a market for the computer hindered its development.

A breakthrough in the stalemate between sales and engineering finally occurred with the outbreak of the Korean War.  As he had when the United States entered World War II, Tom Watson, Sr. placed the full capabilities of the company at the disposal of the United States government.  The United States Air Force quickly responded that it wanted help developing a new electro-mechanical bombsight for the B-47 Bomber, but Tom Watson, Jr., who already believed IBM was not embracing electronics fast enough, felt working on electro-mechanical projects to be a giant step backwards for the company.  Instead, he proposed developing an electronic computer suitable for scientific computation by government organizations and contractors.

Initially, IBM considered adapting the TPM for its new scientific computer project, but quickly abandoned the idea.  To save on cost, the engineering team of the TPM had decided to design the computer to process numbers serially rather than in parallel, which was sufficient for data processing, but made the machine too slow to meet the computational needs of the government.  Therefore, in September 1950 Ralph Palmer’s engineers drew up preliminary plans for a floating-point decimal computer hooked up to an array of tape readers and other auxiliary devices that would be capable of well over 10,000 operations a second and of storing 2,000 thirteen-digit words in Williams Tube memory.  Watson Jr. approved this project in January 1951 under the moniker “Defense Calculator.”  With a tight deadline of spring 1952 in place for the Defense Calculator so it would be operational in time to contribute to the war effort, Palmer realized the engineering team, led by Nathaniel Rochester and Jerrier Haddad, could not afford to start from scratch on the design of the new computer, so they decided to base the architecture on von Neumann’s IAS Machine.


The IBM 702, IBM’s first computer targeted at businesses

On April 29, 1952, Tom Watson, Sr. announced the existence of the Defense Calculator to IBM’s shareholders at the company’s annual meeting.  In December, the first completed model was installed at IBM headquarters in the berth occupied until then by the SSEC.  On April 7, 1953, the company staged a public unveiling of the Defense Calculator under the name IBM 701 Electronic Data Processing Machine four days after the first production model had been delivered to the Los Alamos National Laboratory in New Mexico.  By April 1955, when production ceased, IBM had completed nineteen installations of the 701 — mostly at government organizations and defense contractors like Boeing and Lockheed — at a rental cost of $15,000 a month.

The success of the 701 finally broke the computing logjam at IBM.  The TPM, which had been on the back burner as the Defense Calculator project gained steam, was redesigned for faster operation and announced in September 1953 as the IBM 702, although the first model was not installed until July 1955.  Unlike the 701, which borrowed the binary numeral system from the IAS Machine, the 702 used the decimal system as befit its descent from the 603 and 604 electronic calculators.  It also shipped with a newly developed high speed printer capable of outputting 1,000 lines per minute.  IBM positioned the 702 as a business machine to compete with the UNIVAC I and ultimately installed fourteen of them.  Meanwhile, IBM also reinstated the MDC project — which had stalled almost completely — in November 1952, which saw release in 1954 as the IBM 650.  While the drum memory used in the 650 was slower than the Williams Tube memory of the 701 and 702, it was also more reliable and cheaper, allowing IBM to lease the 650 at the relatively low cost of $3,250 a month.  As a result, it became IBM’s first breakout success in the computer field, with nearly 2,000 installed by the time the last one rolled off the assembly line in 1962.

IBM’s 700 series computers enjoyed several distinct advantages over the UNIVAC I and UNIVAC 1103 computers marketed by Remington Rand.  Technologically, Williams Tube memory was both more reliable and significantly faster than the mercury delay line memory and drum memory used in the UNIVAC machines, while the magnetic tape system developed by IBM was also superior to the one used by Remington Rand.  Furthermore, IBM designed its computers to be modular, making them far easier to ship and install than the monolithic UNIVAC system.  Finally, IBM had built one of the finest sales and product servicing organizations in the world, making it difficult for Remington Rand to compete for customers.  While UNIVAC models held a small 30 to 24 install base edge over the 700 series computers as late as August 1955, IBM continued to improve the 700 line through newly emerging technologies and just a year later moved into the lead with 66 700 series installations versus 46 UNIVAC installations.  Meanwhile, installations of the 650 far eclipsed any comparable model, giving IBM control of the low end of the computer market as well.  The company would remain the number one computer maker in the world throughout the mainframe era.

Historical Interlude: The Birth of the Computer Part 2, The Creation of the Electronic Digital Computer

In the mid-nineteenth century, Charles Babbage attempted to create a program-controlled universal calculating machine, but failed for lack of funding and the difficulty of creating the required mechanical components.  This failure spelled the end of digital computer research for several decades.  By the early twentieth century, however, fashioning small mechanical components no longer presented the same challenge, while the spread of electricity generating technologies provided a far more practical power source than the steam engines of Babbage’s day.  These advances culminated in just over a decade of sustained innovation between 1937 and 1949 out of which the electronic digital computer was born.  While both individual computer components and the manner in which the user interacts with the machine have continued to evolve, the desktops, laptops, tablets, smartphones, and video game consoles of today still function according to the same basic principles as the Manchester Mark 1, EDSAC, and EDVAC computers that first operated in 1949.  This blog post will chart the path to these three computers.

Note: This is the second of four “historical interlude” posts that will summarize the evolution of computer technology between 1830 and 1960.  The information in this post is largely drawn from Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray, The Maverick and His Machine: Thomas Watson, Sr. and the Making of IBM by Kevin Maney, Reckoners: The Prehistory of the Digital Computer, From Relays to the Stored Program Concept, 1935-1945 by Paul Ceruzzi, The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution by Walter Isaacson, Forbes Greatest Technology Stories: Inspiring Tales of Entrepreneurs and Inventors Who Revolutionized Modern Business by Jeffrey Young, and the articles “Alan Turing: Father of the Modern Computer” by B. Jack Copeland and Diane Proudfoot, “Colossus: The First Large Scale Electronic Computer” by Jack Copeland, and “A Brief History of Computing,” also by Copeland.

Analog Computing

Vannevar Bush with his differential analyzer, an analog computer

While a digital computer after the example of Babbage would not appear until the early 1940s, specialized computing devices that modeled specific systems mechanically continued to be developed in the late nineteenth and early twentieth centuries.  These machines were labelled analog computers, a term derived from the word “analogy” because each machine relied on a physical model of the phenomenon being studied to perform calculations, unlike a digital computer that relied purely on numbers.  The key component of these machines was the wheel-and-disc integrator, first described by James Thomson, that allowed integral calculus to be performed mechanically.  Perhaps the most important analog computer of the nineteenth century was completed by James’s brother William, better known to history as Lord Kelvin, in 1876.  Called the tide predictor, Kelvin’s device relied on a series of mechanical parts such as pulleys and gears to simulate the gravitational forces that produce the tides and predict the water depth of a harbor at any given time of day, printing the results on a roll of paper.  Before Lord Kelvin’s machine, creating tide tables was so time-consuming that only the most important ports were ever charted.  After Kelvin’s device entered general use, it was finally possible to complete tables for thousands of ports around the world.  Improved versions of Kelvin’s computer continued to be used until the 1950s.
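
In modern terms, Kelvin’s machine mechanically evaluated a harmonic series: a sum of sinusoidal tidal constituents, each driven by a gear turning at one astronomical frequency.  The sketch below shows the idea in Python; the amplitudes and phases are invented for illustration, since real values must be fitted to observations for each individual harbor.

```python
import math

# Kelvin's tide predictor summed a set of sinusoidal "constituents,"
# each generated by a pulley driven at one astronomical frequency.
# In modern notation it evaluated:
#   h(t) = h0 + sum_i A_i * cos(w_i * t + p_i)
# The values below are hypothetical, chosen only to illustrate.

CONSTITUENTS = [
    # (amplitude in meters, speed in degrees/hour, phase in degrees)
    (1.20, 28.984, 40.0),   # M2: principal lunar semidiurnal
    (0.45, 30.000, 65.0),   # S2: principal solar semidiurnal
    (0.25, 15.041, 110.0),  # K1: lunisolar diurnal
]
MEAN_LEVEL = 2.0  # mean water level in meters above datum

def tide_height(t_hours):
    """Predicted water depth t_hours after the reference epoch."""
    return MEAN_LEVEL + sum(
        a * math.cos(math.radians(speed * t_hours + phase))
        for a, speed, phase in CONSTITUENTS
    )

# Print a short tide table, the kind of output Kelvin's machine
# traced onto its paper roll.
for hour in range(0, 24, 6):
    print(f"t={hour:2d}h  height={tide_height(hour):.2f} m")
```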

In the United States, interest in analog computing began to take off in the 1920s as General Electric and Westinghouse raced to build regional electric power networks by supplying alternating-current generators to power plants.  At the time, the mathematical equations required to construct the power grids were both poorly understood and difficult to solve by hand, causing electrical engineers to turn to analog computing as a solution.  Using resistors, capacitors, and inductors, these computers could simulate how the network would behave in the real world.  One of the most elaborate of these computers, the AC Network Analyzer, was built at MIT in 1930 and took up an entire room.  With one of the finest electrical engineering schools in the country, MIT quickly became a center for analog computer research, which soon moved from highly specific models like the tide predictor and power grid machines to devices capable of solving a wider array of mathematical problems through the work of MIT professor Vannevar Bush.

One of the most important American scientists of the mid-twentieth century, Bush possessed a brilliant mind coupled with a folksy demeanor and strong administrative skills.  These traits served him well in co-founding the American Appliance Company in 1922 — which later changed its name to Raytheon and became one of the largest defense contractors in the world — and led to his appointment in 1941 to head the new Office of Scientific Research and Development, which oversaw and coordinated all wartime scientific research by the United States government during World War II and was instrumental to the Allied victory.

Bush built his first analog computer in 1912 while a doctoral student at Tufts College.  Called the “profile tracer,” it consisted of a box hung between two bicycle wheels and would trace the contours of the ground as it was rolled.  Moving on to MIT in 1919, Bush worked on problems involving electric power transmission and in 1924 developed a device with one of his students called the “product integraph” to simplify the solving and plotting of the first-order differential equations required for that work.  Another student, Harold Hazen, suggested this machine be extended to solve second-order differential equations as well, which would make the device useful for solving a wide array of physics problems.  Bush immediately recognized the potential of this machine and worked with Hazen to build it between 1928 and 1931.  Bush called the resulting machine the “differential analyzer.”

The differential analyzer improved the operation of Thomson’s wheel-and-disc integrator through a device called a torque amplifier, allowing it to mechanically model, solve, and plot a wider array of differential equations than any analog computer that came before, but it still fell short of the Babbage ideal of a general-purpose digital device.  Nevertheless, the machine was installed at several universities, corporations, and government laboratories and demonstrated the value of using a computing device to perform advanced scientific calculations.  It was therefore an important stepping stone on the path to the digital computer.

Electro-Mechanical Digital Computers

The Automatic Sequence Controlled Calculator (ASCC), also known as the Harvard Mark I, the first proposed electro-mechanical digital computer, though not the first completed

With problems like power network construction requiring ever more complex equations and the looming threat of World War II requiring world governments to compile large numbers of ballistics tables and engage in complex code-breaking operations, the demand for computing skyrocketed in the late 1930s and early 1940s.  This led to a massive expansion of human computing and the establishment of the first for-profit calculating companies, beginning with L.J. Comrie’s Scientific Computing Services Limited in 1937.  Even as computing services were expanding, however, the armies of human computers required for wartime tasks were woefully inadequate for completing necessary computations in a timely manner, while even more advanced analog computers like the differential analyzer were still too limited to carry out many important tasks.  It was in this environment that researchers in the United States, Great Britain, and Germany began attempting to address this computing shortfall by designing digital calculating machines that worked similarly to Babbage’s Analytical Engine but made use of more advanced components not available to the British mathematician.

The earliest digital calculating machines were based on electromechanical relay technology.  First developed in the mid-nineteenth century for use in the electric telegraph, a relay consists in its simplest form of a coil of wire, an armature, and a set of contacts.  When a current is passed through the coil, a magnetic field is generated that attracts the armature and therefore draws the contacts together, completing a circuit.  When the current is removed, a spring causes the armature to return to the open position.  Electromechanical relays played a crucial role in the telephone network in the United States, routing calls between different parts of the network.  Therefore, Bell Labs, the research arm of the telephone monopoly AT&T, served as a major hub for relay research and was one of the first places where the potential of relays and similar switching units for computer construction was contemplated.

The concept of the binary digital circuit, which continues to power computers to this day, was independently articulated and applied by several scientists and mathematicians in the late 1930s.  Perhaps the most influential of these thinkers — due to his work being published and widely disseminated — was Claude Shannon.  A graduate of the University of Michigan with degrees in electrical engineering and math, Shannon matriculated at MIT, where he secured a job helping Bush run his differential analyzer.  In 1937, Shannon took a summer job at Bell Labs, where he gained hands-on experience with the relays used in the phone network and connected their function with another interest of his — the symbolic logic system created by mathematician George Boole in the 1840s.

Basically, Boole had discovered a way to represent formal logical statements mathematically by giving a true proposition a value of 1 and a false proposition a value of 0 and then constructing mathematical equations that could represent the basic logical operations such as “and,” “or” and “not.”  Shannon realized that since a relay either existed in an “on” or an “off” state, a series of relays could be used to construct logic gates that emulated Boolean logic and therefore carry out complex instructions, which in their most basic form are a series of “yes” or “no,” “on” or “off,” “1” or “0” propositions.  When Shannon returned to MIT that fall, Bush urged him to include these findings in his master’s thesis, which was published later that year under the name “A Symbolic Analysis of Relay and Switching Circuits.”  In November 1937, a Bell Labs researcher named George Stibitz, who was aware of Shannon’s theories, applied the concept of binary circuits to a calculating device for the first time when he constructed a small relay calculator he dubbed the K-Model because he built it at his kitchen table.  Based on this prototype, Stibitz received permission to build a full-sized model at Bell Labs, which was named the Complex Number Calculator and completed in 1940.  While not a full-fledged programmable computer, Stibitz’s machine was the first to use relays to perform basic mathematical operations and demonstrated the potential of relays and binary circuits for computing devices.
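
Shannon’s correspondence between relay circuits and Boolean logic can be sketched in a few lines of modern code.  The functions below are an illustrative simplification, with Python booleans standing in for physical relays, not Shannon’s own notation.

```python
# A relay is a switch: energizing the coil (True) closes its contacts.
# Shannon's insight: wiring contacts in series implements AND, wiring
# them in parallel implements OR, and a normally-closed contact gives NOT.

def AND(a, b):
    # series circuit: current flows only if both sets of contacts are closed
    return a and b

def OR(a, b):
    # parallel circuit: current flows if either set of contacts is closed
    return a or b

def NOT(a):
    # normally-closed contact: opens when the coil is energized
    return not a

# Any logical proposition can be composed from these gates, e.g. XOR:
def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

# Truth table for XOR, with 1/0 standing in for Boole's true/false
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "->", int(XOR(a, b)))
```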

One of the earliest digital computers to use electromechanical relays was proposed by Howard Aiken in 1936.  A doctoral candidate in mathematics at Harvard University, Aiken needed to solve a series of non-linear differential equations as part of his dissertation, which was beyond the capabilities of Bush’s differential analyzer at neighboring MIT.  Unenthused by the prospect of solving these equations by hand, Aiken, who was already a skilled electrical engineer, proposed that Harvard build a large-scale digital calculator to do the work.  The university turned him down, so Aiken approached the Monroe Calculating Machine Company, which also failed to see any value in the project.  Monroe’s chief engineer felt the idea had merit, however, and urged Aiken to approach IBM.

When last we left IBM in 1928, the company was growing and profitable, but lagged behind several other companies in overall size and importance.  That all changed with the onset of the Great Depression.  Like nearly every other business in the country, IBM was devastated by the market crash of 1929, but Tom Watson decided to boldly soldier on without laying off workers or cutting production, keeping his faith that the economy could not continue in a tailspin for long.  He also increased the company’s emphasis on R&D, building one of the world’s first corporate research laboratories to house all his engineers in Endicott, New York in 1932-33 at a cost of $1 million.  As the Depression dragged on, machines began piling up in the factories and IBM’s growth flattened, threatening the solvency of the company.  Watson’s gambles increasingly appeared to be a mistake, but then President Franklin Roosevelt began enacting his New Deal legislation.

In 1935, the United States Congress passed the Social Security Act.  Overnight, every company in the country was required to keep detailed payroll records, while the Social Security Administration had to keep a file on every worker in the nation.  The data processing burden of the act was enormous, and IBM, with its large stock of tabulating machines and fully operational factories, was the only company able to begin filling the demand immediately.  Between 1935 and 1937, IBM’s revenues rose from $19 million to $31 million and then continued to grow for the next 45 years.  The company was never seriously challenged in tabulating equipment again.

Traditionally, data processing revolved around counting tangible objects, but by the time Aiken approached IBM Watson had begun to realize that scientific computing was a natural extension of his company’s business activities.  The man who turned Watson on to this fact was Ben Wood, a Columbia professor who pioneered standardized testing and was looking to automate the scoring of his tests using tabulating equipment.  In 1928, Wood wrote ten companies to win support for his ideas, but only Watson responded, agreeing to grant him an hour to make his pitch.  The meeting began poorly as the nervous Wood failed to hold Watson’s interest with talk of test scoring, so the professor expanded his presentation to describe how nearly anything could be represented mathematically and therefore quantified by IBM’s machines.  One hour soon stretched to over five as Watson grilled Wood and came to see the value of creating machines for the scientific community.  Watson agreed to give Wood all the equipment he needed, dropped in frequently to monitor Wood’s progress, and made the professor an IBM consultant.  As a result of this meeting, IBM began supplying equipment to scientific labs around the world.

Howard Aiken, designer of the Automatic Sequence Controlled Calculator

In 1937, Watson began courting Harvard, hoping to create the same kind of relationship he had long enjoyed with Columbia.  He dispatched an executive named John Phillips to meet with deans and faculty, and Aiken used the opportunity to introduce IBM to his calculating device.  He also wrote a letter to James Bryce, IBM’s chief engineer, who sold Watson on the concept.  Bryce assigned Clair Lake to oversee the project, which would be funded and built by IBM in Endicott according to Aiken’s design and then installed at Harvard.

Aiken’s initial concept basically stitched together a card reader, a multiplying punch, and a printer, removing human intervention in the process by connecting the components through electrical wiring and incorporating relays as switching units to control the passage of information through the parts of the machine.  Aiken drew inspiration from Babbage’s Analytical Engine, which he first learned about soon after proposing his device, when a technician informed him that Harvard actually owned a fragment of one of Babbage’s calculating machines that had been donated by the inventor’s son in 1886.  Unlike Babbage, however, Aiken did not employ separate memory and computing elements, as all calculations were performed across a series of 72 accumulators that both stored and modified the data transmitted to them by the relays.  Without something akin to a CPU, the machine was actually less advanced than the Analytical Engine in that it did not support conditional branching — the ability to alter the course of a program on the fly to incorporate the results of previous calculations — and therefore required all calculations to be done in a set sequence, forcing complex programs to use large instruction sets and long lengths of paper tape.
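
The practical cost of omitting conditional branching can be sketched in a few lines of Python.  This is a modern illustration of the general idea, not a model of the ASCC’s actual instruction format.

```python
# Why conditional branching matters: without it, a machine executes a
# fixed sequence of instructions from tape, so repeating a step N times
# means spelling the step out N times on the tape.  With a branch, the
# same step can be encoded once and re-entered via a test.

# Fixed-sequence style: every iteration written out in advance
def sum_first_four_fixed():
    total = 0
    total = total + 1
    total = total + 2
    total = total + 3
    total = total + 4
    return total

# Branching style: one short instruction sequence handles any n
def sum_first_n(n):
    total, i = 0, 1
    while i <= n:      # the conditional branch
        total += i
        i += 1
    return total

assert sum_first_four_fixed() == sum_first_n(4) == 10
```

Note that the fixed-sequence version must grow linearly with the size of the problem, which is exactly why complex ASCC programs required long lengths of paper tape.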

Work began on the Automatic Sequence Controlled Calculator (ASCC) Mark I in 1939, but the onset of World War II resulted in the project being placed on the back burner as IBM shifted its focus to more important war work and Aiken entered the Navy.  It was finally completed in January 1943 at a cost of $500,000 and subsequently installed at Harvard in early 1944 after undergoing a year of testing in Endicott.  Measuring 8 feet tall and 51 feet long, the machine was housed in a gleaming metal case designed by Norman Bel Geddes, known for his art deco works such as the Metropolitan Opera House in New York.  By the time of its completion, the ASCC already lagged behind several other machines technologically and therefore did not play a significant role in the further evolution of the computer.  It is notable, however, both as the earliest proposed digital computer to actually be built and as IBM’s introduction to the world of computing.

Konrad Zuse, designer of the Z1, the first completed digital computer

While Howard Aiken was still securing support for his digital computer, a German named Konrad Zuse was busy completing one of his own.  Born in Berlin, Zuse spent most of his childhood in Braunsberg, East Prussia (modern Braniewo, Poland).  Deciding on a career as an engineer, he enrolled at the Technical College of Berlin-Charlottenburg in 1927.  While not particularly interested in mathematics, Zuse did have to work with complex equations to calculate the load-bearing capability of structures, and like Aiken across the Atlantic he was not enthused at having to perform these calculations by hand.  Therefore, in 1935 Zuse began designing a universal automatic calculator consisting of a computing element, a storage unit, and a punched tape reader, independently arriving at the same basic design that Babbage had developed a century before.

While Zuse’s basic concept did not stray far from Babbage’s, he did incorporate one crucial improvement in his design that neither Babbage nor Aiken had considered: storing the numbers in memory according to a binary rather than a decimal system.  Zuse’s reason for doing so was practical — as an accomplished mechanical engineer he preferred keeping his components as simple as possible to make the computer easier to design and build — but the implications of this decision went far beyond streamlined memory construction.  Like Shannon, Zuse realized that by recognizing data in only two states, on and off, a computing device could represent not just numbers, but also instructions.  As a result, Zuse was able to use the same basic building blocks for both his memory and computing elements, simplifying the design further.
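
Zuse’s insight that the same two-state cells can hold either a number or an instruction can be illustrated with a short sketch.  The 2-bit opcode format below is invented purely for the example and does not reflect Zuse’s actual instruction encoding.

```python
# Illustrative sketch (not Zuse's encoding): the same row of two-state
# cells can be read either as a number or as an instruction.

bits = "00000110"  # eight relays/cells, each on (1) or off (0)

# Read the cells as an unsigned binary number:
value = int(bits, 2)
print(value)  # -> 6

# Read the same cells as an instruction under a made-up format:
# first two bits select an operation, the rest name an operand.
OPCODES = {"00": "LOAD", "01": "ADD", "10": "STORE", "11": "HALT"}
opcode, operand = bits[:2], int(bits[2:], 2)
print(OPCODES[opcode], operand)  # -> LOAD 6
```

Because both interpretations use identical storage, one kind of hardware building block suffices for memory and control alike, which is exactly the simplification Zuse was after.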

By 1938, Zuse had completed his first computer, a mechanical binary digital machine called the Z1. (Note: Originally, Zuse called this computer the V1 and continued to use the “V” designation on his subsequent computers.  After World War II, he began referring to these machines using the “Z” designation instead to avoid confusion with Germany’s V1 and V2 rockets.)  This first prototype was fairly basic, but it proved two things for Zuse: that he could create a working automatic calculating device and that the computing element could not be mechanical, as the components were just too unreliable.  The solution to this problem came from college friend Helmut Schreyer, an electrical engineer who convinced Zuse that the electrical relays used in telephone networks would provide superior performance.  Schreyer also worked as a film projectionist and convinced Zuse to switch from paper tape to punched film stock for program control.  These improvements were incorporated into the Z2 computer, completed in 1939, which never worked reliably, but was essential for securing funding for Zuse’s next endeavor.

A reconstruction of Konrad Zuse’s Z3, the world’s first programmable fully automatic digital computer

In 1941, Konrad Zuse completed the Z3 for the German government, the first fully operational digital computer in the world.  The computer consisted of two cabinets containing roughly 2,600 relays — 1,800 for memory, 600 for computing, and 200 for the tape reader — and a small display/keyboard unit for inputting programs.  With a memory of only 64 words, the computer was too limited to carry out useful work, but it served as an important proof of concept and illustrated the potential of a programmable binary computer.

Unfortunately for Zuse, the German government proved uninterested in further research.  Busy fighting a war it was convinced would be over in just a year or two, the Third Reich limited its research activities to projects that could directly impact the war effort in the short term and ignored the potential of computing entirely.  While Zuse continued to work on the next evolution of his computer design, the Z4, between 1942 and 1945, he did so on his own without the support of the Reich, which also turned down a computer project by his friend Schreyer that would have replaced relays with electronics.  Isolated from the rest of the developed world by the war, Zuse’s theories would have little impact on subsequent developments in computing, while the Z3 itself was destroyed in an Allied bombing raid on Berlin in 1943 before it could be studied by other engineers.  That same year, Great Britain’s more enthusiastic support of computer research resulted in the next major breakthrough in computing technology.

The Birth of the Electronic Computer

Colossus, the world’s first programmable electronic computer

Despite the best efforts of Aiken and Zuse, relays were never going to play a large role in computing, as they were both unreliable and slow due to a reliance on moving parts.  In order for complex calculations to be completed quickly, computers would need to transition from electro-mechanical components to electronic ones, which function instead by manipulating a beam of electrons.

The development of the first electronic components grew naturally out of Thomas Edison’s work with the incandescent light bulb.  In 1880, Edison was conducting experiments to determine why the filament in his new incandescent lamps would sometimes break and noticed that a current would flow across the vacuum from the hot filament to a separate metal plate, but only when the plate was positively charged.  Although this effect had been observed by other scientists as early as 1873, Edison was the first to patent a voltage-regulating device based on this principle in 1883, which resulted in the phenomenon being named the “Edison effect.”

Edison, who did not have a solid grasp of the underlying science, did not follow up on his discovery.  In 1904, however, John Fleming, a consultant with the Marconi Company engaged in research relating to wireless telegraphy, realized that the Edison effect could be harnessed to create a device that would only allow the flow of electric current in one direction and thus serve as a rectifier that converted a weak alternating current into a direct current.  This would in turn allow a receiver to be more sensitive to radio waves, thus making reliable trans-Atlantic wireless communication possible.  Based on his research, Fleming created the first diode, the Fleming Valve, in which an electric current was passed in one direction from a negatively-charged cathode to a positively-charged anode through a vacuum-sealed glass container.  The vacuum tube concept invented by Fleming remained the primary building block of electronic devices for the next fifty years.

In 1906, an American electrical engineer named Lee DeForest working independently of Fleming began creating his own series of electron tubes, which he called Audions.  DeForest’s major breakthrough was the development of the triode, which used a third electrode called a grid that could control the voltage of the current in the tube and therefore serve as an amplifier to boost the power of a signal.  DeForest’s tube contained gas at low pressure, which inhibited reliable operation, but by 1913 the first vacuum tube triodes had been developed.  In 1918, British physicists William Eccles and F.W. Jordan used two triodes to create the Eccles-Jordan circuit, which could flip between two states like an electrical relay and therefore serve as a switching device.
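
A functional analogue of the Eccles-Jordan circuit can be sketched as a set-reset latch built from two cross-coupled NOR gates.  This is a modern logic-level stand-in for the original pair of cross-coupled triodes, not a circuit-level model.

```python
# A rough functional analogue of the Eccles-Jordan circuit: two
# cross-coupled NOR gates form a set-reset latch that, like the
# original two-triode circuit, rests in one of two stable states
# and so can serve as a one-bit switching/storage element.

def sr_latch(q, set_, reset):
    """Settle the latch given its current output and the two inputs."""
    for _ in range(4):             # iterate until the feedback loop settles
        q_bar = not (set_ or q)    # NOR gate 1
        q = not (reset or q_bar)   # NOR gate 2
    return q

q = False
q = sr_latch(q, set_=True, reset=False)   # pulse "set"
print(q)   # latch now holds True...
q = sr_latch(q, set_=False, reset=False)  # inputs released
print(q)   # ...and remembers it: still True
q = sr_latch(q, set_=False, reset=True)   # pulse "reset"
print(q)   # latch flips back to False
```

The key property, shown in the middle step, is that the circuit remembers its state after the inputs are released, which is what made the flip-flop usable as a switching device in place of a relay.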

Even after the invention of the Eccles-Jordan circuit, few computer pioneers considered using vacuum tubes in their devices.  Conventional wisdom held they were unsuited for large-scale projects because a triode contains a filament that generates a great deal of heat and is prone to burnout.  Consequently, the failure rate would be unacceptable in a device requiring thousands of tubes.  One of the first people to challenge this view was a British electrical engineer named Thomas Flowers.

Tommy Flowers, the designer of Colossus

Born in London’s East End, Flowers, the son of a bricklayer, simultaneously took an apprenticeship in mechanical engineering at the Royal Arsenal, Woolwich, while attending evening classes at the University of London.  After graduating with a degree in electrical engineering, Flowers took a job with the telecommunications branch of the General Post Office (GPO) in 1926.  In 1930, he was posted to the GPO Research Branch at Dollis Hill, where he established a reputation as a brilliant engineer and achieved rapid promotion.

In the early 1930s, Flowers began conducting research into the use of electronics to replace relays in telephone switchboards.  Counter to conventional wisdom, Flowers realized that vacuum tube burnout usually occurred when a device was switched on and off frequently.  In a switchboard or computer, the vacuum tubes could remain in continuous operation for extended periods once switched on, thus greatly increasing their longevity.  Before long, Flowers began experimenting with equipment containing as many as 3,000 vacuum tubes.  Flowers would make the move from switchboards to computing devices with the onset of World War II.

With the threat of Nazi Germany rising in the late 1930s, the United Kingdom began devoting more resources to cracking German military codes.  Previously, this work had been carried out in London at His Majesty’s Government Code and Cypher School, which was staffed with literary scholars rather than cryptographic experts.  In 1938, however, MI6, the British Intelligence Service, purchased a country manor called Bletchley Park, near the intersection of the rail lines connecting Oxford and Cambridge and London and Birmingham, to serve as a cryptographic and code-breaking facility.  The next year, the government began hiring mathematicians to seriously engage in code-breaking activities.  The work conducted at the manor has been credited with shortening the war in Europe and saving countless lives. It also resulted in the development of the first electronic computer.

Today, the Enigma Code, broken by a team led by Alan Turing, is the most celebrated of the German ciphers decrypted at Bletchley, but this was actually just one of several systems used by the Reich and was not even the most complicated.  In mid-1942, Germany initiated general use of the Lorenz Cipher, which was reserved for messages between the German High Command and high-level army commands, as the encryption machine — which the British code-named “Tunny” — was not easily portable like the Enigma Machine.  In 1942, Bletchley established a section dedicated to breaking the cipher, and by November a system called the “statistical method” had been developed by William Tutte to crack the code, which built on earlier work by Turing.  When Tutte presented his method, mathematician Max Newman decided to establish a new section — soon labelled the Newmanry — to apply the statistical method with electronic machines.  Newman’s first electronic codebreaking machine, the Heath Robinson, was both slow and unreliable, but it worked well enough to prove that Newman was on the right track.

Meanwhile, Flowers joined the code-breaking effort in 1941 when Alan Turing enlisted Dollis Hill to create some equipment for use in conjunction with the Bombe, his Enigma-cracking machine.  Turing was greatly impressed by Flowers, so when Dollis Hill encountered difficulty crafting a combining unit for the Heath Robinson, Turing suggested that Flowers be called in to help.  Flowers, however, doubted that the Heath Robinson would ever work properly, so in February 1943 he proposed the construction of an electronic computer to do the work instead.  Bletchley Park rejected the proposal based on existing prejudices over the unreliability of tubes, so Flowers began building the machine himself at Dollis Hill.  Once the computer was operational, Bletchley saw the value in it and accepted the machine.

Installed at Bletchley Park in January 1944, Flowers’s computer, dubbed Colossus, contained 1,600 vacuum tubes and processed 5,000 characters per second, a limit imposed not by the speed of the computer itself, but rather by the speed at which the reader could safely operate without risk of destroying the paper tape.  In June 1944, Flowers completed the first Colossus II computer, which contained 2,400 tubes and used an early form of shift register to perform five simultaneous operations, and therefore operated at a speed of 25,000 characters per second.  The Colossi were not general purpose computers, as they were dedicated solely to a single code-breaking operation, but they were program-controlled.  Unlike electro-mechanical computers, however, electronic computers process information too quickly to accept instructions from punched cards or paper tape, so the Colossus actually had to be rewired using plugs and switches to run a different program, a time-consuming process.
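
The role of the shift register can be illustrated generically: it holds several successive input characters at once, so that parallel circuits can process all of them in a single clock tick.  The sketch below is a generic illustration of a shift register, not a model of Colossus’s actual hardware.

```python
# Generic illustration: a shift register holds the last N inputs
# simultaneously.  Five parallel processing units reading its five
# cells can then work on five characters per tick instead of one.

from collections import deque

class ShiftRegister:
    def __init__(self, stages):
        # one cell per stage, initially empty
        self.cells = deque([None] * stages, maxlen=stages)

    def shift_in(self, ch):
        # new value enters one end; the oldest falls off the other
        self.cells.appendleft(ch)
        return list(self.cells)

reg = ShiftRegister(5)
for ch in "ABCDEFG":
    window = reg.shift_in(ch)

# After the final shift the register holds the last five characters,
# all available to parallel circuits at the same moment:
print(window)  # -> ['G', 'F', 'E', 'D', 'C']
```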

As the first programmable electronic computer, Colossus was an incredibly significant advance, but it ultimately exerted virtually no influence on future computer design.  By the end of the war, Bletchley Park was operating nine Colossus II computers alongside the original Colossus to break Tunny codes, but after Germany surrendered, Prime Minister Winston Churchill ordered the majority of the machines dismantled and kept the entire project classified.  It was not until the 1970s that most people knew that Colossus had even existed, and the full function of the machine remained unknown until 1996.  Therefore, instead of Flowers being recognized as the inventor of the electronic computer, that distinction was held for decades by a group of Americans working at the Moore School of the University of Pennsylvania.

ENIAC

The Electronic Numerical Integrator and Computer (ENIAC), the first widely known electronic computer

In 1935, the United States Army established a new Ballistic Research Laboratory (BRL) at the Aberdeen Proving Grounds in Maryland dedicated to calculating ballistics tables for artillery.  With modern guns capable of lofting projectiles at targets many miles away, properly aiming them required the application of complex differential equations, so the BRL assembled a staff of thirty to create trajectory tables for various ranges, which would be compiled into books for artillery officers.  Aberdeen soon installed one of Bush’s differential analyzers to help compute the tables, but the onset of World War II overwhelmed the lab’s capabilities.  Therefore, it began contracting some of its table-making work with the Moore School, the closest institution with its own differential analyzer.

The Moore School of Electrical Engineering of the University of Pennsylvania enjoyed a fine reputation, but it carried nowhere near the prestige of MIT and therefore did not receive the same level of funding support from the War Department for military projects.  It did, however, place itself on a war footing by accelerating degree programs through the elimination of vacations and instituting a series of war-related training and research programs.  One of these was the Engineering, Science, and Management War Training (ESMWT) program, an intensive ten-week course designed to familiarize physicists and mathematicians with electronics to address a manpower shortfall in technical fields.  One of the graduates of this course was John Mauchly, a physics instructor at a nearby college.

Born in Cincinnati, Ohio, John William Mauchly grew up in Chevy Chase, Maryland, after his physicist father became the research chief for the Department of Terrestrial Magnetism of the Carnegie Institution, a foundation established in Washington, D.C. to support scientific research around the country.  Sebastian Mauchly specialized in recording atmospheric electrical conditions to further weather research, so John became particularly interested in meteorology.  After completing a Ph.D. at Johns Hopkins University in 1932, Mauchly took a position at Ursinus College, a small Philadelphia-area institution, where he studied the effects of solar flares and sunspots on long-range weather patterns.  Like Aiken and Zuse before him, Mauchly grew tired of solving the complex equations required for his research and began to dream of building a machine to automate the process.  After viewing an IBM electric calculating machine and a vacuum tube encryption machine at the 1939 World’s Fair, Mauchly felt electronics would provide the solution, so he began taking a night course in electronics and crafting his own experimental circuits and components.  In December 1940, Mauchly gave a lecture to the American Association for the Advancement of Science articulating his hopes of building a weather prediction computer.  After the lecture, he met an Iowa State College professor named John Atanasoff, who would play an important role in opening Mauchly’s eyes to the potential of electronics by inviting him out to Iowa State to study a computer project he had been working on for several years.


The Atanasoff-Berry Computer (ABC), the first electronic computer project, which was never completed

A graduate of Iowa State College who earned a Ph.D. in theoretical physics from the University of Wisconsin-Madison in 1930, John Atanasoff, like Howard Aiken, was drawn to computing through the frustration of solving equations for his dissertation.  In the early 1930s, Atanasoff experimented with tabulating machines and analog computing to make solving complex equations easier, culminating in a decision in December 1937 to create a fully automatic electronic digital computer.  Like Shannon and Zuse, Atanasoff independently arrived at binary digital circuits as the most efficient way to do calculations, remembering childhood lessons from his mother, a former school teacher, on calculating in base 2.  While he planned to use vacuum tubes for his calculating circuits, however, he rejected them for storage due to cost.  Instead, he developed a system in which paper capacitors would be attached to a drum rotated by a bicycle chain.  By keeping the drum rotating so that the capacitors swept past electrically charged brushes once per second, Atanasoff believed he could keep the capacitors charged and therefore create a low-cost form of electronic storage.  Input and output would be accomplished through punched cards or paper tape.  Unlike most of the other computer pioneers profiled so far, Atanasoff was only interested in solving a specific set of equations and therefore hardwired the instructions into the machine, meaning it would not be programmable.
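Atanasoff’s rotating-capacitor scheme anticipates the refresh cycle used in modern dynamic RAM: a capacitor’s charge leaks away on its own, so the machine must periodically read each bit and re-drive it before it becomes unreadable.  A toy simulation of the principle (the decay rate and threshold here are invented for illustration, not historical values):

```python
# Toy model of Atanasoff's regenerative capacitor memory.
# Each bit is a capacitor charge that leaks over time; sweeping it past
# the charging brushes once per drum rotation restores it to full strength.

DECAY = 0.7        # fraction of charge surviving one rotation (illustrative)
THRESHOLD = 0.5    # minimum charge still readable as a "1" (illustrative)

def rotate(drum, refresh=True):
    """One drum rotation: charges leak, then the brushes re-drive each bit."""
    for i, charge in enumerate(drum):
        charge *= DECAY                     # leakage during the rotation
        if refresh:
            # Brush reads the still-readable bit and restores full charge
            charge = 1.0 if charge >= THRESHOLD else 0.0
        drum[i] = charge
    return drum

bits = [1.0, 0.0, 1.0, 1.0]                 # stored word: 1011
for _ in range(100):                        # 100 rotations with refresh
    rotate(bits)
print([int(c >= THRESHOLD) for c in bits])  # -> [1, 0, 1, 1]: data survives
```

Without the refresh step, every “1” decays below the readable threshold after just a few rotations, which is exactly why the drum had to keep turning.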

By May 1939, Atanasoff was ready to put his ideas into practice, but he lacked electrical engineering skills himself and therefore needed an assistant to actually build his computer.  After securing a $650 grant from the Iowa State College Research Council, Atanasoff hired Clifford Berry, a graduate student recommended by one of his colleagues.  A genius who graduated high school at sixteen, Berry had been an avid ham radio operator in his youth and worked his way through college at Iowa State as a technician for a local company called Gulliver Electric.  He graduated in 1939 at the top of his engineering school class.  The duo completed a small-scale prototype of Atanasoff’s concept in late 1939 and then secured $5,330 from a private foundation to begin construction of what they named the Atanasoff-Berry Computer (ABC), the first electronic computer to employ separate memory and computing elements and a binary system for processing instructions and storing data, predating Colossus by a few years.  By 1942, the ABC was nearly complete, but it remained unreliable and was ultimately abandoned when Atanasoff left Iowa State for a wartime posting with the Naval Ordnance Laboratory.  With no other champion at the university, the ABC was cannibalized for parts for more important wartime projects, after which the remains were placed in a boiler room and forgotten.  Until a patent lawsuit brought renewed attention to the computer in the 1960s, few were aware the ABC had ever existed, but in June 1941 Mauchly visited Atanasoff and spent five days learning everything he could about the machine.  While there is still some dispute regarding how influential the ABC was on Mauchly’s own work, there is little doubt that at the very least the computer helped guide his thoughts on the potential of electronics for computing.

Upon completing the ESMWT at the Moore School, Mauchly was offered a position on the school’s faculty, where he soon teamed with a young graduate student he had met during the course to realize his computer ambitions.  John Presper Eckert was the only son of a wealthy real estate developer from Philadelphia and an electrical engineering genius who won a city-wide science fair at twelve years old by building a guidance system for model boats and made money in high school by building and selling radios, amplifiers, and sound systems.  Like Tommy Flowers in England, Eckert was a firm believer in the use of vacuum tubes in computing projects and worked with Mauchly to upgrade the differential analyzer by replacing some of its components with electronic amplifiers.  Meanwhile, Mauchly’s wife was running a training program for the human computers the university was employing to work on ballistics tables for the BRL.  Even with the differential analyzer working non-stop and over two hundred human computers doing calculations by hand, a complete table of roughly 3,000 trajectories took the BRL thirty days to complete.  Mauchly was uniquely positioned in the organization to understand both the demands being placed on the Moore School’s computers and the technology that could greatly increase the efficiency of their work.  He therefore drafted a memorandum in August 1942 entitled “The Use of High Speed Vacuum Tube Devices for Calculating” in an attempt to interest the BRL in greatly speeding up artillery table creation through the use of an electronic computer.

Mauchly submitted his memorandum to both the Moore School and the Army Ordnance Department and was ignored by both, most likely due to the continued skepticism over the use of vacuum tubes in large-scale computing projects.  The paper did catch the attention of one important person, however: Lieutenant Herman Goldstine, a mathematics professor from the University of Chicago then serving as the liaison between the BRL and the Moore School human computer training program.  While not one of the initial recipients of the memo, Goldstine became friendly with Mauchly in late 1942 and learned of the professor’s ideas.  Aware of the acute manpower crisis faced by the BRL for creating its ballistic tables, Goldstine urged Mauchly to resubmit his memo and promised he would use all his influence to aid its acceptance.  Therefore, in April 1943, Mauchly submitted a formal proposal for an electronic calculating machine that was quickly approved and given the codename “Project PX.”


John Mauchly (right) and J. Presper Eckert, the men behind ENIAC

Eckert and Mauchly began building the Electronic Numerical Integrator and Computer (ENIAC) in autumn 1943 with a team of roughly a dozen engineers.  Mauchly remained the visionary of the project and was largely responsible for defining its capabilities, while the brilliant engineer Eckert turned that vision into reality.  ENIAC was a unique construction that had more in common with tabulating machines than later electronic computers, as the team decided to store numbers in decimal rather than binary and stored and modified numbers in twenty accumulators, therefore failing to separate the memory and computing elements.  The machine was programmable, though like Colossus this could only be accomplished through rewiring, as the delay of waiting for instructions to be read from a tape reader was unacceptable in a machine operating at electronic speed.  The computer was powerful for its time, driven by 18,000 vacuum tubes, 70,000 resistors, 10,000 capacitors, 6,000 switches, and 1,500 relays, and could output a complete artillery table in just fifteen minutes.  The entire computer took up 1,800 square feet of floor space, consumed 150 kilowatts of power, and generated an enormous amount of heat.  Costing roughly $500,000, ENIAC was completed in November 1945 and successfully ran its first program the following month.

Unlike the previously discussed Z3, Colossus, and ABC computers, the ENIAC was announced to the general public with much fanfare in February 1946, was examined by many other scientists and engineers, and became the subject of a series of lectures held at the Moore School over eight weeks in the summer of 1946 in which other aspiring computer engineers could learn about the machine in detail.  While it was completed too late to have much impact on the war effort and exerted virtually no influence on future computers from a design perspective, the ENIAC stands as the most important of the early computers because it proved to the world at large that vacuum tube electronic computers were possible and served as the impetus for later computer projects.  Indeed, even before the ENIAC had been completed, Eckert and Mauchly were moving on to their next computer concept, which would finally introduce the last important piece of the computer puzzle: the stored program.

The First Stored Program Computers


The Manchester Small-Scale Experimental Machine (SSEM), the first stored-program computer to successfully run a program

As previously discussed, electronic computers like the Colossus and ENIAC were limited in their general utility because they could only be configured to run a different program by physically rewiring the machine, as there were no input devices capable of operating at electronic speeds.  This bottleneck could be eliminated, however, if the programs themselves were stored in memory alongside the numbers they were manipulating.  In theory, the binary numeral system made this feasible, since the instructions could be represented through symbolic logic as a series of “yes or no,” “on or off,” “1 or 0” propositions, but in practice the amount of storage needed would overwhelm the technology of the day.  The mighty ENIAC with its 18,000 vacuum tubes could only store 200 digits in memory.  This was fine if all you needed to store were a few five- or ten-digit numbers at a time, but instruction sets would require thousands of characters.  By the end of World War II, the early computer pioneers of both Great Britain and the United States had begun tackling this problem independently.
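To make the stored-program idea concrete, here is a toy machine in modern Python whose single memory array holds both the program and its data.  The opcodes, word encoding, and layout are invented for illustration and are far simpler than any real design of the era; the point is only that “reprogramming” means writing new values into memory, not rewiring anything:

```python
# A toy stored-program machine: one memory array holds both the program
# and its data. Each instruction is encoded as a single integer with the
# opcode in the high digits and a memory address in the low two digits.
# This 4-instruction set is purely illustrative.

LOAD, ADD, STORE, HALT = 1, 2, 3, 0

def run(memory):
    acc, pc = 0, 0                       # accumulator and program counter
    while True:
        opcode, addr = divmod(memory[pc], 100)
        pc += 1
        if opcode == LOAD:
            acc = memory[addr]           # fetch a number from memory
        elif opcode == ADD:
            acc += memory[addr]
        elif opcode == STORE:
            memory[addr] = acc           # write the result back to memory
        elif opcode == HALT:
            return memory

# Program: memory[6] = memory[4] + memory[5].  Cells 0-3 hold the code,
# cells 4-6 hold the data -- all in the same memory.
memory = [LOAD * 100 + 4, ADD * 100 + 5, STORE * 100 + 6, HALT * 100,
          7, 35, 0]
print(run(memory)[6])                    # -> 42
```

Because instructions are just numbers in memory, a program could in principle even modify itself, a trick early stored-program machines actually relied on for address arithmetic.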

The brilliant British mathematician Alan Turing, who has already been mentioned several times in this blog for both his code breaking and early chess programming feats, first articulated the stored program concept.  In April 1936, Turing completed a paper entitled “On Computable Numbers, with an Application to the Entscheidungsproblem” as a response to a lecture by Max Newman he attended at Cambridge in 1935.  In a time when the central computing paradigm revolved around analog computers tailored to specific problems, Turing envisioned a device called the Universal Turing Machine consisting of a scanner reading an endless roll of paper tape. The tape would be divided into individual squares that could either be blank or contain a symbol.  By reading these symbols based on a simple set of hardwired instructions and following any coded instructions conveyed by the symbols themselves, the machine would be able to carry out any calculation possible by a human computer, output the results, and even incorporate those results into a new set of calculations.  This concept of a machine reacting to data in memory that could consist of both instructions and numbers to be manipulated encapsulates the basic operation of a stored program computer.
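Turing’s device is simple enough to simulate in a few lines.  The sketch below runs an illustrative machine of my own devising (not one of Turing’s): its rule table tells the scanner, for each state and symbol, what to write, which way to move, and which state to enter next; this particular machine just inverts a string of bits and halts:

```python
# A minimal Turing machine: a head reads one tape square at a time and,
# based on (state, symbol), writes a symbol, moves left or right, and
# changes state. The example machine flips 0s and 1s, then halts.

def run_turing(tape, rules, state="start"):
    tape = dict(enumerate(tape))        # sparse tape; blank squares are " "
    head = 0
    while state != "halt":
        symbol = tape.get(head, " ")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip()

# Rule table: (state, read symbol) -> (write symbol, move, next state)
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),   # ran off the input: stop
}
print(run_turing("10110", flip))        # -> 01001
```

Swapping in a different rule table changes what the machine computes, which is the kernel of Turing’s universality argument: one fixed mechanism, infinitely many behaviors.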

Turing was unable to act on his theoretical machine with the technology available to him at the time, but when he first saw the Colossus computer in operation at Bletchley Park, he realized that electronics would make such a device possible.  In 1945, Turing moved from Bletchley Park to the National Physical Laboratory (NPL), where late in the year he outlined the first relatively complete design for a stored-program computer.  Called the Automatic Computing Engine (ACE), the computer defined by Turing was ambitious for its time, leading others at the NPL to fear it could not actually be built.  The organization therefore commissioned a smaller test model instead, called the Pilot ACE.  Ultimately, Turing left the NPL in frustration over the slow progress of building the Pilot ACE, which was not completed until 1950 and was therefore preceded by several other stored-program computers.  As a result, Turing, despite being the first to articulate the stored program concept, exerted little influence over how it was actually implemented.

One of the first people to whom Turing gave a copy of his landmark 1936 paper was its principal inspiration, Max Newman.  Upon reading it, Newman became interested in building a Universal Turing Machine himself.  Indeed, he actually tried to interest Tommy Flowers in the paper while Flowers was building his Colossi for the Newmanry at Bletchley Park, but Flowers was an engineer, not a mathematician or logician, and by his own admission did not really understand Turing’s theories.  As early as 1944, however, Newman himself was expressing his enthusiasm about taking what had been learned about electronics during the war and establishing a project to build a Universal Turing Machine at the war’s conclusion.

In September 1945, Newman took the Fielden Chair of Mathematics at Manchester University and soon after applied for a grant from the Royal Society to establish the Computing Machine Laboratory at the university.  After the grant was approved in May 1946, Newman had portions of the dismantled Colossi shipped to Manchester for reference and began assembling a team to tackle a stored-program computer project.  Perhaps the most important members of the team were electrical engineers Freddie Williams and Tom Kilburn.  While working on radar during the war, the duo developed a storage method in which a cathode ray tube could “remember” a piece of information by firing an electron “dot” onto the surface of the tube, creating a persistent charge well.  By placing a metal plate against the surface of the tube, this data could be “read” in the form of a voltage pulse transferred to the plate whenever a charge well was created or eliminated by drawing or erasing a dot.  Originally developed to eliminate stationary background objects from a radar display, a Williams tube could also serve as computer memory, storing 1,024 bits.  As any particular dot on the tube could be read at any given time, the Williams tube was an early form of random access memory (RAM).

In June 1948, Williams and Kilburn completed the Manchester Small-Scale Experimental Machine (SSEM), which was specifically built to test the viability of the Williams tube as a computer memory device.  While this computer contained only 550 tubes and was therefore not practical for actual computing projects, the SSEM was the first device in the world with all the characteristics of a stored-program computer and proved the viability of Williams tube memory.  Building on this work, the team completed the Manchester Mark 1 computer in October 1949, which contained 4,050 tubes and improved memory reliability with custom-built CRTs from the industrial conglomerate General Electric Company (GEC).


John von Neumann stands next to the IAS Machine, which he developed based on his consulting work on the Electronic Discrete Variable Automatic Computer (EDVAC), the first stored-program computer in the United States

Meanwhile, at the Moore School, Eckert and Mauchly were already pondering a computer superior to the ENIAC by the middle of 1944.  The duo felt the most serious limitation of the computer was its paltry storage, and like Newman in England, they turned to radar technology for a solution.  Before joining the ENIAC project, Eckert had devised delay line memory, the first practical method of eliminating stationary objects from a radar display.  Basically, rather than displaying the result of a single pulse on the screen, the radar would compare two pulses, one of which was delayed by passing it through a column of mercury so that both pulses arrived at the same time, with the radar screen displaying only those objects that were in different locations between the two pulses.  Eckert realized that using additional electronic components to keep the delayed pulse trapped in the mercury would allow it to function as a form of computer memory.
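In modern terms, a delay line behaves like a circulating queue: pulses emerge from one end of the mercury column, are re-amplified, and are fed back into the other end, so reaching any particular bit means waiting for it to come around.  Access is serial rather than random, unlike the Williams tube.  A rough sketch of the principle, with invented sizes and a bit-level interface:

```python
from collections import deque

# Toy mercury delay line: bits circulate through a fixed-length queue.
# Reading a bit means cycling the line until it emerges at the output;
# writing means replacing the pulse as it is re-injected.

class DelayLine:
    def __init__(self, n_bits):
        self.line = deque([0] * n_bits)

    def tick(self):
        """One pulse interval: a bit emerges, is re-amplified, re-injected."""
        bit = self.line.popleft()
        self.line.append(bit)
        return bit

    def write(self, position, bit):
        """Cycle until `position` reaches the output, then replace it."""
        for _ in range(position):
            self.tick()
        self.line.popleft()              # discard the old pulse
        self.line.append(bit)            # inject the new one
        for _ in range(len(self.line) - position - 1):
            self.tick()                  # finish the revolution

    def read(self, position):
        for _ in range(position):
            self.tick()
        bit = self.tick()                # the bit recirculates as it is read
        for _ in range(len(self.line) - position - 1):
            self.tick()
        return bit

dl = DelayLine(8)
dl.write(3, 1)
print(dl.read(3))                        # -> 1
```

The `tick` loops model the real cost of this memory: average access time was half a full recirculation, which shaped how programmers of delay-line machines ordered their instructions.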

The effort to create a better computer received a boost when Herman Goldstine had a chance encounter with physicist John von Neumann at the Aberdeen railroad station.  A brilliant Hungarian emigre teaching at Princeton, von Neumann was consulting on several government war programs, including the Manhattan Project, but had not been aware of the ENIAC.  When Goldstine started discussing the computer on the station platform, von Neumann took an immediate interest and asked for access to the project.  Impressed by what he saw, von Neumann not only used his influence to help gain the BRL’s approval for Project PY to create the improved machine, he also held several meetings with Eckert and Mauchly in which he helped define the basic design of the computer.

The extent of von Neumann’s contribution to the Electronic Discrete Variable Automatic Computer (EDVAC) remains controversial.  Because the eminent scientist penned the first published general overview of the computer in May 1945, entitled “First Draft of a Report on the EDVAC,” the stored program concept articulated therein came to be called the “von Neumann architecture.”  In truth, the realization that the increased memory provided by mercury delay lines would allow both instructions and numbers to be stored in memory occurred during meetings between Eckert, Mauchly, and von Neumann, and his contributions were probably not definitive.  Von Neumann did, however, play a critical role in defining the five basic elements of the computer — the input, the output, the control unit, the arithmetic unit, and the memory — which remain the basic building blocks of the modern computer.  It is also through von Neumann, who was keenly interested in the human brain, that the term “memory” entered common use in a computing context.  Previously, everyone from Babbage forward had used the term “storage” instead.

The EDVAC project commenced in April 1946, but the departure of Eckert and Mauchly with most of their senior engineers soon after disrupted the project, so the computer was not completed until August 1949 and only became fully operational in 1951, after several problems with the initial design had been solved.  It contained 6,000 vacuum tubes, 12,000 diodes, and two sets of 64 mercury delay lines capable of storing eight words per line, for a total storage capacity of 1,024 words.  Like the ENIAC, EDVAC cost roughly $500,000 to build.


The Electronic Delay Storage Automatic Calculator (EDSAC)

Because of the disruptions caused by Eckert and Mauchly’s departures, the EDVAC was not actually the first completed stored program computer conforming to von Neumann’s report.  In May 1946, computing entrepreneur L.J. Comrie visited the Moore School to view the ENIAC and came away with a copy of the von Neumann EDVAC report.  Upon his return to England, he brought the report to physicist Maurice Wilkes, who had established a computing laboratory at Cambridge in 1937, but had made little progress in computing before World War II.  Wilkes devoured the report in an evening and then paid his own way to the United States so he could attend the Moore School lectures.   Although he arrived late and only managed to attend the final two weeks of the course, Wilkes was inspired to initiate his own stored-program computer project at Cambridge, the Electronic Delay Storage Automatic Calculator (EDSAC).  Unlike the competing computer projects at the NPL and Manchester University, Wilkes decided that completing a computer was more important than advancing computer technology and therefore decided to create a machine of only modest capability and to use delay line memory rather than the newer Williams tubes developed at Manchester.  While this resulted in a less powerful computer than some of its contemporaries, it did allow the EDSAC to become the first practical stored-program computer when it was completed in May 1949.

Meanwhile, after concluding his consulting work at the Moore School, John von Neumann established his own stored-program computer project in late 1945 at the Institute for Advanced Study (IAS) in Princeton, New Jersey.  Primarily designed by Julian Bigelow, the IAS Machine employed 3,000 vacuum tubes and could hold 1,024 40-bit words in its Williams tube memory.  Although not completed until June 1952, the functional plan of the computer was published in the late 1940s and widely disseminated.  As a result, the IAS Machine became the template for many of the scientific computers built in the 1950s, including the MANIAC, JOHNNIAC, MIDAC, and MIDSAC machines that hosted some of the earliest computer games.

With the Moore lectures about the ENIAC and the publication of the IAS specifications helping to spread interest in electronic computers across the developed world and the EDSAC computer demonstrating that crafting a reliable stored program computer was possible, the stage was now set for the computer to spread beyond a few research laboratories at prestigious universities and become a viable commercial product.

Historical Interlude: The Birth of the Computer Part 1, the Mechanical Age

Before continuing the history of video gaming with the activities of the Tech Model Railroad Club and the creation of the first truly landmark computer game, Spacewar!, it is time to pause and present the first of what I referred to in my introductory post as “historical interludes.”  In order to understand why the video game finally began to spread in the 1960s, it is important to understand the evolution of computer technology and the spread of computing resources.  As we shall see, the giant mainframes of the 1940s and 1950s were neither particularly interactive nor particularly accessible outside of a small elite, which generally prevented the creation of programs that provided feedback quickly and seamlessly enough to create an engaging play experience while also generally discouraging projects not intended to aid serious research or corporate data processing.  By the time work on Spacewar! began in 1961, however, it was possible to occasionally divert computers away from more scholarly pursuits and design a program interesting enough to hold the attention of players for hours at a time.  The next four posts will describe how computing technology reached that point.

Note: Unlike my regular posts, historical interlude posts will focus more on summarizing events and less on critiquing sources or stating exactly where every last fact came from.  They are meant to provide context for developments in video game history, and the information within them will usually be drawn from a small number of secondary sources and not be researched as thoroughly as the video game history posts.  Much of the material in this post is drawn from Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray, The Maverick and His Machine: Thomas Watson, Sr. and the Making of IBM by Kevin Maney, and The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution by Walter Isaacson.

Defining the Computer


Human computers working at the NACA High Speed Flight Station in 1949

Before electronics, before calculating machines, even before the Industrial Revolution there were computers, but the term did not mean the same thing it does today.  Before World War II and the emergence of the first electronic digital computers, a computer was a person who performed calculations, generally for a specialized purpose.  As we shall see, most of the early computers were created specifically to perform calculations, so as they grew to function with less need for human intervention, they naturally came to be called “computers” themselves after the profession they quickly replaced.

The computer profession originated after the development of the first mathematical tables in the 16th and 17th centuries, such as the logarithmic tables that reduced complex operations like multiplication and division to simple addition and subtraction and the trigonometric tables that simplified the calculation of angles for fields like surveying and astronomy.  Computers were the people who performed the calculations necessary to produce these tables.  The first permanent table-making project was established in 1766 by Nevil Maskelyne to produce navigational tables that were updated and published annually in the Nautical Almanac, which is still issued today.
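Logarithmic tables work because log(ab) = log a + log b: a human computer could multiply two unwieldy numbers by looking up their logarithms, adding the results, and then looking up the antilogarithm of the sum.  The same procedure in modern form, with Python’s math library standing in for a printed table:

```python
import math

# Multiplication by logarithms, as a table user would perform it:
# look up log a and log b, add them, then take the antilog of the sum.
a, b = 347, 29
log_sum = math.log10(a) + math.log10(b)   # two table lookups and an addition
product = 10 ** log_sum                   # one antilog lookup
print(round(product))                     # -> 10063, i.e. 347 * 29
```

A printed table carried only a few significant figures, so results were approximate; the appeal was that addition by hand is vastly faster and less error-prone than long multiplication.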

Maskelyne relied on freelance computers to perform his calculations, but with the dawning of the Industrial Revolution, a French mathematician named Gaspard de Prony established what was essentially a computing factory in 1791, modeled on the division of labor principles espoused by Adam Smith in The Wealth of Nations.  Its purpose was to compile accurate logarithmic and trigonometric tables for a new survey of the entirety of France, part of a project to reform the property tax system.  De Prony relied on a small number of skilled mathematicians to define the mathematical formulas and a group of middle managers to organize the tables, so his computers needed only a knowledge of basic addition and subtraction to do their work, reducing the computer to an unskilled laborer.  As the Industrial Revolution progressed, unskilled workers in most fields moved from simple tools to mechanical factory machinery, so it comes as no surprise that one enterprising individual would attempt to bring a mechanical tool to computing as well.

Charles Babbage and the Analytical Engine


Charles Babbage, creator of the first computer design

Charles Babbage was born in 1791 in London.  The son of a banker, Babbage was a generally indifferent student who bounced between several academies and private tutors, but did gain a love of mathematics at an early age and attained sufficient marks to enter Trinity College, Cambridge, in 1810.  While Cambridge was the leading mathematics institution in England, the country as a whole had fallen behind the Continent in sophistication, and Babbage soon came to realize he knew more about math than his instructors.  In an attempt to rectify this situation, Babbage and a group of friends established the Analytical Society to reform the study of mathematics at the university.

After leaving Cambridge in 1814 with a degree in mathematics from Peterhouse, Babbage settled in London, where he quickly gained a reputation as an eminent mathematical philosopher but had difficulty finding steady employment.  He also made several trips to France beginning in 1819, which is where he learned of de Prony’s computer factory.  In 1820, he joined with John Herschel to establish the Astronomical Society and took work supervising the creation of star tables.  Frustrated by the tedious nature of fact-checking the calculations of the computers and preparing the tables for printing, Babbage decided to create a machine that would automate the task.

The Difference Engine would consist of columns of wheels and gears, each column representing a single decimal place.  Once the initial values were set for each column — determined by evaluating a polynomial equation for the first column and then using a series of differences to establish the values of the others — the machine would use the method of finite differences (hence its name) to perform addition and subtraction automatically, complete the tables, and then send them to a printing device.  Babbage presented his proposed machine to the Royal Society in 1822 and won government funding the next year by arguing that a maritime industrial nation required the most accurate navigational tables possible and that the Difference Engine would be both cheaper to operate and more accurate than an army of human computers.
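Babbage’s difference method is easy to demonstrate: the nth difference of a degree-n polynomial is constant, so once a few initial values are seeded, the entire table can be generated using nothing but addition — exactly the operation his wheels and gears could perform.  A sketch using an example polynomial (Euler’s famous prime-generating one, chosen for illustration):

```python
# Tabulating f(x) = x^2 + x + 41 by the method of finite differences.
# For a degree-2 polynomial the second difference is constant, so after
# seeding three initial values the whole table needs only addition.

f = lambda x: x * x + x + 41

value = f(0)                            # first table entry: 41
diff1 = f(1) - f(0)                     # first difference: 2
diff2 = (f(2) - f(1)) - (f(1) - f(0))   # constant second difference: 2

table = []
for x in range(10):
    table.append(value)
    value += diff1                      # next function value: pure addition
    diff1 += diff2                      # next first difference: pure addition

print(table)    # -> [41, 43, 47, 53, 61, 71, 83, 97, 113, 131]
```

Note that after the three seed values, no multiplication ever occurs — which is precisely what made the method mechanizable with 1820s gear technology.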

The initial grant of £1,500 quickly proved insufficient for the task of creating the machine, which sat at the very cutting edge of machine tool technology and was therefore extremely difficult to fashion components for.  The government nevertheless continued to fund the project for over a decade, ultimately providing £17,000.  By 1833, Babbage was able to construct a miniature version of the Difference Engine that lacked sufficient capacity to actually create tables but did prove the feasibility of the project.  The next year, however, he unwittingly sabotaged himself by proposing an even grander device to the government, the Analytical Engine, undermining the government’s faith in his ability to complete the original project and causing it to withdraw funding and support.  A fully working Difference Engine to Babbage’s specification would not be built until the late 1980s, by which time it was a historical curiosity rather than a useful machine.  In the meantime, Babbage turned his attention to the Analytical Engine, the first theorized device with the capabilities of a modern computer.


A portion of Charles Babbage’s Analytical Engine, which remained unfinished at his death

The Difference Engine was merely a calculating machine that performed addition and subtraction, but the proposed Analytical Engine was a different beast.  Equipped with an arithmetical unit called the “mill” that exhibited many of the features of a modern central processing unit (CPU), the machine would be capable of performing all four basic arithmetic operations.  It would also possess a memory able to store 1,000 numbers of up to 40 digits each.  Most importantly, it would be program controlled, able to perform a wide variety of tasks based on instructions fed into the machine.  These programs would be entered using punched cards, a recording medium first developed in the 1720s by Basile Bouchon and Jean-Baptiste Falcon to automate textile looms and greatly improved and popularized by Joseph Marie Jacquard in 1801 for the loom that bears his name.  Results could be output to a printer or a curve plotter.  By employing separate memory and computing elements and establishing a method of program control, Babbage outlined the first machine to include all the basic hallmarks of the modern computer.

Babbage sketched out the design of his Analytical Engine between 1834 and 1846.  He then halted work on the project for a decade before returning to the concept in 1856 and continuing to tinker with it right up until his death in 1871.  Unlike with the Difference Engine, however, he was never successful in securing funding from a British government that remained unconvinced of the device’s utility — as well as unimpressed by Babbage’s inability to complete the first project it had commissioned from him — and thus failed to build a complete working unit.  His project did attract attention in certain circles, however.  Luigi Menabrea, a personal friend and mathematician who later became Prime Minister of Italy, invited Babbage to present his Analytical Engine at the University of Turin in 1840 and subsequently published an account of the lectures in French in 1842.  This account was translated into English over a nine-month period in 1842–43 by another friend of Babbage, Ada Lovelace, the daughter of the celebrated poet Lord Byron.

Ada Lovelace has been a controversial figure in computer history circles.  Born in 1815, she never knew her celebrated father, whom her mother fled shortly after Ada’s birth.  She possessed what appears to have been a decent mathematical mind, but suffered from mental instability and delusions of grandeur that caused her to perceive in herself greater abilities than she actually possessed.  She became a friend and student of noted mathematician Mary Somerville, who was also a friend of Babbage.  It was through this connection that she began attending Babbage’s regular Saturday evening salons in 1834 and came to know the man.  She tried unsuccessfully to convince him to tutor her, but they remained friends and he was happy to show off his machines to her.  Lovelace became a fervent champion of the Analytical Engine and attempted to convince Babbage to make her his partner and publicist for the machine.  It was in this context that she not only took on the translation of the Turin lecture in 1842, but at Babbage’s suggestion also decided to append her own description of how the Analytical Engine differed from the earlier Difference Engine, alongside some sample calculations using the machine.

In a section entitled “Notes by the Translator,” which ended up being longer than the translation itself, Lovelace articulated several important general principles of computing, including the recognition that a computer could be programmed and reprogrammed to take on a variety of different tasks and that it could be set to tasks beyond basic math through the use of symbolic logic.  She also outlined a basic structure for programming on the Analytical Engine, becoming the first person to articulate common program elements such as recursive loops and subroutines.  Finally, she included a sample program to calculate a set of Bernoulli numbers using the Analytical Engine.  This last feat has led some people to label Lovelace the first computer programmer, though in truth it appears Babbage created most of this program himself.  Conversely, some people dismiss her contributions entirely, arguing that she was being fed all of her ideas directly by Babbage and had little personal understanding of how his machine worked.  The truth is probably somewhere in the middle.  While calling her the first programmer is probably too much of a stretch, as Babbage had already devised several potential programs himself by that point and contributed significantly to Lovelace’s as well, she still deserves recognition for being the first person to articulate several important elements of computer program structure.  Sadly, she had no chance to make any further mark on computer history, succumbing to uterine cancer in 1852 at the age of thirty-six.
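The program in Note G computed a table of Bernoulli numbers on the Analytical Engine.  A modern sketch of the same calculation might look like the following; it uses a standard recurrence for the Bernoulli numbers rather than Lovelace’s exact sequence of engine operations:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0..B_n as exact fractions, using the recurrence
    sum_{j=0..m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

# Lovelace's table ran through the number she labelled B7,
# which is B_8 in the modern indexing used here
print(bernoulli(8)[8])   # -1/30
```

The odd-indexed Bernoulli numbers beyond B_1 come out as zero, which is why Lovelace’s own numbering counts only the nonzero ones.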

Towards the Modern Office


An office in 1907 equipped with a Burroughs Model 6, showcasing some of the mechanical equipment revolutionizing clerical work in the period.

Ultimately, the Analytical Engine proved too ambitious, and the ideas articulated by Babbage would have to wait for the dawn of the electronics era to become practical.  In the meantime, however, the Industrial Revolution resulted in great advances in office automation that would birth some of the most important companies of the early computer age.  Unlike the human computer industry and the innovative ideas of Babbage, however, the majority of these advances came not from Europe, but from the United States.

Several explanations have been advanced for why the US became the leader in office automation.  Certainly, the country industrialized later than the European powers, meaning its businessmen were not burdened with the outmoded theories and traditions that hindered innovation in the Old World.  Furthermore, the country had a long history of interest in manufacturing efficiency, dating back as far as Eli Whitney and his concept of using interchangeable parts in firearms in 1801 (Whitney’s role in the creation of interchangeable parts is usually exaggerated, as he was not the first person to propose the method and was never actually able to implement it himself, but he was responsible for introducing the concept to the US Congress and therefore still deserves some credit for its subsequent adoption in the United States).  By the 1880s, this fascination with efficiency had evolved into the “scientific management” principles of Frederick Taylor, which aimed to identify best practices through rational, empirical study and to employ standardization and training to eliminate waste and inefficiency on the production line.  Before long, these ideals had penetrated the domain of the white-collar worker through the concept of “office rationalization,” in which managers introduced new technologies and systems to maximize productivity in that setting as well.

The first major advance in the drive for office automation was the invention of a practical typewriter.  While several inventors created typing machines in the early nineteenth century, none of these designs gained any traction in the marketplace because using them was slower than writing out a document by hand.  In 1867, however, a retired newspaper editor named Christopher Latham Sholes was inspired by an article in Scientific American describing a mechanical typing device to create one of his own.  By the next year Sholes, with the help of amateur mechanic Carlos Glidden and printer Samuel Soule, had created a prototype for a typing machine using a keyboard and type-basket design that finally allowed typing at a decent speed.  After Soule left the project, Sholes sent typewritten notes to several financiers in an attempt to raise capital to refine the device and prepare for mass production.  A Pennsylvania businessman named James Densmore answered the call and provided the funding necessary to make important improvements, such as replacing the flat frame that held the paper with a rotating drum and changing the layout of the keyboard to the familiar QWERTY orientation — still used on computer keyboards to this day — which cut down on jamming by spacing out commonly used letters in the type basket.

After several failed attempts to mass produce the typewriter through smaller companies in the early 1870s, Densmore was able to attract the interest of Philo Remington of the small-arms manufacturer E. Remington & Sons, which had been branching out into other fields such as sewing machines and fire engines in the aftermath of the U.S. Civil War.  First introduced by Remington in 1874, the typewriter sold slowly at first, but as office rationalization took hold in the 1880s, businesses started flocking to the machine.  By 1890 Remington had a virtual monopoly on the new industry and was producing 20,000 machines a year.  In addition to establishing the typewriter in the office, Remington also pioneered the idea of providing after-market service for office products, opening branch offices in major cities where people could not only buy typewriters, but also bring them in for repairs.

With typed loose-leaf pages replacing the traditional “letter book” for office correspondence, companies soon found it necessary to adopt new methods for storing and retrieving documents.  This led to the development of vertical filing using hanging folders stored in upright cabinets, which was first publicly demonstrated by Melville Dewey at the Chicago World’s Fair in 1893.  While vertical filing proved superior to the boxes and drawers previously employed in the workplace, it became woefully inefficient once companies went from tracking hundreds of records to tens of thousands.  This time the solution came from James Rand, Sr., a clerk from Tonawanda, New York, who patented a visible index system in which colored signal strips and tabs allowed specific file folders to be found quickly and easily.  Based on this invention, Rand established the Rand Ledger Company in 1898.  His son, James Rand, Jr., joined the business in 1908 and then split off from his father in 1915 after a dispute over advertising spending to market his own record retrieval system based around index cards, the Kardex System.  As the elder Rand neared retirement a decade later, his wife orchestrated a reconciliation between him and his son, and their companies merged to form the Rand Kardex Company in 1925.  Two years later, Rand Kardex merged with the Remington Typewriter Company to form Remington Rand, which became the largest business machine company in the world.


A Burroughs “adder-lister,” one of the first commercially successful mechanical calculators

A second important invention of the late nineteenth century was the first practical calculator.  Mechanical adding machines had existed since the seventeenth century, when Blaise Pascal completed his Pascaline in 1645 and Gottfried Leibniz invented the Stepped Reckoner, the first calculator capable of performing all four basic functions, in 1692, but the underlying technology remained fragile and unreliable and therefore unsuited to regular use despite continued refinements over the next century.  In 1820, the calculator was commercialized for the first time by Thomas de Colmar, but production of his Arithmometer lasted only until 1822.  After making several changes, Thomas began offering his machine to the public again in 1851, but while the Arithmometer gained a reputation for both sturdiness and accuracy, production never exceeded a few dozen a year over the next three decades as the calculator remained too slow and impractical for use in a business setting.

The main speed bottleneck of the early adding machines was that they all required the setting of dials and levers to use, making them far more cumbersome for bookkeepers than just doing the sums by hand.  The man who first solved this problem was Dorr Felt, a Chicago machinist who replaced the dials with keys similar to those found on a typewriter.  Felt’s Comptometer, completed in 1885, arranged keys labelled 1 to 9 across columns that each corresponded to a single digit of a number, allowing figures to be entered rapidly with just one hand.  In 1887, Felt formed the Felt & Tarrant Manufacturing Company with a local manufacturer named Robert Tarrant to mass produce the Comptometer, and by 1900 they were selling over a thousand a year.

While Felt remained important in the calculator business throughout the early twentieth century, he was ultimately eclipsed by another inventor.  William S. Burroughs, the son of a St. Louis mechanic, was employed as a clerk at a bank but suffered from health problems brought on by spending hours hunched over columns of figures.  Like Felt, he decided to create a mechanical adding machine using keys to improve this process, but he also added another key advance to his “adder-lister”: the ability to print the numbers as they were entered, so there would be a permanent record of every financial transaction.  In 1886, Burroughs established the American Arithmometer Company to market his adding machine, which was specifically targeted at banks and clearing houses and was selling at a rate of several hundred a year by 1895.  Burroughs died in 1898, but the company lived on and relocated to Detroit in 1904 after it outgrew its premises in St. Louis, changing its name to the Burroughs Adding Machine Company in honor of its founder.  At the time of the move, Burroughs was selling 4,500 machines a year.  Just four years later, that number had risen to 13,000.


John H. Patterson, founder of the National Cash Register Company (NCR)

The adding machine was one of two important money management devices invented in this period, with the other being the mechanical cash register.  This device was invented in 1879 by James Ritty, a Dayton saloon owner who feared his staff was stealing from him, and constructed by his brother, John.  Inspired by a tool that counted the revolutions of the propeller on a steamship, “Ritty’s Incorruptible Cashier” required the operator to enter each transaction using a keypad, displayed each total entered for all to see, and printed the results on a roll of paper, allowing the owner to compare the cash taken in to the recorded amounts.  Ritty attempted to interest other business owners in his machine, but proved unsuccessful and ultimately sold the business to Jacob Eckert of Cincinnati in 1881.  Eckert added a cash drawer to the machine and established the National Manufacturing Company, but he was barely more successful than the Rittys.  Therefore, in 1884 he sold out to John Patterson, who established the National Cash Register Company (NCR).

John Henry Patterson was born on a farm outside Dayton, Ohio, and entered the coal trade after graduating from Dartmouth College.  While serving as the general manager of the Southern Coal and Iron Company, Patterson was tasked with running the company store and became one of Ritty’s earliest cash register customers.  After being outmaneuvered in the coal trade, Patterson sold his business interests and used the proceeds to buy NCR.  A natural salesman, Patterson created or popularized nearly every important modern sales practice while running NCR.  He established sales territories and quotas for his salesmen, paid them a generous commission, and rewarded those who met their quotas with an annual sales convention.  He also instituted formal sales training and produced sales literature that included sample scripts, creating the first known canned sales pitch.  Like Remington, he established a network of dealerships that provided after-market services to build customer loyalty, but he also advertised through direct mailings, another unusual practice.  Understanding that NCR could only stay on top of the business by continuing to innovate, Patterson also established an “innovations department” in 1888, one of the earliest permanent corporate research & development organizations in the world.  In an era when factory work was mostly still done in crowded “sweatshops,” Patterson constructed a glass-walled factory that let in ample light, set amid beautifully landscaped grounds.

While Patterson seemed to genuinely care for the welfare of his workers, however, he also had a strong desire to control every aspect of their lives.  He manipulated subordinates constantly, hired and fired individuals for unfathomable reasons, instituted a strict physical fitness regimen that all employees were expected to follow, and established rules of conduct for everything from tipping waiters to buying neckties.  For all his faults, however, his innovative sales techniques created a juggernaut.  By 1900, the company was selling 25,000 cash registers a year, and by 1910 annual sales had risen to 100,000.  By 1928, six years after Patterson’s death, NCR was the second largest office-machine supplier in the world with annual sales of $50 million, just behind Remington Rand at $60 million and comfortably ahead of number three Burroughs at $32 million.  All three companies were well ahead of the number four company, a small firm called International Business Machines, or IBM.

Computing, Tabulating, and Recording

IBM, which eventually rose to dominance in the office machine and data processing industries, cannot be traced back to a single origin, for it began as a holding company that brought together several firms specializing in measuring and processing information.  There were three key people responsible for shaping the company in its early years: Herman Hollerith, Charles Flint, and Tom Watson, Sr.


Herman Hollerith, whose tabulating machine laid the groundwork for the company that became IBM

Born in Buffalo, New York, in 1860, Herman Hollerith pursued an education as a mining engineer, culminating in a Ph.D. from Columbia University in 1890.  One of Hollerith’s professors at Columbia also served as an adviser to the Bureau of the Census in Washington, introducing Hollerith to the largest data processing organization in the United States.  At the time, the Census Bureau was in crisis as traditional methods of processing census forms failed to keep pace with a growing population.  The 1880 census, processed entirely by hand using tally sheets, took the bureau seven years to complete.  With the population of the country continuing to expand rapidly, the 1890 census appeared poised to take even longer.  To attack this problem, the new superintendent of the census, Robert Porter, held a competition to find a faster and more efficient way to count the U.S. population.

Three finalists demonstrated solutions for Porter in 1889.  Two of them created systems using colored ink or cards to allow data to be sorted more efficiently, but these were still manual systems.  Hollerith, on the other hand, inspired by the ticket punches used by train conductors, developed a system in which the statistical information was recorded on punched cards that were quickly tallied by a tabulating machine of his own design.  Cards were placed in this machine one at a time and pressed with an apparatus containing 288 retractable pins.  Any pin that encountered a hole in the card would complete an electrical circuit and advance one of forty tallies.  Using Hollerith’s machines, the Census Bureau was able to complete its work in just two and a half years.
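The tallying scheme described above — a pin closing a circuit wherever it finds a hole, advancing a counter — can be sketched in a few lines.  The card layout and counter names here are purely illustrative, not Hollerith’s actual census coding:

```python
def tabulate_cards(cards, counters):
    """Tally punched cards the way Hollerith's tabulator did.

    Each card is modeled as a set of punched hole positions; each named
    counter is wired to one position, like a pin wired to a tally dial.
    """
    tallies = {name: 0 for name in counters}
    for card in cards:
        for name, position in counters.items():
            if position in card:      # pin passes through a hole:
                tallies[name] += 1    # circuit closes, counter advances
    return tallies

# Three hypothetical cards and two hypothetical counter wirings
cards = [{3, 17}, {3}, {17, 42}]
counters = {"male": 3, "farmer": 17}
print(tabulate_cards(cards, counters))   # {'male': 2, 'farmer': 2}
```

The power of the scheme is that a single pass of the cards updates every wired counter at once, so adding a new statistic meant rewiring, not re-reading the cards by hand.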

As the 1890 census began to wind down, Hollerith repurposed his tabulating system for use by businesses and incorporated the Tabulating Machine Company in December 1896.  He remained focused on the census, however, until President McKinley’s assassination in 1901 resulted in the appointment of a new superintendent, who chose to go with a different company for 1910.  In the meantime, Hollerith refined his system by implementing a three-machine setup consisting of a keypunch to put the holes in the cards, a tabulator to tally figures, and a sorting machine to place the cards in sequence.  By 1911, Hollerith had roughly one hundred customers and the business was continuing to expand, but his health was failing, leading him to entertain an offer to sell from an influential financier named Charles Flint.


Charles Ranlett Flint, the man who forged IBM

Charles Ranlett Flint, born into a family of shipbuilders, was a self-made man who started his first business at 18 on the docks of his hometown of Thomaston, Maine.  From there, he secured a job with a trader named William Grace by offering to work for free.  In 1872, Grace made Flint a partner in his new W.R. Grace & Co. shipping and trading firm, which still exists today as a chemical and construction materials conglomerate.  During this period, Flint acted as a commission agent in South America dealing in both arms and raw materials.  He also became keenly interested in new technologies such as the automobile, light bulb, and airplane.

In 1892, Flint leveraged his international trading contacts to pull together a number of rubber exporters into a trust called U.S. Rubber.  This began a period of intense monopoly building by Flint across a number of industries.  By 1901, Flint’s growing roster of trusts included the International Time Recording Company (ITR) of Endicott, New York, based around the recently invented time clock that allowed employers to easily track the hours worked by their employees, and the Computing Scale Company of America of Dayton, Ohio, based around scales that would both weigh items by the pound and compute their total cost.  While ITR proved modestly successful, the Computing Scale Company was an abject failure.  In an attempt to salvage his poorly performing concern, Flint decided to define a new, larger market of information recording machines for businesses and merge ITR and Computing Scale under the umbrella of a single holding company.  Feeling Hollerith’s company fit well into this scheme, Flint purchased it as well in 1911 and folded the three companies into the new Computing-Tabulating-Recording Company (C-T-R).  The holding company approach did not work, however, as C-T-R was an unwieldy organization consisting of three subsidiaries spread across five cities, whose managers ignored each other at best and actively plotted against each other at worst.  Furthermore, the company was saddled with a large debt, and its component parts could not leverage their positions in a trust to create superior integration or economies of scale because their products and customers were too different.  By 1914, C-T-R was worth only $3 million and carried a debt of $6.5 million.  Flint’s experiment had clearly failed, so he brought in a new general manager to turn the company around.  That man was Thomas Watson, Sr.


Thomas Watson, Sr., the man who built IBM into a corporate giant

By the time Flint hired him for C-T-R, Watson already had a reputation as a stellar salesman, but he was also tainted by a court case over monopolistic practices.  Born on a farm in south central New York State, Watson tried his hand as both a bookkeeper and a salesman with various outfits, but had trouble holding down steady employment.  After his latest venture, a butcher’s shop in Buffalo, failed in 1896, Watson trudged down to the local NCR office to transfer the installment payments on the store’s cash register to the new owner.  While there, he struck up a conversation with a salesman named John Range and pestered him periodically until Range finally offered him a job.  Within nine months, Watson went from sales apprentice to full sales agent as he finally seemed to find his calling.  Four years later, he was transferred to the struggling NCR branch in Rochester, New York, which he managed to turn around.  This brought him to the attention of John Patterson in Dayton, who tapped Watson for a special assignment.

By 1903, when Patterson summoned Watson, NCR was experiencing fierce competition from a growing second-hand cash register market.  NCR cash registers were both durable and long-lasting, so enterprising businessmen had begun buying up used cash registers from stores that were upgrading or going out of business and then undercutting NCR’s prices on new machines.  For the controlling monopolist Patterson, this was unacceptable.  His solution was to create his own used cash register business that would buy old machines for higher prices than other outlets and sell them cheaper, making up the lost profits through funding directly from NCR.  Once the competition had been driven out of business, prices could be raised and the business would start turning a profit.  Patterson chose Watson to run this business.  For legal reasons, he kept the connection between NCR and Watson’s new operation a secret.

Between 1903 and 1908, Watson slowly expanded his used cash register business across the country, creating an excellent new profit-center for NCR.  His reward was a posting back at headquarters in Dayton as an assistant sales manager, where he soon became Patterson’s protégé and absorbed his innovative sales techniques.  By 1910, Watson had been promoted to sales manager, where his personable and less-controlling management style created a welcome contrast to Patterson and encouraged flexibility and creativity among the 900-strong NCR sales force, helping to double the company’s 1909 sales within two years.

As quickly as Watson rose at NCR, however, he fell even faster.  In 1912 the Taft administration, amid a general crusade against corporate trusts, brought criminal charges against Patterson, Watson, and other high-ranking NCR executives for violations of the Sherman Anti-Trust Act.  At the end of a three-month trial, Watson was found guilty along with Patterson and all but one of their co-defendants on February 13, 1913, and now faced the prospect of jail time.  Worse, the ordeal appears to have soured the ever-changeable Patterson on the executives indicted with him, as they were all chased out of the company within a year.  Watson himself departed NCR in November 1913 after 17 years of service.  Some accounts state that Watson was fired, but it appears that the separation was more by mutual agreement.  Either way, it was a humbled and disgraced Watson that Charles Flint tapped to save C-T-R in early 1914.  Things began looking up the next year, however, when an appeal resulted in an order for a new trial.  All the defendants save Watson settled with the government, which decided pursuing Watson alone was not worth the effort.  Thus cleared of all wrongdoing, Watson was elevated to the presidency of C-T-R.

Watson saved and reinvented C-T-R through a combination of Patterson’s techniques and his own charisma and personality.  He reinvigorated the sales force through quotas, generous commissions, and conventions much like Patterson.  A lover of the finer things in life, he insisted that C-T-R staff always be impeccably dressed and polite, shaping the popular image of the blue-suited IBM sales person that would last for decades.  He changed the company culture by emphasizing the importance of every individual in the corporation and building a sense of company pride and loyalty.  Finally, he was fortunate to take over at a time when the outbreak of World War I and a booming U.S. economy led to increased demand for tabulating machines both from businesses and the U.S. government.  Between 1914 and 1917, revenues doubled from $4.2 million to $8.3 million, and by 1920 they had reached $14 million.

What really set IBM apart, however, was the R&D operation Watson established based on the model of NCR’s innovations department.  At the time Watson arrived, C-T-R remained the leading seller of tabulating machines, but the competition was rapidly gaining market share on the back of superior products.  Hollerith, who remained as a consultant to C-T-R after Flint bought his company, showed little interest in developing new products, causing the company’s technology to fall further and further behind.  The company’s only other senior technical employee, Eugene Ford, occasionally came up with improvements, but he could not actually put them into practice without the approval of Hollerith, which was rarely forthcoming.  Watson moved Ford into a New York loft and ordered him to begin hiring additional engineers to develop new products.

Ford’s first hire, Clair Lake, developed the company’s first printing tabulator in the early 1920s, which gave the company a machine that could rival the competition in both technology and user friendliness.  Another early hire, Fred Carroll from NCR, developed the Carroll Press, which allowed C-T-R to cheaply mass produce the punched cards used in the tabulating machines and therefore enjoy a huge profit margin on the product.  In the late 1920s, Lake created a new patentable punched-card design that would only work in IBM machines, which locked in customers, who were unlikely to switch to a competing company and redo millions of cards.  Perhaps the most important hire was James Bryce, who joined the company in 1917, rose to chief engineer in 1922, and ended up with over four hundred patents to his name.

After a hiccup in 1921–22, when the U.S. endured a brief recession, C-T-R, which Watson renamed International Business Machines (IBM) in 1924, experienced rapid growth for the rest of the decade, reaching $20 million in revenue by 1928.  While this placed IBM behind Remington Rand, NCR, and Burroughs, the talented R&D group and highly effective sales force built by Watson left the company perfectly poised to rise to a dominant position in the 1930s and subsequently conquer the new computer market of the 1950s.