
Historical Interlude: From the Mainframe to the Minicomputer Part 2, IBM and the Seven Dwarfs

The computer began life in the 1940s as a scientific device designed to perform complex calculations and solve difficult equations.  In the 1950s, the United States continued to fund scientific computing projects at government organizations, defense contractors, and universities, many of them based around the IAS architecture derived from the EDVAC and created by John von Neumann’s team at Princeton.  Some of the earliest for-profit computer companies emerged out of this scientific work, such as the previously discussed Engineering Research Associates; the Hawthorne, California-based Computer Research Corporation, which spun out of a Northrop Aircraft project to build a computer for the Air Force in 1952; and the Pasadena-based ElectroData Corporation, which spun out of the Consolidated Engineering Corporation that same year.  All of these companies remained fairly small and did not sell many computers.

Instead, it was Remington Rand that identified the future path of computing when it launched the UNIVAC I, which was adopted by businesses to perform data processing.  Once corporate America understood the computer to be a capable business machine and not just an expensive calculator, a wide array of office equipment and electronics companies entered the computer industry in the mid 1950s, often buying out the pioneering computer startups to gain a foothold.  Remington Rand dominated this market at first, but as discussed previously, IBM soon vaulted ahead as it acquired computer design and manufacturing expertise participating in the SAGE project and unleashed its world-class sales and service organizations.  Remington Rand attempted to compensate by merging with Sperry Gyroscope, which had both a strong relationship with the military and a more robust sales force, to form Sperry Rand in 1955, but the company never seriously challenged IBM again.

While IBM maintained its lead in the computer industry, however, by the beginning of the 1960s the company faced threats to its dominance at both the low end and the high end of the market from innovative machines based around new technologies like the transistor.  Fearing these new challengers could significantly damage IBM, Tom Watson Jr. decided to bet the company on an expensive and technically complex project to offer a complete line of compatible computers that could not only be tailored to a customer’s individual needs, but could also be easily modified or upgraded as those needs changed over time.  This gamble paid off handsomely, and by 1970 IBM controlled well over seventy percent of the market, with most of the remainder split among a group of competitors dubbed the “seven dwarfs” due to their minuscule individual market shares.  In the process, IBM succeeded in transforming the computer from a luxury item only operated by the largest firms into a necessary business appliance as computers became an integral part of society.

Note: Yet again we have a historical interlude post that summarizes key events outside of the video game industry that nevertheless had a significant impact upon it.  The information in this post is largely drawn from Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray, A History of Modern Computing by Paul Ceruzzi, Forbes Greatest Technology Stories: Inspiring Tales of the Entrepreneurs and Inventors Who Revolutionized Modern Business by Jeffrey Young, IBM’s Early Computers by Charles Bashe, Lyle Johnson, John Palmer, and Emerson Pugh, and Building IBM: Shaping an Industry and Its Technology by Emerson Pugh.

IBM Embraces the Transistor


The IBM 1401, the first mainframe to sell over 10,000 units

Throughout most of its history in computers, IBM has been known more for evolution than revolution.  Rarely first with a new concept, IBM excelled at building designs based around proven technology and then turning its sales force loose to overwhelm the competition.  Occasionally, however, IBM engineers have produced important breakthroughs in computer design.  Perhaps none of these were more significant than the company’s invention of the disk drive.

On the earliest computers, mass data storage was accomplished through two primary methods: magnetic tape or magnetic drums.  Tape could hold a large amount of data for the time, but it could only be read serially, and it was a fragile medium.  Drums were more durable and had the added benefit of being random access — that is, any point of data on the drum could be read at any time — but they were low capacity and expensive.  As early as the 1940s, J. Presper Eckert had explored using magnetic disks rather than drums, which would be cheaper and feature a greater storage capacity due to a larger surface area, but there were numerous technical hurdles that needed to be ironed out.  Foremost among these was the technology to read the disks.  A drum memory array used rigid read-write heads that could be readily secured, though at high cost.  A disk system required a more delicate stylus to read the disks, and the constant spinning of a disk created a high risk that the stylus would make contact with and damage its surface.

The team that finally solved these problems at IBM worked not at the primary R&D labs in Endicott or Poughkeepsie, but rather at a relatively new facility in San Jose, California, led by IBM veteran Reynold Johnson.  The San Jose lab had been established in 1952 as an advanced technologies research center free of the influence of the IBM sales department, which had often shut down projects with no immediate practical use.  One of the lab’s first projects was to improve storage for IBM’s existing tabulating equipment.  This task fell to a team led by Arthur Critchlow, who decided based on customer feedback to develop a new random access solution that would allow IBM’s tabulators and low-end computers to be useful not only for data processing, but also for more complicated jobs like inventory management.  After testing a wide variety of memory solutions, Critchlow’s team settled on the magnetic disk as the only viable option, partially inspired by a similar project at the National Bureau of Standards that had been described in an article published in August 1952.

To solve the stylus problem on the drive, Critchlow’s team attached a compressor to the unit that would pump a thin layer of air between the disk and the head.  Later models would take advantage of a phenomenon known as the “boundary layer,” in which the fast motion of the disks generated the air cushion on its own.  After experimenting with a variety of head types and positions throughout 1953 and 1954, the team was ready to complete a final design.  Announced in 1956 as the Model 305 Disk Storage Unit and later renamed RAMAC (for Random Access Method of Accounting and Control), IBM’s first disk drive consisted of fifty 24-inch diameter aluminum disks rotating at 1200 rpm with a storage capacity of five million characters.  Marketed as an add-on to the IBM 650, RAMAC revolutionized data processing by eliminating the time-consuming process of manually sorting information and provided the first compelling reason for small and mid-sized firms to embrace computers and eliminate electro-mechanical tabulating equipment entirely.


The IBM 7090, the company’s first transistorized computer

In August 1958, IBM introduced its latest scientific computer, the IBM 709, which improved on the functionality of the IBM 704.  The 709 continued to depend on vacuum tubes, however, even as competitors were starting to bring the first transistorized computers to market.  Tom Watson, Jr. and his director of engineering, Wally McDowell, had both been excited by the possibilities of transistors from the moment they first learned about them, and as early as 1950 they charged Ralph Palmer’s Poughkeepsie laboratory with working with the devices, but individual project managers continued to have the final authority in choosing what parts to use in their machines, and many of them continued to fall back on the more familiar vacuum tube.  In the end, Tom Watson, Jr. had to issue a company-wide mandate in October 1957 that transistors were to be incorporated into all new projects.  Even before that mandate, Palmer felt that IBM needed a massive project to push its solid-state designs forward, something akin to what Project SAGE had done for IBM’s efforts with vacuum tubes and core memory.  He therefore teamed with Steve Dunwell, who had spent part of 1953 and 1954 in Washington, D.C. assessing government computing requirements, to propose a high-speed computer tailored to the ever-increasing computational needs of the military-industrial complex.  A contract was eventually secured with the National Security Agency, and IBM approved “Project Stretch” in August 1955; the project was formally established in January 1956 with Dunwell in charge.

Project Stretch experienced a long, difficult, and not completely successful development cycle, but it did achieve Palmer’s goals of greatly improving IBM’s solid-state capabilities, with particularly important innovations including a much faster core memory and a “drift transistor” that was faster than the surface-barrier transistor used in early solid-state computing projects like the TX-0.  As work on Stretch dragged on, however, these advances were first introduced commercially through another product.  In response to Sputnik, the United States Air Force quickly initiated a new Ballistic Missile Early Warning System (BMEWS) project that, like SAGE, would rely on a series of linked computers.  The Air Force mandated, however, that these computers incorporate transistors, so Palmer offered to build a transistorized version of the 709 to meet the project’s needs.  The resulting IBM 7090 Data Processing System, deployed in November 1959 as IBM’s first transistorized computer, provided a six-fold increase in performance over the 709 at only one-third additional cost.  In 1962, an upgraded version dubbed the 7094 was released with a price of roughly $2 million.  Both computers were well-received, and IBM sold several hundred of them.

Despite the success of its mainframe computer business, IBM in 1960 still derived the majority of its sales from the traditional punched-card business.  While some larger organizations were drawn to the 702 and 705 business computers, their price kept them out of reach of the majority of IBM’s business customers.  Some of these organizations had embraced the low-cost 650 as a data processing solution, leading to over 800 installations of the computer by 1958, but it was actually more expensive and less reliable than IBM’s mainline 407 electric accounting machine.  The advent of the transistor, however, finally provided the opportunity for IBM to leave its tabulating business behind for good.

The impetus for a stored-program computer that could displace traditional tabulating machines initially came from Europe, where IBM did not sell its successful 407 due to import restrictions and high tooling costs.  In 1952, a French competitor, the Compagnie des Machines Bull, introduced a new calculating machine, the Bull Gamma 3, that used delay-line memory to provide greater storage capacity at a cheaper price than IBM’s electronic calculators and could be joined with a card reader to create a faster accounting machine than anything IBM offered in the European market.  Therefore, IBM’s French and German subsidiaries began lobbying for a new accounting machine to counter this threat.  This led to the launch of two projects in the mid-1950s: the modular accounting calculator (MAC) development project in Poughkeepsie, which birthed the 608 electronic calculator and the expensive and relatively unsuccessful 7070 transistorized computer, and the Worldwide Accounting Machine (WWAM) project, run out of France and Germany to create an improved traditional accounting machine for the European market.

While the WWAM project had been initiated in Europe, it was soon reassigned to Endicott when the European divisions proved unable to come up with an accounting machine that could meet IBM’s cost targets.  To solve this problem, Endicott engineer Francis Underwood proposed that a low-cost computer be developed instead.  Management approved this concept in early 1958 under the name SPACE — for Stored Program Accounting and Calculating Equipment — and formally announced the product in October 1959 as the IBM 1401 Data Processing System.  With a rental cost of only $2,500 a month (roughly equivalent to a purchase price of $150,000), the transistorized 1401 proved much faster and more reliable than an IBM 650 at a fraction of the cost and was only slightly more expensive than a mid-range 407 accounting machine setup.  More importantly, it shipped with a new chain printer that could output 600 lines per minute, far more than the 150 lines per minute produced by the 407, which relied on obsolete prewar technology.  When the 1401 first went on sale in 1960, IBM projected that it would sell roughly 1,000 units over the machine’s entire lifetime, but its combination of power and price proved irresistible, and by the end of 1961 over 2,000 machines had already been installed.  IBM would eventually deploy 12,000 1401 computers before the model was officially withdrawn in 1971.  Powered by the success of the 1401, IBM’s computer sales finally equaled the sales of punch card products in 1962 and then quickly eclipsed them.  No computer model had ever approached the success of the 1401 before, and as IBM rode the machine to complete dominance of the mainframe industry in the early 1960s, the powder-blue casing of the machine soon inspired a new nickname for the company: Big Blue.

The Dwarfs


The Honeywell 200, which competed with IBM’s 1401 and threatened to destroy its low-end business

In the wake of Remington Rand’s success with the UNIVAC I, more than a dozen old-line firms flocked to the new market.  Companies like Monroe Calculating, Bendix, Royal, Underwood, and Philco rushed to provide computers to the business community, but one by one they fell by the wayside.  Of these firms, Philco probably stood the best chance of being successful due to its invention of the surface barrier transistor, but while its Transac S-1000 — which began life in 1955 as an NSA project called SOLO to build a transistorized version of the UNIVAC 1103 — and S-2000 computers were both capable machines, the company ultimately decided it could not keep up with the fast pace of technological development and abandoned the market like all the rest.  By 1960, only five established companies and one computer startup joined Sperry Rand in attempting to compete with IBM in the mainframe space.  While none of these firms ever succeeded in stealing much market share from Big Blue, most of them found their own product niches and deployed some capable machines that ultimately forced IBM to rethink some of its core computer strategies.

Of the firms that challenged IBM, electronics giants GE and RCA were the largest, with revenues far exceeding those of the computer industry’s market leader, but in a way their size worked against them.  Since neither computers nor office equipment were among either firm’s core competencies, nor integral to either firm’s future success, they never fully committed to the business and therefore never experienced real success.  Unsurprisingly, they were the first of the seven dwarfs to call it quits, with GE selling off its computer business in 1970 and RCA following suit in 1971.  Burroughs and NCR, the companies that had long dominated the adding machine and cash register businesses respectively, both entered the market in 1956 after buying out a small startup firm — ElectroData and Computer Research Corporation respectively — and managed to remain relevant by creating computers specifically tailored to their preexisting core customers: the banking sector for Burroughs and the retail sector for NCR.  Sperry Rand ended up serving niche markets as well after failing to compete effectively with IBM, experiencing success in fields such as airline reservation systems.  The biggest threat to IBM’s dominance in this period came from two Minnesota companies: Honeywell and Control Data Corporation (CDC).

Unlike the majority of the companies that persisted in the computer industry, Honeywell came not from the office machine business, but from the electronic control industry.  In 1883, a man named Albert Butz created a device called the “damper flapper” that would sense when a house was becoming cold and cause the flapper on a coal furnace to rise, thus fanning the flames and warming the house.  Butz established a company that did business under a variety of names over the next few years to market his innovation, but he had no particular acumen for business.  In 1891, William Sweatt took over the company and increased sales through door-to-door selling and direct marketing.  In 1909 the company introduced the first controlled thermostat, sold as the “Minnesota Regulator,” and in 1912 Sweatt changed the name of the company to the Minneapolis Heat Regulator Company.  In 1927, a rival firm, Mark C. Honeywell’s Honeywell Heating Specialty Company of Wabash, Indiana, bought out Minneapolis Heat Regulator to form the Minneapolis-Honeywell Regulator Company with Honeywell as president and Sweatt as chairman.  The company continued to expand through acquisitions over the next decade and weathered the Great Depression relatively unscathed.

In 1941, Harold Sweatt, who had succeeded Honeywell as president in 1934, parlayed his company’s expertise in precision measuring devices into several lucrative contracts with the United States military, emerging from World War II as a major defense contractor.  Therefore, the company was approached by fellow defense contractor Raytheon to establish a joint computer subsidiary in 1954.  Incorporated as Datamatic Corporation the next year, the computer company became a wholly-owned subsidiary of Honeywell in 1957 when Raytheon followed so many other companies in exiting the computer industry.  Honeywell delivered its first mainframe, the Datamatic 1000, that same year, but the computer relied on vacuum tubes and was therefore already obsolete by the time it hit the market.  Honeywell temporarily withdrew from the business and went back to the drawing board.  After IBM debuted the 1401, Honeywell triumphantly returned to the business with the H200, which not only took advantage of the latest technology to outperform the 1401 at a comparable price, but also sported full compatibility with IBM’s wildly successful machine, meaning companies could transfer their existing 1401 programs without needing to make any adjustments.  Announced in 1963, the H200 threatened IBM’s control of the low-end of the mainframe market.


William Norris (l) and Seymour Cray, the principal architects of the Control Data Corporation

While Honeywell chipped away at IBM from the bottom of the market, computer startup Control Data Corporation (CDC) — the brainchild of William Norris — threatened to do the same from the top.  Born in Red Cloud, Nebraska, and raised on a farm, Norris became an electronics enthusiast at an early age, building mail-order radio kits and becoming a ham radio operator.  After graduating from the University of Nebraska in 1932 with a degree in electrical engineering, Norris was forced to work on the family farm for two years due to a lack of jobs during the Depression before joining Westinghouse in 1934 to work in the sales department of the company’s x-ray division.  Norris began doing work for the Navy’s Bureau of Ordnance as a civilian in 1940 and enjoyed the work so much that he joined the Naval Reserve and was called to duty at the end of 1941 at the rank of lieutenant commander.  Norris served as part of the CSAW codebreaking operation and became one of the principal advocates for and co-founders of Engineering Research Associates after the war.  By 1957, Norris was feeling stifled by the corporate environment at ERA parent company Sperry Rand, so he left to establish CDC in St. Paul, Minnesota.

Norris provided the business acumen at CDC, but the company’s technical genius was a fellow engineer named Seymour Cray.  Born in Chippewa Falls, Wisconsin, Cray entered the Navy directly after graduating from high school in 1943, serving first as a radio operator in Europe before being transferred to the Pacific theater to participate in code-breaking activities.  After the war, Cray attended the University of Minnesota, graduated with an electrical engineering degree in 1949, and went to work for ERA in 1951.  Cray immediately made his mark by leading the design of the UNIVAC 1103, one of the first commercially successful scientific computers, and soon gained a reputation as an engineering genius able to create simple, yet fast computer designs.  In 1957, Cray and several other engineers followed Norris to CDC.

Unlike some of the more conservative engineers at IBM, Cray understood the significance of the transistor immediately and worked quickly to incorporate it into his computer designs.  The result was CDC’s first computer, the 1604, which was first sold in 1960 and significantly outperformed IBM’s scientific computers.  Armed with Cray’s expertise in computer design, Norris decided to concentrate on building the fastest computers possible and selling them to the scientific and military-industrial communities, where IBM’s sales force exerted relatively little influence.  As IBM’s Project Stretch floundered — never meeting its performance targets after being released as the IBM 7030 in 1961 — Cray moved forward with his plans to build the fastest computer yet designed.  Released as the CDC 6600 in 1964, Cray’s machine could perform an astounding three million operations per second, three times as many as the 7030 and more than any other machine until 1969, when another CDC computer, the 7600, outpaced it.  Dubbed a supercomputer, the 6600 became the flagship of a series of high-speed scientific computers that IBM proved unable to match.  While Big Blue was ultimately forced to cede the top of the market to CDC, by the time the 6600 launched the company was in the final phases of developing a product line that would extend its dominance over the mainframe business and ensure that competitors like CDC and Honeywell would be limited to niche markets.

System/360


The System/360 family of computers, which extended IBM’s dominance of the mainframe market through the end of the 1960s.

When Tom Watson Jr. finally assumed full control of IBM from his father, he inherited a corporate structure designed to collect as much power and authority in the hands of the CEO as possible.  Unlike Watson Sr., Watson Jr. preferred decentralized management with a small circle of trusted subordinates granted the authority to oversee the day-to-day operation of IBM’s diverse business activities.  Therefore Watson overhauled the company in November 1956, paring down the number of executives reporting directly to him from seventeen to just five, each of whom oversaw multiple divisions with the new title of “group executive.”  He also formed a Corporate Management Committee consisting of himself and the five group executives to make and execute high-level decisions.  While the responsibilities of individual group executives would change from time to time, this new management structure remained intact for decades.

Foremost among Watson’s new group executives was a vice president named Vin Learson.  A native of Boston, Massachusetts, T. Vincent Learson graduated from Harvard with a degree in mathematics in 1935 and joined IBM as a salesman, where he quickly distinguished himself.  In 1949, Learson was named sales manager of IBM’s Electric Accounting Machine (EAM) Division, and he rose to general sales manager in 1953.  In April 1954, Tom Watson, Jr. named Learson the director of Electronic Data Processing Machines with a mandate to solidify IBM’s new electronic computer business.  After guiding early sales of the 702 computer and establishing an advanced technology group to incorporate core memory and other improvements into the 704 and 705 computers, Learson received another promotion to vice president of sales for the entire company before the end of the year.  During Watson’s 1956 reorganization, he named Learson group executive of the Military Products, Time Equipment, and Special Engineering Products divisions.

During the reorganization, IBM’s entire computer business fell under the new Data Processing Division overseen by group executive L.H. LaMotte.  As IBM’s computer business continued to grow and diversify in the late 1950s, however, it became too large and unwieldy to contain within a single division, so in 1959 Watson split the operation in two by creating the Data Systems Division in Poughkeepsie, responsible for large systems, and the General Products Division, which took charge of small systems like the 650 and 1401 and incorporated IBM’s other laboratories in Endicott, San Jose, Burlington, Vermont, and Rochester, Minnesota.  Watson then placed these two divisions, along with a new Advanced Systems Development Division, under Learson’s control, believing him to be the only executive capable of propelling IBM’s computer business forward.


Vin Learson, the IBM executive who spearheaded the development of the System/360

When Learson inherited the Data Systems and General Products Divisions, he was thrust into the middle of an all-out war for control of IBM’s computer business.  The Poughkeepsie Laboratory had been established specifically to exploit electronics after World War II and prided itself on being at the cutting edge of IBM’s technology.  The Endicott Laboratory, the oldest R&D division at the company, had often been looked down upon for clinging to older technology, yet by producing both the 650 and the 1401, Endicott was responsible for the majority of IBM’s success in the computer realm.  By 1960, both divisions were looking to update their product lines with more advanced machines.  That September, Endicott announced the 1410, an update to the 1401 that maintained backwards compatibility.  At the same time, Poughkeepsie was hard at work on a new series of four compatible machines designed to serve a variety of business and scientific customers under the 8000 series designation.  Learson, however, wanted to unify the product line from the very low end represented by the 1401 to the extreme high end represented by the 7030 and the forthcoming 8000 computers.  By achieving full compatibility in this manner, IBM could take advantage of economies of scale to drive down the price of individual computer components and software development while also standardizing peripheral devices and streamlining the sales and service organizations, which would no longer have to learn multiple systems.  While Learson’s plan was sound in theory, forcing two organizations that prided themselves on their independence and competed with each other fiercely to work together would not be easy.

Learson relied heavily on his power as a group executive to transfer employees across both divisions to achieve project unity.  First, he moved Bob Evans, who had been the engineering manager for the 1401 and 1410, from Endicott to Poughkeepsie as the group’s new systems development manager.  Already a big proponent of compatibility, Evans unsurprisingly recommended that the 8000 project be cancelled and a cohesive product line spanning both divisions be initiated in its place.  The lead designer of the 8000 series, Frederick Brooks, vigorously opposed this move, so Learson replaced Brooks’s boss with another ally, Jerrier Haddad, who had led the design of the 701 and recently served as the head of Advanced Systems Development.  Haddad sided with Evans and terminated the 8000 project in May 1961.  Strong resistance remained in some circles, however, most notably from General Products Division head John Haanstra, so in October 1961, Learson assembled a task group called SPREAD (Systems, Planning, Review, Engineering, and Development) consisting of thirteen senior engineering and marketing managers to determine a long-term strategy for IBM’s data processing line.

On December 28, the SPREAD group delivered its final proposal to the executive management committee.  In it, they outlined a series of five compatible processors representing a 200-fold range in performance.  Rather than incorporate the new integrated circuit, the group proposed a proprietary IBM design called Solid Logic Technology (SLT), in which the discrete components of the circuit were mounted on a single ceramic substrate, but were not fully integrated.  By combining the five processors with SLT circuits and core memories of varying speeds, nineteen computer configurations would be possible, all fully compatible and interchangeable and able to connect to 40 different peripheral devices.  Furthermore, after surveying the needs of business and scientific customers, the SPREAD group realized that other than floating-point capability for scientific calculations, the needs of the two groups were nearly identical, so they chose to unify the scientific and business lines rather than market different models for each.  Codenamed the New Product Line (NPL), the SPREAD proposal would allow IBM customers to buy a computer that met their current needs and then easily upgrade or swap components as those needs changed over time at a fraction of the cost of a new system, without having to rewrite all their software or replace their peripheral devices.  While not everyone was convinced by the presentation, Watson ultimately authorized the NPL project.

The NPL project was perhaps the largest civilian R&D operation ever undertaken to that point.  Development costs alone were $500 million, and when tooling, manufacturing, and other expenses were taken into account, the cost was far higher.  Design of the five processor models was spread over three facilities, with Poughkeepsie developing the three high-end systems, Endicott developing the lowest-end system, and a facility in Hursley, England, developing the other system.  At the time, IBM manufactured all its own components as well, so additional facilities were charged with churning out SLT circuits, core memories, and storage systems.  To assemble all the systems, IBM invested in six new factories.  In all, IBM spent nearly $5 billion to bring the NPL to market.

To facilitate the completion of the project, Watson elevated two executives to new high-level positions: Vin Learson assumed the new role of senior vice president of sales, and Watson’s younger brother, Arthur, who for years had run IBM’s international arm, the World Trade Corporation, was named senior vice president of research, development, and manufacturing.  This new role was intended to groom the younger Watson to assume the presidency of IBM one day, but the magnitude of the NPL project coupled with his inexperience in R&D and manufacturing ultimately overwhelmed him.  As the project fell further and further behind schedule, Learson ultimately had to replace Arthur Watson in order to see the project through to completion.  Therefore, it was Learson who assumed the presidency of IBM in 1966, while Arthur Watson assumed the new and largely honorary role of vice chairman.  His failure to shepherd the NPL project ended any hope Arthur Watson had of continuing the Watson family legacy of running IBM, and he ultimately left the company in 1970 to serve as the United States ambassador to France.

In late 1963, IBM began planning the announcement of its new product line, which now went by the name System/360 — a name chosen because it represented all the points of a compass and emphasized that the product line would fill the needs of all computer users.  Even at this late date, however, acceptance of System/360 within IBM was not assured.  John Haanstra continued to push for an SLT upgrade to the existing 1401 line to satisfy low-end users, which other managers feared would serve to perpetuate the incompatibility problem plaguing IBM’s existing product line.  Furthermore, IBM executives struggled over whether to announce all the models at once and thus risk a significant drop in orders for older systems during the transition period, or to phase in each model over the course of several years.  All debate ended when Honeywell announced the H200.  Faced with losing customers to more advanced computers fully compatible with IBM’s existing line, Watson decided in March 1964 to scrap the improved 1401 and launch the entire 360 product line at once.

On April 7, 1964, IBM held press conferences in sixty-three cities across fourteen countries to announce the System/360 to the world.  Demand soon far exceeded supply: within the first two years that System/360 was on the market, IBM was only able to fill roughly 4,500 of 9,000 orders.  Headcount at the company rose rapidly as IBM rushed to bring new factories online in response.  In 1965, when actual shipments of the System/360 were just beginning, IBM controlled 65 percent of the computer market and had revenues of $2.5 billion.  By 1967, as IBM ramped up to meet insatiable 360 demand, the company employed nearly a quarter of a million people and raked in $5 billion in revenues.  By 1970, IBM had an installed base of 35,000 computers and held an ironclad grip on the mainframe industry with a market share between seventy and eighty percent; the next year company earnings surpassed $1 billion for the first time.

As batch processing mainframes, the System/360 line and its competitors did not serve as computer game platforms or introduce technology that brought the world closer to a viable video game industry.  System/360 did, however, firmly establish the computer within corporate America and solidified IBM’s place as a computing superpower while facilitating the continuing spread of computing resources and the evolution of computer technology.  Ultimately, this process would culminate in a commercial video game industry in the early 1970s.

Historical Interlude: The Birth of the Computer Part 1, the Mechanical Age

Before continuing the history of video gaming with the activities of the Tech Model Railroad Club and the creation of the first truly landmark computer game, Spacewar!, it is time to pause and present the first of what I referred to in my introductory post as “historical interludes.”  In order to understand why the video game finally began to spread in the 1960s, it is important to understand the evolution of computer technology and the spread of computing resources.  As we shall see, the giant mainframes of the 1940s and 1950s were neither particularly interactive nor particularly accessible outside of a small elite, which generally prevented the creation of programs that provided feedback quickly and seamlessly enough to create an engaging play experience while also generally discouraging projects not intended to aid serious research or corporate data processing.  By the time work on Spacewar! began in 1961, however, it was possible to occasionally divert computers away from more scholarly pursuits and design a program interesting enough to hold the attention of players for hours at a time.  The next four posts will describe how computing technology reached that point.

Note: Unlike my regular posts, historical interlude posts will focus more on summarizing events and less on critiquing sources or stating exactly where every last fact came from.  They are meant to provide context for developments in video game history, and the information within them will usually be drawn from a small number of secondary sources and not be researched as thoroughly as the video game history posts.  Much of the material in this post is drawn from Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray, The Maverick and His Machine: Thomas Watson, Sr. and the Making of IBM by Kevin Maney, and The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution by Walter Isaacson.

Defining the Computer

Human computers working at the NACA High Speed Flight Station in 1949

Before electronics, before calculating machines, even before the Industrial Revolution there were computers, but the term did not mean the same thing it does today.  Before World War II and the emergence of the first electronic digital computers, a computer was a person who performed calculations, generally for a specialized purpose.  As we shall see, most of the early computing machines were created specifically to perform calculations, so as they grew to function with less need for human intervention, they naturally came to be called “computers” themselves after the profession they quickly replaced.

The computer profession originated with the development of the first mathematical tables in the 16th and 17th centuries, such as the logarithmic tables that reduced complex mathematical operations to addition and subtraction and the trigonometric tables that simplified the calculation of angles for fields like surveying and astronomy.  Computers were the people who performed the calculations necessary to produce these tables.  The first permanent table-making project was established in 1766 by Nevil Maskelyne to produce navigational tables that were updated and published annually in the Nautical Almanac, which is still issued today.

Maskelyne relied on freelance computers to perform his calculations, but with the dawning of the Industrial Revolution, a French mathematician named Gaspard de Prony established what was essentially a computing factory in 1791, modeled after the division of labor principles espoused by Adam Smith in The Wealth of Nations, to compile accurate logarithmic and trigonometric tables for a new survey of the entirety of France, itself part of a project to reform the property tax system.  De Prony relied on a small number of skilled mathematicians to define the mathematical formulas and a group of middle managers to organize the tables, so his computers needed only a knowledge of basic addition and subtraction to do their work, reducing the computer to an unskilled laborer.  As the Industrial Revolution progressed, unskilled workers in most fields moved from using simple tools to mechanical factory machinery to do their work, so it comes as no surprise that one enterprising individual would attempt to bring a mechanical tool to computing as well.

Charles Babbage and the Analytical Engine

Charles Babbage, creator of the first computer design

Charles Babbage was born in 1791 in London.  The son of a banker, Babbage was a generally indifferent student who bounced between several academies and private tutors, but did gain a love of mathematics at an early age and attained sufficient marks to enter Trinity College, Cambridge, in 1810.  While Cambridge was the leading mathematics institution in England, the country as a whole had fallen behind the Continent in sophistication, and Babbage soon came to realize he knew more about math than his instructors.  In an attempt to rectify this situation, Babbage and a group of friends established the Analytical Society to reform the study of mathematics at the university.

After leaving Cambridge in 1814 with a degree in mathematics from Peterhouse, Babbage settled in London, where he quickly gained a reputation as an eminent mathematical philosopher but had difficulty finding steady employment.  He also made several trips to France beginning in 1819, which is where he learned of De Prony’s computer factory.  In 1820, he joined with John Herschel to establish the Astronomical Society and took work supervising the creation of star tables.  Frustrated by the tedious nature of fact-checking the calculations of the computers and preparing the tables for printing, Babbage decided to create a machine that would automate the task.

The Difference Engine would consist of columns of wheels and gears, each of which represented a single decimal place.  Once the initial values were set for each column — which would be determined by setting a polynomial equation in column one and then using a series of finite differences to establish the values of the other columns — the machine would use the method of finite differences (hence its name) to perform addition and subtraction automatically, complete the tables, and then send them to a printing device.  Babbage presented his proposed machine to the Royal Society in 1822 and won government funding the next year by arguing that a maritime industrial nation required the most accurate navigational tables possible and that the Difference Engine would be both cheaper to operate and more accurate than an army of human computers.
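The scheme Babbage mechanized can be illustrated with a short sketch — in modern Python, obviously an anachronism: once the first table entry and its initial differences are set, every subsequent entry falls out of repeated addition alone, with no multiplication required.  The polynomial x² + x + 41 used here is the one often cited in accounts of Babbage’s demonstrations of his prototype.

```python
# A modern sketch of the method of finite differences the Difference
# Engine mechanized: seed the columns once, then produce every further
# table entry by addition alone.

def difference_table(coeffs, n):
    """Tabulate a polynomial at x = 0, 1, ..., n-1 using only additions.

    coeffs: polynomial coefficients, lowest degree first.
    """
    degree = len(coeffs) - 1

    def p(x):
        return sum(c * x**k for k, c in enumerate(coeffs))

    # Seed the columns: p(0) plus the initial finite differences,
    # computed once up front (the "setting" of the machine).
    seeds = [p(x) for x in range(degree + 1)]
    cols = []
    while len(seeds) > 1:
        cols.append(seeds[0])
        seeds = [b - a for a, b in zip(seeds, seeds[1:])]
    cols.append(seeds[0])  # the constant top difference

    # From here on, only addition: each column absorbs the next one up.
    values = []
    for _ in range(n):
        values.append(cols[0])
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]
    return values

print(difference_table([41, 1, 1], 5))  # x^2 + x + 41 -> [41, 43, 47, 53, 61]
```

The seeding step is the only place the polynomial itself is evaluated; the engine’s operators would have performed that small amount of skilled work by hand before cranking out thousands of entries mechanically.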

The initial grant of £1,500 quickly proved insufficient for the task of creating the machine, which was at the very cutting edge of machine tool technology and therefore extremely difficult to fashion components for.  The government nevertheless continued to fund the project for over a decade, ultimately providing £17,000.  By 1833, Babbage was able to construct a miniature version of the Difference Engine that lacked sufficient capacity to actually create tables but did prove the feasibility of the project.  The next year, however, he unwittingly sabotaged himself by proposing an even grander device to the government, the Analytical Engine, thus undermining the government’s faith in his ability to complete the original project and causing it to withdraw funding and support.  A fully working Difference Engine to Babbage’s specification would not be built until the late 1980s, by which time it was a historical curiosity rather than a useful machine.  In the meantime, Babbage turned his attention to the Analytical Engine, the first theorized device with the capabilities of a modern computer.

A portion of Charles Babbage’s Analytical Engine, which remained unfinished at his death

The Difference Engine was merely a calculating machine that performed addition and subtraction, but the proposed Analytical Engine was a different beast.  Equipped with an arithmetical unit called the “mill” that exhibited many of the features of a modern central processing unit (CPU), the machine would be capable of performing all four basic arithmetic operations.  It would also possess a memory, able to store 1,000 numbers of up to 40 digits each.  Most importantly, it would be program controlled, able to perform a wide variety of tasks based on instructions fed into the machine.  These programs would be entered using punched cards, a recording medium first developed in the 1720s by Basile Bouchon and Jean-Baptiste Falcon to automate textile looms and greatly improved and popularized by Joseph Marie Jacquard in 1801 for the loom that bears his name.  Results could be output to a printer or a curve plotter.  By employing separate memory and computing elements and establishing a method of program control, Babbage outlined the first machine to include all the basic hallmarks of the modern computer.

Babbage sketched out the design of his Analytical Engine between 1834 and 1846.  He then halted work on the project for a decade before returning to the concept in 1856 and continuing to tinker with it right up until his death in 1871.  Unlike with the Difference Engine, however, he was never successful in securing funding from a British government that remained unconvinced of the device’s utility — as well as unimpressed by Babbage’s inability to complete the first project it had commissioned from him — and thus failed to build a complete working unit.  His project did attract attention in certain circles, however.  Luigi Menabrea, a personal friend and mathematician who later became Prime Minister of Italy, attended a presentation Babbage gave on his Analytical Engine at the University of Turin in 1840 and subsequently published a transcription of the lecture in French in 1842.  This account was translated into English over a nine-month period in 1842-43 by another friend of Babbage, Ada Lovelace, the daughter of the celebrated poet Lord Byron.

Ada Lovelace has been a controversial figure in computer history circles.  Born in 1815, she never knew her celebrated father, whom her mother fled shortly after Ada’s birth.  She possessed what appears to have been a decent mathematical mind, but suffered from mental instability and delusions of grandeur that caused her to perceive in herself greater abilities than she actually possessed.  She became a friend and student of noted mathematician Mary Somerville, who was also a friend of Babbage.  It was through this connection that she began attending Babbage’s regular Saturday evening salons in 1834 and came to know the man.  She tried unsuccessfully to convince him to tutor her, but they remained friends and he was happy to show off his machines to her.  Lovelace became a fervent champion of the Analytical Engine and attempted to convince Babbage to make her his partner and publicist for the machine.  It was in this context that she not only took on the translation of the Turin lecture in 1842, but at Babbage’s suggestion also decided to append her own description of how the Analytical Engine differed from the earlier Difference Engine alongside some sample calculations using the machine.

In a section entitled “Notes by the Translator,” which ended up being longer than the translation itself, Lovelace articulated several important general principles of computing, including the recognition that a computer could be programmed and reprogrammed to take on a variety of different tasks and that it could be set to tasks beyond basic math through the use of symbolic logic.  She also outlined a basic structure for programming on the Analytical Engine, becoming the first person to articulate common program elements such as recursive loops and subroutines.  Finally, she included a sample program to calculate a set of Bernoulli numbers using the Analytical Engine.  This last feat has led some people to label Lovelace the first computer programmer, though in truth it appears Babbage created most of this program himself.  Conversely, some people dismiss her contributions entirely, arguing that she was being fed all of her ideas directly by Babbage and had little personal understanding of how his machine worked.  The truth is probably somewhere in the middle.  While calling her the first programmer is probably too much of a stretch, as Babbage had already devised several potential programs himself by that point and contributed significantly to Lovelace’s as well, she still deserves recognition for being the first person to articulate several important elements of computer program structure.  Sadly, she had no chance to make any further mark on computer history, succumbing to uterine cancer in 1852 at the age of thirty-six.
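For readers curious what such a calculation involves, here is a brief modern sketch of computing Bernoulli numbers via the standard recurrence.  Note that Lovelace’s Note G worked from a different formula and numbering convention, so this illustrates only the general flavor of the looping, iterative calculation she described, not her actual program.

```python
# Compute B_0 .. B_n using the standard recurrence
#   sum_{j=0}^{m} C(m+1, j) * B_j = 0   for m >= 1,  with B_0 = 1.
# Exact rational arithmetic keeps the results precise.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the list [B_0, B_1, ..., B_n] as Fractions."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        # Solve the recurrence for B_m given all earlier values.
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / Fraction(m + 1))
    return B

print(bernoulli(6))  # B_1 = -1/2, B_2 = 1/6, the odd ones past B_1 are zero
```

Each pass of the loop reuses every previously computed value — exactly the kind of repetitive, cumulative procedure that made the table-making problem so tedious for human computers and so well suited to a programmable machine.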

Towards the Modern Office

An office equipped with a Burroughs Model 6 adding machine in 1907, showcasing some of the mechanical equipment revolutionizing clerical work in the period.

Ultimately, the Analytical Engine proved too ambitious, and the ideas articulated by Babbage would have to wait for the dawn of the electronics era to become practical.  In the meantime, however, the Industrial Revolution resulted in great advances in office automation that would birth some of the most important companies of the early computer age.  Unlike the human computer industry and the innovative ideas of Babbage, however, the majority of these advances came not from Europe, but from the United States.

Several explanations have been advanced for why the US became the leader in office automation.  Certainly, the country industrialized later than the European powers, meaning businessmen were not burdened with outmoded theories and traditions that hindered innovations in the Old World.  Furthermore, the country had a long history of interest in manufacturing efficiency, dating back as far as Eli Whitney and his concept of using interchangeable parts in firearms in 1801 (Whitney’s role in the creation of interchangeable parts is usually exaggerated, as he was not the first person to propose the method and was never actually able to implement it himself, but he was responsible for introducing the concept to the US Congress and therefore still deserves some credit for its subsequent adoption in the United States).  By the 1880s, this fascination with efficiency had evolved into the “scientific management” principles of Frederick Taylor that aimed to identify best practices through rational, empirical study and employ standardization and training to eliminate waste and inefficiency on the production line.  Before long, these ideals had penetrated the domain of the white-collar worker through the concept of “office rationalization,” in which managers introduced new technologies and systems to maximize productivity in that setting as well.

The first major advance in the drive for office automation was the invention of a practical typewriter.  While several inventors created typing machines in the early nineteenth century, none of these designs gained any traction in the marketplace because using them was slower than writing out a document by hand.  In 1867, however, a retired newspaper editor named Christopher Latham Sholes was inspired by an article in Scientific American describing a mechanical typing device to create one of his own.  By the next year Sholes, with the help of amateur mechanic Carlos Glidden and printer Samuel Soule, had created a prototype for a typing machine using a keyboard and type-basket design that finally allowed typing at a decent speed.  After Soule left the project, Sholes sent typewritten notes to several financiers in an attempt to raise capital to refine the device and prepare for mass production.  A Pennsylvania businessman named James Densmore answered the call and provided the funding necessary to make important improvements such as replacing a frame to hold the paper with a rotating drum and changing the layout of the keyboard to the familiar QWERTY orientation — still used on computer keyboards to this day — to cut down on jamming by spacing out commonly used letters in the typing basket.

After several failed attempts to mass produce the typewriter through smaller companies in the early 1870s, Densmore was able to attract the interest of Philo Remington of the small-arms manufacturer E. Remington & Sons, which had been branching out into other fields such as sewing machines and fire engines in the aftermath of the U.S. Civil War.  First introduced by Remington in 1874, the typewriter sold slowly at first, but as office rationalization took hold in the 1880s, businesses started flocking to the machine.  By 1890 Remington had a virtual monopoly on the new industry and was producing 20,000 machines a year.  In addition to establishing the typewriter in the office, Remington also pioneered the idea of providing after-market service for office products, opening branch offices in major cities where people could not only buy typewriters, but also bring them in for repairs.

With typed loose-leaf pages replacing the traditional “letter book” for office correspondence, companies soon found it necessary to adopt new methods for storing and retrieving documents.  This led to the development of vertical filing using hanging folders stored in upright cabinets, which was first publicly demonstrated by Melvil Dewey at the Chicago World’s Fair in 1893.  While vertical filing proved superior to the boxes and drawers previously employed in the workplace, however, it proved woefully inefficient once companies evolved from tracking hundreds of records to tens of thousands.  This time the solution came from James Rand, Sr., a clerk from Tonawanda, New York, who patented a visible index system in which colored signal strips and tabs allowed specific file folders to be found quickly and easily.  Based on this invention, Rand established the Rand Ledger Company in 1898.  His son, James Rand, Jr., joined the business in 1908 and then split off from his father in 1915 after a dispute over advertising spending to market his own record retrieval system based around index cards called the Kardex System.  As the elder Rand neared retirement a decade later, his wife orchestrated a reconciliation between him and his son, and their companies merged to form the Rand Kardex Company in 1925.  Two years later, Rand Kardex merged with the Remington Typewriter Company to form Remington Rand, which became the largest business machine company in the world.

A Burroughs “adder-lister,” one of the first commercially successful mechanical calculators

A second important invention of the late nineteenth century was the first practical calculator.  Mechanical adding machines had existed as far back as the 17th century, when Blaise Pascal completed his Pascaline in 1645 and Gottfried Leibniz invented the first calculator capable of performing all four basic functions, the Stepped Reckoner, in 1694, but the underlying technology remained fragile and unreliable and therefore unsuited to regular use despite continued refinements over the next century.  In 1820, the calculator was commercialized for the first time by Thomas de Colmar, but production of his Arithmometer lasted only until 1822.  After making several changes, Thomas began offering his machine to the public again in 1851, but while the Arithmometer gained a reputation for both sturdiness and accuracy, production never exceeded a few dozen a year over the next three decades as the calculator remained too slow and impractical for use in a business setting.

The main speed bottleneck of the early adding machines was that they all required the setting of dials and levers to use, making them far more cumbersome for bookkeepers than just doing the sums by hand.  The man who first solved this problem was Dorr Felt, a Chicago machinist who replaced the dials with keys similar to those found on a typewriter.  Felt’s Comptometer, completed in 1885, arranged keys labelled 0 to 9 across ten columns that each corresponded to a single digit of a number, allowing figures to be entered rapidly with just one hand.  In 1887, Felt formed the Felt & Tarrant Manufacturing Company with a local manufacturer named Robert Tarrant to mass produce the Comptometer, and by 1900 they were selling over a thousand a year.

While Felt remained important in the calculator business throughout the early twentieth century, he was ultimately eclipsed by another inventor.  William S. Burroughs, the son of a St. Louis mechanic, was employed as a clerk at a bank but suffered from health problems brought on by spending hours hunched over columns adding figures.  Like Felt, he decided to create a mechanical adding machine using keys to improve this process, but he also added another key advance to his “adder-lister,” the ability to print the numbers as they were entered so there would be a permanent record of every financial transaction.  In 1886, Burroughs established the American Arithmometer Company to market his adding machine, which was specifically targeted at banks and clearing houses and was selling at a rate of several hundred a year by 1895.  Burroughs died in 1898, but the company lived on and relocated to Detroit in 1904 after it outgrew its premises in St. Louis, changing its name to the Burroughs Adding Machine Company in honor of its founder.  At the time of the move, Burroughs was selling 4,500 machines a year.  Just four years later, that number had risen to 13,000.

John H. Patterson, founder of the National Cash Register Company (NCR)

The adding machine was one of two important money management devices invented in this period, with the other being the mechanical cash register.  This device was invented in 1879 by James Ritty, a Dayton saloon owner who feared his staff was stealing from him, and constructed by his brother, John.  Inspired by a tool that counted the revolutions of the propeller on a steamship, “Ritty’s Incorruptible Cashier” required the operator to enter each transaction using a keypad, displayed each total entered for all to see, and printed the results on a roll of paper, allowing the owner to compare the cash taken in to the recorded amounts.  Ritty attempted to interest other business owners in his machine, but proved unsuccessful and ultimately sold the business to Jacob Eckert of Cincinnati in 1881.  Eckert added a cash drawer to the machine and established the National Manufacturing Company, but he was barely more successful than the Rittys.  Therefore, in 1884 he sold out to John Patterson, who established the National Cash Register Company (NCR).

John Henry Patterson was born on a farm outside Dayton, Ohio, and entered the coal trade after graduating from Dartmouth College.  While serving as the general manager of the Southern Coal and Iron Company, Patterson was tasked with running the company store and became one of Ritty’s earliest cash register customers.  After being outmaneuvered in the coal trade, Patterson sold his business interests and used the proceeds to buy NCR.  A natural salesman, Patterson created or popularized nearly every important modern sales practice while running NCR.  He established sales territories and quotas for his salesmen, paid them a generous commission, and rewarded those who met their quotas with an annual sales convention.  He also instituted formal sales training and produced sales literature that included sample scripts, creating the first known canned sales pitch.  Like Remington, he established a network of dealerships that provided after-market services to build customer loyalty, but he also advertised through direct mailings, another unusual practice.  Understanding that NCR could only stay on top of the business by continuing to innovate, Patterson also established an “innovations department” in 1888, one of the earliest permanent corporate research & development organizations in the world.  In an era when factory work was mostly still done in crowded “sweatshops,” Patterson constructed a glass-walled factory that let in ample light set amid beautifully landscaped grounds.

While Patterson seemed to genuinely care for the welfare of his workers, however, he also had a strong desire to control every aspect of their lives.  He manipulated subordinates constantly, hired and fired individuals for unfathomable reasons, instituted a strict physical fitness regimen that all employees were expected to follow, and established rules of conduct for everything from tipping waiters to buying neckties.  For all his faults, however, his innovative sales techniques created a juggernaut.  By 1900, the company was selling 25,000 cash registers a year, and by 1910 annual sales had risen to 100,000.  By 1928, six years after Patterson’s death, NCR was the second largest office-machine supplier in the world with annual sales of $50 million, just behind Remington Rand at $60 million and comfortably ahead of number three Burroughs at $32 million.  All three companies were well ahead of the number four company, a small firm called International Business Machines, or IBM.

Computing, Tabulating, and Recording

IBM, which eventually rose to dominance in the office machine and data processing industries, cannot be traced back to a single origin, for it began as a holding company that brought together several firms specializing in measuring and processing information.  There were three key people responsible for shaping the company in its early years: Herman Hollerith, Charles Flint, and Tom Watson, Sr.

416px-Hollerith

Herman Hollerith, whose tabulating machine laid the groundwork for the company that became IBM

Born in Buffalo, New York, in 1860, Herman Hollerith pursued an education as a mining engineer, culminating in a Ph.D. from Columbia University in 1890.  One of Hollerith’s professors at Columbia also served as an adviser to the Bureau of the Census in Washington, introducing Hollerith to the largest data processing organization in the United States.  At the time, the Census Bureau was in crisis as traditional methods of processing census forms failed to keep pace with a growing population.  The 1880 census, processed entirely by hand using tally sheets, took the bureau seven years to complete.  With the population of the country continuing to expand rapidly, the 1890 census appeared poised to take even longer.  To attack this problem, the new superintendent of the census, Robert Porter, held a competition to find a faster and more efficient way to count the U.S. population.

Three finalists demonstrated solutions for Porter in 1889.  Two of them created systems using colored ink or cards to allow data to be sorted more efficiently, but these were still manual systems.  Hollerith on the other hand, inspired by the ticket punches used by train conductors, developed a system in which the statistical information was recorded on punched cards that were quickly tallied by a tabulating machine of his own design.  Cards were placed in this machine one at a time and pressed with an apparatus containing 288 retractable pins.  Any pin that encountered a hole in the card would complete an electrical circuit and advance one of forty tallies.  Using Hollerith’s machines, the Census Bureau was able to complete its work in just two and a half years.

As the 1890 census began to wind down, Hollerith re-purposed his tabulating system for use by businesses and incorporated the Tabulating Machine Company in December 1896.  He remained focused on the census, however, until President McKinley’s assassination in 1901 resulted in the appointment of a new superintendent who chose to go with a different company for 1910.  In the meantime, Hollerith refined his system by implementing a three-machine setup consisting of a keypunch to put the holes in the cards, a tabulator to tally figures, and a sorting machine to place the cards in sequence.  By 1911, Hollerith had roughly one hundred customers and the business was continuing to expand, but his health was failing, leading him to entertain an offer to sell from an influential financier named Charles Flint.

Charles Ranlett Flint, the man who forged IBM

Charles Ranlett Flint was a self-made man, born into a family of shipbuilders, who started his first business at 18 on the docks of his hometown of Thomaston, Maine.  From there, he secured a job with a trader named William Grace by offering to work for free.  In 1872, Grace made Flint a partner in his new W.R. Grace & Co. shipping and trading firm, which still exists today as a chemical and construction materials conglomerate.  During this period, Flint acted as a commission agent in South America dealing in both arms and raw materials.  He also became keenly interested in new technologies such as the automobile, light bulb, and airplane.

In 1892, Flint leveraged his international trading contacts to pull together a number of rubber exporters into a trust called U.S. Rubber.  This began a period of intense monopoly building by Flint across a number of industries.  By 1901, Flint’s growing roster of trusts included the International Time Recording Company (ITR) of Endicott, New York, based around the recently invented time clock that allowed employers to easily track the hours worked by their employees, and the Computing Scale Company of America of Dayton, Ohio, based around scales that would both weigh items by the pound and compute their total cost.  While ITR proved modestly successful, however, the Computing Scale Company ended in abject failure.  In an attempt to salvage his poorly performing concern, Flint decided to define a new, larger market of information recording machines for businesses and merge ITR and Computing Scale under the umbrella of a single holding company.  Feeling Hollerith’s company fit well into this scheme, Flint purchased it as well in 1911 and folded the three companies into the new Computing-Tabulating-Recording Company (C-T-R).  The holding company approach did not work, however, as C-T-R was an unwieldy organization consisting of three subsidiaries spread across five cities with managers who ignored each other at best and actively plotted against each other at worst.  Furthermore, the company was saddled with a large debt, and its component parts could not leverage their positions in a trust to create superior integration or economies of scale because their products and customers were too different.  By 1914, C-T-R was worth only $3 million and carried a debt of $6.5 million.  Flint’s experiment had clearly failed, so he brought in a new general manager to turn the company around.  That man was Thomas Watson, Sr.


Thomas Watson, Sr., the man who built IBM into a corporate giant

By the time Flint hired him for C-T-R, Watson already had a reputation as a stellar salesman, but was also tainted by a court case brought over monopolistic practices.  Born on a farm in south central New York State, Watson tried his hand as both a bookkeeper and a salesman with various outfits, but had trouble holding down steady employment.  After his latest venture, a butcher’s shop in Buffalo, failed in 1896, Watson trudged down to the local NCR office to transfer the installment payments on the store’s cash register to the new owner.  While there, he struck up a conversation with a salesman named John Range and kept pestering him periodically until Range finally offered him a job.  Within nine months, Watson went from sales apprentice to full sales agent as he finally seemed to find his calling.  Four years later, he was transferred to the struggling NCR branch in Rochester, New York, which he managed to turn around.  This brought him to the attention of John Patterson in Dayton, who tapped Watson for a special assignment.

By 1903, when Patterson summoned Watson, NCR was experiencing fierce competition from a growing second-hand cash register market.  NCR cash registers were both durable and long-lasting, so enterprising businessmen had begun buying up used cash registers from stores that were upgrading or going out of business and then undercutting NCR’s prices on new machines.  For the controlling monopolist Patterson, this was unacceptable.  His solution was to create his own used cash register business that would buy old machines for higher prices than other outlets and sell them cheaper, making up the lost profits through funding directly from NCR.  Once the competition had been driven out of business, prices could be raised and the business would start turning a profit.  Patterson tapped Watson to control this business.  For legal reasons, Patterson kept the connection between NCR and the new Watson business a secret.

Between 1903 and 1908, Watson slowly expanded his used cash register business across the country, creating an excellent new profit center for NCR.  His reward was a posting back at headquarters in Dayton as an assistant sales manager, where he soon became Patterson’s protégé and absorbed his innovative sales techniques.  By 1910, Watson had been promoted to sales manager, where his personable and less controlling management style created a welcome contrast to Patterson and encouraged flexibility and creativity among the 900-strong NCR sales force, helping to double the company’s 1909 sales within two years.

As quickly as Watson rose at NCR, however, he fell even faster.  In 1912 the Taft administration, amid a general crusade against corporate trusts, brought criminal charges against Patterson, Watson, and other high-ranking NCR executives for violations of the Sherman Anti-Trust Act.  At the end of a three-month trial, Watson was found guilty along with Patterson and all but one of their co-defendants on February 13, 1913, and faced the prospect of jail time.  Worse, the ordeal appears to have soured the ever-changeable Patterson on the executives indicted with him, as they were all chased out of the company within a year.  Watson himself departed NCR in November 1913 after 17 years of service.  Some accounts state that Watson was fired, but it appears that the separation was more by mutual agreement.  Either way, it was a humbled and disgraced Watson that Charles Flint tapped to save C-T-R in early 1914.  Things began looking up the next year, however, when an appeal resulted in an order for a new trial.  All the defendants save Watson settled with the government, which decided pursuing Watson alone was not worth the effort.  Thus cleared of all wrongdoing, Watson was elevated to the presidency of C-T-R.

Watson saved and reinvented C-T-R through a combination of Patterson’s techniques and his own charisma and personality.  He reinvigorated the sales force through quotas, generous commissions, and conventions much like Patterson.  A lover of the finer things in life, he insisted that C-T-R staff always be impeccably dressed and polite, shaping the popular image of the blue-suited IBM sales person that would last for decades.  He changed the company culture by emphasizing the importance of every individual in the corporation and building a sense of company pride and loyalty.  Finally, he was fortunate to take over at a time when the outbreak of World War I and a booming U.S. economy led to increased demand for tabulating machines both from businesses and the U.S. government.  Between 1914 and 1917, revenues doubled from $4.2 million to $8.3 million, and by 1920 they had reached $14 million.

What really set IBM apart, however, was the R&D operation Watson established based on the model of NCR’s innovations department.  At the time Watson arrived, C-T-R remained the leading seller of tabulating machines, but the competition was rapidly gaining market share on the back of superior products.  Hollerith, who remained as a consultant to C-T-R after Flint bought his company, showed little interest in developing new products, causing the company’s technology to fall further and further behind.  The company’s only other senior technical employee, Eugene Ford, occasionally came up with improvements, but he could not actually put them into practice without the approval of Hollerith, which was rarely forthcoming.  Watson moved Ford into a New York loft and ordered him to begin hiring additional engineers to develop new products.

Ford’s first hire, Clair Lake, developed the company’s first printing tabulator in the early 1920s, which gave the company a machine that could rival the competition in both technology and user friendliness.  Another early hire, Fred Carroll from NCR, developed the Carroll Press, which allowed C-T-R to cheaply mass produce the punched cards used in the tabulating machines and therefore enjoy a huge profit margin on the product.  In the late 1920s, Lake created a new patentable punched-card design that would only work in IBM machines, which locked in customers and made them unlikely to switch to a competing company and have to redo millions of cards.  Perhaps the most important hire was James Bryce, who joined the company in 1917, rose to chief engineer in 1922, and ended up with over four hundred patents to his name.

After a brief hiccup in 1921-22 as the U.S. endured a recession, C-T-R, which Watson renamed International Business Machines (IBM) in 1924, experienced rapid growth for the rest of the decade, reaching $20 million in revenue by 1928.  While this placed IBM behind Remington Rand, NCR, and Burroughs, the talented R&D group and highly effective sales force built by Watson left the company perfectly poised to rise to a dominant position in the 1930s and subsequently conquer the new computer market of the 1950s.