The computer began life in the 1940s as a scientific device designed to perform complex calculations and solve difficult equations. In the 1950s, the United States continued to fund scientific computing projects at government organizations, defense contractors, and universities, many of them based around the IAS architecture derived from the EDVAC and created by John von Neumann’s team at Princeton. Some of the earliest for-profit computer companies emerged out of this scientific work, such as the previously discussed Engineering Research Associates; the Hawthorne, California-based Computer Research Corporation, which spun out of a Northrop Aircraft project to build a computer for the Air Force in 1952; and the Pasadena-based ElectroData Corporation, which spun out of the Consolidated Engineering Corporation that same year. All of these companies remained fairly small and did not sell many computers.
Instead, it was Remington Rand that identified the future path of computing when it launched the UNIVAC I, which was adopted by businesses to perform data processing. Once corporate America understood the computer to be a capable business machine and not just an expensive calculator, a wide array of office equipment and electronics companies entered the computer industry in the mid-1950s, often buying out the pioneering computer startups to gain a foothold. Remington Rand dominated this market at first, but as discussed previously, IBM soon vaulted ahead as it acquired computer design and manufacturing expertise through its participation in the SAGE project and unleashed its world-class sales and service organizations. Remington Rand attempted to compensate by merging with Sperry Gyroscope, which had both a strong relationship with the military and a more robust sales force, to form Sperry Rand in 1955, but the company never seriously challenged IBM again.
While IBM maintained its lead in the computer industry, by the beginning of the 1960s the company faced threats to its dominance at both the low end and the high end of the market from innovative machines based around new technologies like the transistor. Fearing these new challengers could significantly damage IBM, Tom Watson Jr. decided to bet the company on an expensive and technically complex project to offer a complete line of compatible computers that could not only be tailored to a customer’s individual needs, but could also be easily modified or upgraded as those needs changed over time. This gamble paid off handsomely, and by 1970 IBM controlled well over seventy percent of the market, with most of the remainder split among a group of competitors dubbed the “seven dwarfs” due to their minuscule individual market shares. In the process, IBM succeeded in transforming the computer from a luxury item operated only by the largest firms into a necessary business appliance as computers became an integral part of society.
Note: Yet again we have a historical interlude post that summarizes key events outside of the video game industry that nevertheless had a significant impact upon it. The information in this post is largely drawn from Computer: A History of the Information Machine by Martin Campbell-Kelly and William Aspray, A History of Modern Computing by Paul Ceruzzi, Forbes Greatest Technology Stories: Inspiring Tales of the Entrepreneurs and Inventors Who Revolutionized Modern Business by Jeffrey Young, IBM’s Early Computers by Charles Bashe, Lyle Johnson, John Palmer, and Emerson Pugh, and Building IBM: Shaping an Industry and Its Technology by Emerson Pugh.
IBM Embraces the Transistor
The IBM 1401, the first mainframe to sell over 10,000 units
Throughout most of its history in computers, IBM has been known more for evolution than revolution. Rarely first with a new concept, IBM excelled at building designs based around proven technology and then turning its sales force loose to overwhelm the competition. Occasionally, however, IBM engineers have produced important breakthroughs in computer design. Perhaps none of these were more significant than the company’s invention of the disk drive.
On the earliest computers, mass data storage was accomplished through two primary methods: magnetic tape and magnetic drums. Tape could hold a large amount of data for the time, but it could only be read serially, and it was a fragile medium. Drums were more durable and had the added benefit of being random access — that is, any point of data on the drum could be read at any time — but they were low capacity and expensive. As early as the 1940s, J. Presper Eckert had explored using magnetic disks rather than drums, which would be cheaper and offer greater storage capacity due to their larger surface area, but numerous technical hurdles remained to be ironed out. Foremost among these was the technology to read the disks. A drum memory array used rigid read-write heads that could be readily secured, though at high cost. A disk system required a more delicate stylus to read each disk, and the constant spinning of the disk created a high risk that the stylus would make contact with and damage it.
The team that finally solved these problems at IBM worked not at the primary R&D labs in Endicott or Poughkeepsie, but at a relatively new facility in San Jose, California. Led by IBM veteran Reynold Johnson, the San Jose lab had been established in 1952 as an advanced technologies research center free of the influence of the IBM sales department, which had often shut down projects with no immediate practical use. One of the lab’s first projects was to improve storage for IBM’s existing tabulating equipment. This task fell to a team led by Arthur Critchlow, who decided based on customer feedback to develop a new random access solution that would allow IBM’s tabulators and low-end computers to be useful not only for data processing, but also for more complicated jobs like inventory management. After testing a wide variety of memory solutions, Critchlow’s team settled on the magnetic disk as the only viable option, partially inspired by a similar project at the National Bureau of Standards described in an article published in August 1952.
To solve the stylus problem on the drive, Critchlow’s team attached a compressor to the unit that pumped a thin layer of air between the disk and the head. Later models would take advantage of a phenomenon known as the “boundary layer,” in which the fast motion of the disks themselves generated the air cushion. After experimenting with a variety of head types and positions throughout 1953 and 1954, the team was ready to complete a final design. Announced in 1956 as the Model 305 Disk Storage Unit and later marketed as RAMAC (for Random Access Method of Accounting and Control), IBM’s first disk drive consisted of fifty 24-inch-diameter aluminum disks rotating at 1200 rpm with a storage capacity of five million characters. Marketed as an add-on to the IBM 650, RAMAC revolutionized data processing by eliminating the time-consuming process of manually sorting information and provided the first compelling reason for small and mid-sized firms to embrace computers and abandon electro-mechanical tabulating equipment entirely.
The IBM 7090, the company’s first transistorized computer
In August 1958, IBM introduced its latest scientific computer, the IBM 709, which improved on the functionality of the IBM 704. The 709 continued to depend on vacuum tubes, however, even as competitors were starting to bring the first transistorized computers to market. Tom Watson Jr. and his director of engineering, Wally McDowell, were both excited by the possibilities of transistors from the moment they first learned about them and as early as 1950 had charged Ralph Palmer’s Poughkeepsie laboratory with working on the devices, but individual project managers continued to have the final authority in choosing what parts to use in their machines, and many of them continued to fall back on the more familiar vacuum tube. In the end, Watson had to issue a company-wide mandate in October 1957 that transistors were to be incorporated into all new projects. Even before that mandate, Palmer felt that IBM needed a massive project to push its solid-state designs forward, something akin to what Project SAGE had done for IBM’s efforts with vacuum tubes and core memory. He therefore teamed with Steve Dunwell, who had spent part of 1953 and 1954 in Washington, D.C., assessing government computing requirements, to propose a high-speed computer tailored to the ever-increasing computational needs of the military-industrial complex. A contract was eventually secured with the National Security Agency, and IBM approved “Project Stretch” in August 1955; the project was formally established in January 1956 with Dunwell in charge.
Project Stretch experienced a long, difficult, and not completely successful development cycle, but it did achieve Palmer’s goals of greatly improving IBM’s solid-state capabilities, with particularly important innovations including a much faster core memory and a “drift transistor” that was faster than the surface-barrier transistor used in early solid-state computing projects like the TX-0. As work on Stretch dragged on, however, these advances were first introduced commercially through another product. In response to Sputnik, the United States Air Force quickly initiated a new Ballistic Missile Early Warning System (BMEWS) project that, like SAGE, would rely on a series of linked computers. The Air Force mandated, however, that these computers incorporate transistors, so Palmer offered to build a transistorized version of the 709 to meet the project’s needs. The resulting IBM 7090 Data Processing System, deployed in November 1959 as IBM’s first transistorized computer, provided a six-fold increase in performance over the 709 at only one-third additional cost. In 1962, an upgraded version dubbed the 7094 was released with a price of roughly $2 million. Both computers were well-received, and IBM sold several hundred of them.
Despite the success of its mainframe computer business, IBM in 1960 still derived the majority of its sales from the traditional punched-card business. While some larger organizations were drawn to the 702 and 705 business computers, their price kept them out of reach of the majority of IBM’s business customers. Some of these organizations had embraced the low-cost 650 as a data processing solution, leading to over 800 installations of the computer by 1958, but it was actually more expensive and less reliable than IBM’s mainline 407 electric accounting machine. The advent of the transistor, however, finally provided the opportunity for IBM to leave its tabulating business behind for good.
The impetus for a stored-program computer that could displace traditional tabulating machines initially came from Europe, where IBM did not sell its successful 407 due to import restrictions and high tooling costs. In 1952, the French competitor Compagnie des Machines Bull introduced a new calculating machine, the Bull Gamma 3, that used delay-line memory to provide greater storage capacity at a cheaper price than IBM’s electronic calculators and could be joined with a card reader to create a faster accounting machine than anything IBM offered in the European market. IBM’s French and German subsidiaries therefore began lobbying for a new accounting machine to counter this threat. This led to the launch of two projects in the mid-1950s: the modular accounting calculator (MAC) development project in Poughkeepsie that birthed the 608 electronic calculator and the expensive and relatively unsuccessful 7070 transistorized computer, and the Worldwide Accounting Machine (WWAM) project run out of France and Germany to create an improved traditional accounting machine for the European market.
While the WWAM project had been initiated in Europe, it was soon reassigned to Endicott when the European divisions proved unable to come up with an accounting machine that could meet IBM’s cost targets. To solve this problem, Endicott engineer Francis Underwood proposed that a low-cost computer be developed instead. Management approved this concept in early 1958 under the name SPACE — for Stored Program Accounting and Calculating Equipment — and formally announced the product in October 1959 as the IBM 1401 Data Processing System. With a rental cost of only $2,500 a month (roughly equivalent to a purchase price of $150,000), the transistorized 1401 proved much faster and more reliable than an IBM 650 at a fraction of the cost and was only slightly more expensive than a mid-range 407 accounting machine setup. More importantly, it shipped with a new chain printer that could output 600 lines per minute, far more than the 150 lines per minute produced by the 407, which relied on obsolete prewar technology. When the 1401 first went on sale in 1960, IBM projected that it would sell roughly 1,000 units over the machine’s entire lifetime, but its combination of power and price proved irresistible, and by the end of 1961 over 2,000 machines had already been installed. IBM would eventually deploy 12,000 1401 computers before the model was officially withdrawn in 1971. Powered by the success of the 1401, IBM’s computer sales finally equaled the sales of punched-card products in 1962 and then quickly eclipsed them. No computer model had ever approached the success of the 1401 before, and as IBM rode the machine to complete dominance of the mainframe industry in the early 1960s, the powder-blue casing of the machine soon inspired a new nickname for the company: Big Blue.
The Honeywell 200, which competed with IBM’s 1401 and threatened to destroy its low-end business
In the wake of Remington Rand’s success with the UNIVAC I, more than a dozen old-line firms flocked to the new market. Companies like Monroe Calculating, Bendix, Royal, Underwood, and Philco rushed to provide computers to the business community, but one by one they fell by the wayside. Of these firms, Philco probably stood the best chance of success due to its invention of the surface-barrier transistor, but while its Transac S-1000 — which began life in 1955 as an NSA project called SOLO to build a transistorized version of the UNIVAC 1103 — and S-2000 computers were both capable machines, the company ultimately decided it could not keep up with the fast pace of technological development and abandoned the market like all the rest. By 1960, only five established companies and one computer startup remained alongside Sperry Rand in attempting to compete with IBM in the mainframe space. While none of these firms ever succeeded in stealing much market share from Big Blue, most of them found their own product niches and deployed some capable machines that ultimately forced IBM to rethink some of its core computer strategies.
Of the firms that challenged IBM, electronics giants GE and RCA were the largest, with revenues far exceeding those of the computer industry’s market leader, but in a way their size worked against them. Since neither computers nor office equipment were among either firm’s core competencies, nor integral to either firm’s future success, they never fully committed to the business and therefore never experienced real success. Unsurprisingly, they were the first of the seven dwarfs to call it quits, with GE selling off its computer business in 1970 and RCA following suit in 1971. Burroughs and NCR, the companies that had long dominated the adding machine and cash register businesses respectively, both entered the market in 1956 after buying out a small startup firm — ElectroData and Computer Research Corporation respectively — and managed to remain relevant by creating computers specifically tailored to their preexisting core customers: the banking sector for Burroughs and the retail sector for NCR. Sperry Rand ended up serving niche markets as well after failing to compete effectively with IBM, experiencing success in fields such as airline reservation systems. The biggest threat to IBM’s dominance in this period came from two Minnesota companies: Honeywell and Control Data Corporation (CDC).
Unlike the majority of the companies that persisted in the computer industry, Honeywell came not from the office machine business, but from the electronic control industry. In 1883, a man named Albert Butz created a device called the “damper flapper” that would sense when a house was becoming cold and cause the flapper on a coal furnace to rise, thus fanning the flames and warming the house. Butz established a company that did business under a variety of names over the next few years to market his innovation, but he had no particular acumen for business. In 1891, William Sweatt took over the company and increased sales through door-to-door selling and direct marketing. In 1909 the company introduced the first controlled thermostat, sold as the “Minnesota Regulator,” and in 1912 Sweatt changed the name of the company to the Minneapolis Heat Regulator Company. In 1927, a rival firm, Mark C. Honeywell’s Honeywell Heating Specialty Company of Wabash, Indiana, bought out Minneapolis Heat Regulator to form the Minneapolis-Honeywell Regulator Company with Honeywell as president and Sweatt as chairman. The company continued to expand through acquisitions over the next decade and weathered the Great Depression relatively unscathed.
In 1941, Harold Sweatt, who had succeeded Honeywell as president in 1934, parlayed his company’s expertise in precision measuring devices into several lucrative contracts with the United States military, and Honeywell emerged from World War II as a major defense contractor. This expertise led fellow defense contractor Raytheon to approach the company in 1954 about establishing a joint computer subsidiary. Incorporated as the Datamatic Corporation the next year, the computer company became a wholly owned subsidiary of Honeywell in 1957 when Raytheon followed so many other companies in exiting the computer industry. Honeywell delivered its first mainframe, the Datamatic 1000, that same year, but the computer relied on vacuum tubes and was therefore already obsolete by the time it hit the market. Honeywell temporarily withdrew from the business and went back to the drawing board. After IBM debuted the 1401, Honeywell triumphantly returned with the H200, which not only took advantage of the latest technology to outperform the 1401 at a comparable price, but also sported full compatibility with IBM’s wildly successful machine, meaning companies could transfer their existing 1401 programs without making any adjustments. Announced in 1963, the H200 threatened IBM’s control of the low end of the mainframe market.
William Norris (l) and Seymour Cray, the principal architects of the Control Data Corporation
While Honeywell chipped away at IBM from the bottom of the market, computer startup Control Data Corporation (CDC) — the brainchild of William Norris — threatened to do the same from the top. Born in Red Cloud, Nebraska, and raised on a farm, Norris became an electronics enthusiast at an early age, building mail-order radio kits and becoming a ham radio operator. After graduating from the University of Nebraska in 1932 with a degree in electrical engineering, Norris was forced to work on the family farm for two years due to a lack of jobs during the Depression before joining Westinghouse in 1934 to work in the sales department of the company’s x-ray division. Norris began doing work for the Navy’s Bureau of Ordnance as a civilian in 1940 and enjoyed the work so much that he joined the Naval Reserve and was called to duty at the end of 1941 at the rank of lieutenant commander. Norris served as part of the CSAW codebreaking operation and became one of the principal advocates for and co-founders of Engineering Research Associates after the war. By 1957, Norris was feeling stifled by the corporate environment at ERA parent company Sperry Rand, so he left to establish CDC in St. Paul, Minnesota.
Norris provided the business acumen at CDC, but the company’s technical genius was a fellow engineer named Seymour Cray. Born in Chippewa Falls, Wisconsin, Cray entered the Navy directly after graduating from high school in 1943, serving first as a radio operator in Europe before being transferred to the Pacific theater to participate in code-breaking activities. After the war, Cray attended the University of Minnesota, graduated with an electrical engineering degree in 1949, and went to work for ERA in 1951. Cray immediately made his mark by leading the design of the UNIVAC 1103, one of the first commercially successful scientific computers, and soon gained a reputation as an engineering genius able to create simple, yet fast computer designs. In 1957, Cray and several other engineers followed Norris to CDC.
Unlike some of the more conservative engineers at IBM, Cray understood the significance of the transistor immediately and worked to quickly incorporate it into his computer designs. The result was CDC’s first computer, the 1604, which was first sold in 1960 and significantly outperformed IBM’s scientific computers. Armed with Cray’s expertise in computer design, Norris decided to concentrate on building the fastest computers possible and selling them to the scientific and military-industrial communities, where IBM’s sales force exerted relatively little influence. As IBM’s Project Stretch floundered — never meeting its performance targets after being released as the IBM 7030 in 1961 — Cray moved forward with his plans to build the fastest computer yet designed. Released as the CDC 6600 in 1964, Cray’s machine could perform an astounding three million operations per second, three times as many as the 7030 and more than any other machine would manage until 1969, when another CDC machine, the 7600, outpaced it. Dubbed a supercomputer, the 6600 became the flagship product of a series of high-speed scientific computers that IBM proved unable to match. While Big Blue was ultimately forced to cede the top of the market to CDC, by the time the 6600 launched the company was in the final phases of developing a product line that would extend its dominance over the mainframe business and ensure competitors like CDC and Honeywell would be limited to niche markets.
The System/360 family of computers, which extended IBM’s dominance of the mainframe market through the end of the 1960s.
When Tom Watson Jr. finally assumed full control of IBM from his father, he inherited a corporate structure designed to collect as much power and authority in the hands of the CEO as possible. Unlike Watson Sr., Watson Jr. preferred decentralized management with a small circle of trusted subordinates granted the authority to oversee the day-to-day operation of IBM’s diverse business activities. Therefore Watson overhauled the company in November 1956, paring down the number of executives reporting directly to him from seventeen to just five, each of whom oversaw multiple divisions with the new title of “group executive.” He also formed a Corporate Management Committee consisting of himself and the five group executives to make and execute high-level decisions. While the responsibilities of individual group executives would change from time to time, this new management structure remained intact for decades.
Foremost among Watson’s new group executives was a vice president named Vin Learson. A native of Boston, Massachusetts, T. Vincent Learson graduated from Harvard with a degree in mathematics in 1935 and joined IBM as a salesman, where he quickly distinguished himself. In 1949, Learson was named sales manager of IBM’s Electric Accounting Machine (EAM) Division, and he rose to general sales manager in 1953. In April 1954, Tom Watson Jr. named Learson the director of Electronic Data Processing Machines with a mandate to solidify IBM’s new electronic computer business. After guiding early sales of the 702 computer and establishing an advanced technology group to incorporate core memory and other improvements into the 704 and 705 computers, Learson received another promotion to vice president of sales for the entire company before the end of the year. During Watson’s 1956 reorganization, he named Learson group executive of the Military Products, Time Equipment, and Special Engineering Products divisions.
During the reorganization, IBM’s entire computer business fell under the new Data Processing Division overseen by group executive L.H. LaMotte. As IBM’s computer business continued to grow and diversify in the late 1950s, however, it grew too large and unwieldy to contain within a single division, so in 1959 Watson split the operation in two by creating the Data Systems Division in Poughkeepsie, responsible for large systems, and the General Products Division, which took charge of small systems like the 650 and 1401 and incorporated IBM’s other laboratories in Endicott, San Jose, Burlington, Vermont, and Rochester, Minnesota. Watson then placed these two divisions, along with a new Advanced Systems Development Division, under Learson’s control, believing him to be the only executive capable of propelling IBM’s computer business forward.
Vin Learson, the IBM executive who spearheaded the development of the System/360
When Learson inherited the Data Systems and General Products Divisions, he was thrust into the middle of an all-out war for control of IBM’s computer business. The Poughkeepsie Laboratory had been established specifically to exploit electronics after World War II and prided itself on being at the cutting edge of IBM’s technology. The Endicott Laboratory, the oldest R&D division at the company, had often been looked down upon for clinging to older technology, yet by producing both the 650 and the 1401, Endicott was responsible for the majority of IBM’s success in the computer realm. By 1960, both divisions were looking to update their product lines with more advanced machines. That September, Endicott announced the 1410, an update to the 1401 that maintained backwards compatibility. At the same time, Poughkeepsie was hard at work on a new series of four compatible machines designed to serve a variety of business and scientific customers under the 8000 series designation. Learson, however, wanted to unify the product line from the very low end represented by the 1401 to the extreme high end represented by the 7030 and the forthcoming 8000 computers. By achieving full compatibility in this manner, IBM could take advantage of economies of scale to drive down the price of individual computer components and software development while also standardizing peripheral devices and streamlining the sales and service organizations, which would no longer have to learn multiple systems. Learson’s plan was sound in theory, but forcing two organizations that prided themselves on their independence and competed with each other fiercely to work together would not be easy.
Learson relied heavily on his power as a group executive to transfer employees across both divisions to achieve project unity. First, he moved Bob Evans, who had been the engineering manager for the 1401 and 1410, from Endicott to Poughkeepsie as the group’s new systems development manager. Already a big proponent of compatibility, Evans unsurprisingly recommended that the 8000 project be cancelled and a cohesive product line spanning both divisions be initiated in its place. The lead designer of the 8000 series, Frederick Brooks, vigorously opposed this move, so Learson replaced Brooks’s boss with another ally, Jerrier Haddad, who had led the design of the 701 and recently served as the head of Advanced Systems Development. Haddad sided with Evans and terminated the 8000 project in May 1961. Strong resistance remained in some circles, however, most notably from General Products Division head John Haanstra, so in October 1961, Learson assembled a task group called SPREAD (Systems, Planning, Review, Engineering, and Development) consisting of thirteen senior engineering and marketing managers to determine a long-term strategy for IBM’s data processing line.
On December 28, 1961, the SPREAD group delivered its final proposal to the executive management committee. In it, they outlined a series of five compatible processors representing a 200-fold range in performance. Rather than incorporate the new integrated circuit, the group proposed a proprietary IBM design called Solid Logic Technology (SLT), in which the discrete components of the circuit were mounted on a single ceramic substrate but were not fully integrated. By combining the five processors with SLT circuits and core memories of varying speeds, nineteen computer configurations would be possible, all fully compatible and interchangeable and able to connect to forty different peripheral devices. Furthermore, after surveying the needs of business and scientific customers, the SPREAD group realized that other than floating-point capability for scientific calculations, the needs of the two groups were nearly identical, so they chose to unify the scientific and business lines rather than market different models for each. Codenamed the New Product Line (NPL), the SPREAD proposal would allow IBM customers to buy a computer that met their current needs and then easily upgrade or swap components as those needs changed over time at a fraction of the cost of a new system, without having to rewrite all their software or replace their peripheral devices. While not everyone was convinced by the presentation, Watson ultimately authorized the NPL project.
The NPL project was perhaps the largest civilian R&D operation ever undertaken to that point. Development costs alone were $500 million, and when tooling, manufacturing, and other expenses were taken into account, the cost was far higher. Design of the five processor models was spread over three facilities, with Poughkeepsie developing the three high-end systems, Endicott developing the lowest-end system, and a facility in Hursley, England, developing the other system. At the time, IBM manufactured all its own components as well, so additional facilities were charged with churning out SLT circuits, core memories, and storage systems. To assemble all the systems, IBM invested in six new factories. In all, IBM spent nearly $5 billion to bring the NPL to market.
To facilitate the completion of the project, Watson elevated two executives to new high-level positions: Vin Learson assumed the new role of senior vice president of sales, and Watson’s younger brother, Arthur, who for years had run IBM’s international arm, the World Trade Corporation, was named senior vice president of research, development, and manufacturing. This new role was intended to groom the younger Watson to assume the presidency of IBM one day, but the magnitude of the NPL project coupled with his inexperience in R&D and manufacturing ultimately overwhelmed him. As the project fell further and further behind schedule, Learson had to replace Arthur Watson in order to see the project through to completion. It was therefore Learson who assumed the presidency of IBM in 1966, while Arthur Watson took on the new and largely honorary role of vice chairman. The failure to shepherd the NPL project ended any hope Arthur Watson had of continuing the Watson family legacy of running IBM, and he ultimately left the company in 1970 to serve as the United States ambassador to France.
In late 1963, IBM began planning the announcement of its new product line, which now went by the name System/360 — a name chosen because it represented all the points of a compass and emphasized that the product line would fill the needs of all computer users. Even at this late date, however, acceptance of System/360 within IBM was not assured. John Haanstra continued to push for an SLT upgrade to the existing 1401 line to satisfy low-end users, which other managers feared would perpetuate the incompatibility problems plaguing IBM’s existing product line. Furthermore, IBM executives struggled over whether to announce all the models at once and thus risk a significant drop in orders for older systems during the transition period, or to phase in each model over the course of several years. All debate ended when Honeywell announced the H200. Faced with losing customers to a more advanced computer fully compatible with IBM’s existing line, Watson decided in March 1964 to scrap the improved 1401 and launch the entire 360 product line at once.
On April 7, 1964, IBM held press conferences in sixty-three cities across fourteen countries to announce the System/360 to the world. Demand soon far exceeded supply: within the first two years the System/360 was on the market, IBM was only able to fill roughly 4,500 of 9,000 orders. Headcount at the company rose rapidly as IBM rushed to bring new factories online in response. In 1965, when actual shipments of the System/360 were just beginning, IBM controlled 65 percent of the computer market and had revenues of $2.5 billion. By 1967, as IBM ramped up to meet insatiable 360 demand, the company employed nearly a quarter of a million people and raked in $5 billion in revenues. By 1970, IBM had an installed base of 35,000 computers and held an ironclad grip on the mainframe industry with a market share between seventy and eighty percent; the next year company earnings surpassed $1 billion for the first time.
As batch processing mainframes, the System/360 line and its competitors did not serve as computer game platforms or introduce technology that brought the world closer to a viable video game industry. System/360 did, however, firmly establish the computer within corporate America and solidified IBM’s place as a computing superpower while facilitating the continuing spread of computing resources and the evolution of computer technology. Ultimately, this process would culminate in a commercial video game industry in the early 1970s.