A Brief History of Enterprise Computing — Part 2
In Part 1 of this three-part series, I focused on early systems and technologies (1880–1970).
In Part 2, I focus on major advances in computing (1970–1995), including:
- The rise of an independent software industry
- Minicomputers and the personal computer (PC)
- The graphical user interface (GUI)
- PCs as “thick” clients
The rise of an independent software industry
Initially, programming a computer meant expressing the program at the lowest level using machine language. Machine languages are generally far removed from how logical or mathematical problems are typically represented. Thus, programming was a highly specialized endeavor. To make programming more widely accessible, higher-level languages (HLLs) were necessary. HLLs could then be translated (compiled) by the computer into machine language.
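To make the contrast concrete, here is a minimal sketch, written in modern Python purely for illustration (the values and variable names are my own, and an early programmer would of course have used FORTRAN or COBOL rather than Python). One readable high-level statement does the work; the compiled machine-language equivalent would be a sequence of numeric opcodes referring to registers and memory addresses rather than named quantities.

```python
# A minimal illustration (in Python, for comparison only) of how a high-level
# language expresses business logic versus machine language.

principal = 1000.00   # amount on deposit
rate = 0.05           # annual interest rate

# One readable high-level statement:
interest = principal * rate

# A compiler for an HLL such as FORTRAN turns a statement like this into
# machine language: numeric opcodes that load a value from a memory address
# into a register, multiply it by another value, and store the result back,
# with no names involved, only addresses and register numbers.
print(f"Interest for the year: {interest:.2f}")
```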
IBM began developing FORTRAN, the first widely used general-purpose programming language, in 1954 and delivered its first compiler in 1957. COBOL (Common Business-Oriented Language), developed by an industry consortium, followed in 1959. Most first-generation business applications were written in either FORTRAN or COBOL, and programs written in these languages persist today, mainly as legacy applications.
Early business applications were custom-written for each customer’s needs. IBM held a tight grip on software, bundling it as part of its hardware sales. In the early 1970s, under antitrust pressure, IBM unbundled software from its hardware offerings, paving the way for an independent software industry.
As the demand for computers grew, so did the demand for software applications. Firms specializing in software contracting met this initial demand. During the 1970s, these firms transitioned from supplying programming services to selling software products, including applications such as payroll processing, accounting, and human resources management.
Minicomputers and the personal computer (PC)
The dominance of mainframes eventually yielded to smaller, less expensive, and increasingly powerful minicomputers. Minicomputers first emerged in the late 1950s and peaked in popularity during the 1970s. Their relatively low cost compared to mainframes made them accessible to many more organizations.
Image source: Wikipedia
The popularity of minicomputers, however, quickly declined with the advent of the personal computer (PC) — introduced by IBM in 1981. From a technical architecture standpoint, the IBM PC is a microcomputer. Whereas the minicomputer’s central processing unit (CPU) consisted of several components, microcomputer CPUs used a single integrated microprocessor chip.
Intel invented the microprocessor in 1971, and microcomputers existed before the IBM PC. Early microcomputers were primarily targeted at computer hobbyists and programmers, and many were available as DIY kits. However, manufacturers soon started producing microcomputers targeted more broadly at the home market. These included the Apple II, the Commodore PET, and the TRS-80 — all of which were introduced in 1977.
Image source: Wikipedia
The “1977 trinity” of microcomputers took a turn toward the business market with the introduction of VisiCalc (Visible Calculator), the first spreadsheet program for microcomputers. Initially released for the Apple II in 1979, it was subsequently ported to the Commodore PET and TRS-80. VisiCalc was a killer app: people bought computers just so they could run it.
Seeing the incursion of VisiCalc and microcomputers into the business market, IBM took action. On August 12, 1981, it introduced its own microcomputer: the Model 5150. To differentiate this computer more clearly from its other offerings, IBM promoted it as a “personal computer.” While it sold well in the home market, it was a much bigger hit in the business market; by 1984, IBM had sold millions of units.
Image source: Wikipedia
Famously, the IBM Model 5150 ran on a combination of an Intel 8088 microprocessor and an operating system provided by Microsoft (MS-DOS, on top of which early versions of Windows would later run). IBM designed the PC with an open architecture, which led to the development of compatible software and “IBM-compatible” hardware. The real winners in this arrangement were Intel and Microsoft, which held virtual monopolies in the PC market as the “Wintel standard” took hold.
The graphical user interface (GUI)
Early PCs were command-line driven, meaning users issued commands using keystrokes on the keyboard (there was no mouse).
Image source: Wikipedia
The command-line interface (CLI) lives on today and has advantages for some computing tasks (such as computer programming). Even if you’ve never used a CLI, you might be familiar with it as the Windows command prompt or the macOS terminal.
The disadvantage of the CLI is that it can have a steep learning curve: users must type commands exactly and, to work efficiently, memorize them. A major step forward in the usability and learnability of PCs, therefore, was the development of the graphical user interface (GUI).
In the early 1970s, researchers at Xerox PARC (Palo Alto Research Center) built the experimental Alto workstation, which drew on earlier research into graphical interfaces. Xerox produced about 2,000 Alto workstations, used primarily within Xerox, with about 500 units in use at universities.
The Alto paved the way for the Xerox Star (released in 1981), the first commercially available GUI-based computer. While the Xerox Star did not sell well, its interface was a leap forward from the Alto; its windows, folders, and trash can are all familiar to us today.
Image source: PC Magazine
This 1982 video demonstrates the Xerox Star user interface.
Meanwhile, in 1976, Steve Jobs and Steve Wozniak founded Apple Computer. Their first commercially successful product was the Apple II, released in 1977. In 1979, as Apple was developing its next-generation computer (the Lisa), Steve Jobs visited Xerox PARC and saw the GUI work in progress. Excited about the possibilities, he eventually negotiated a deal that involved the Lisa engineering team receiving demonstrations of the work at Xerox PARC. When Apple introduced the Lisa in early 1983, it had a fully graphical user interface heavily influenced by the work done at Xerox PARC. However, the Lisa also introduced its own innovations, including the ability to drag and drop files and a double-click interaction with a mouse.
The Lisa, however, had not been an easy project at Apple. It was technically complex from both a hardware and a software perspective, and cost-cutting measures left its interface sluggish. Even then, it cost $9,995 (roughly $30,000 in 2023 dollars). The Lisa did not sell well and was quickly eclipsed by another project at Apple: the Macintosh.
Development of the Macintosh ran parallel to work on the Lisa. Unlike the Lisa, Apple intended the Macintosh to be a computer for the masses, using the Lisa’s graphical interface but in a much more affordable computer. When released in 1984, the Macintosh was revolutionary, and its interface was to become the standard-bearer for future computer generations.
Image source: Wikipedia
Apple intended the Macintosh to mount a serious challenge to IBM’s then-dominant position in the personal computer market. Ultimately, however, IBM would not dominate the PC market; that role fell to the Wintel standard forged by the alliance between Intel and Microsoft.
Eventually, Microsoft would adopt a graphical user interface, though it took a couple of attempts (neither Windows 1.0 nor Windows 2.0 was successful). In 1990, Microsoft released Windows 3.0, which provided an experience much more on a par with the Macintosh. Windows 3.0 was a hit for Microsoft, selling millions of copies.
Image source: Wikipedia
Though Apple sued Microsoft for copyright infringement over Windows’ use of a graphical user interface, it lost the case, and Microsoft went on to dominate the market for GUI-based PCs with Windows.
The impact of the GUI on enterprise computing is hard to overestimate. It ushered in the era of modern enterprise applications, including manufacturing resource planning (MRP), enterprise resource planning (ERP), customer relationship management (CRM), and human resources (HR) management. These applications were installed on PCs but still relied on a larger centralized system for data storage and heavy processing jobs, using a type of client-server architecture.
PCs as “thick” clients
Early enterprise software ran on a mainframe server, with users accessing the software through so-called dumb terminals (terminals with minimal processing power). Recall the SABRE airline reservation system: first deployed in 1960, it consisted of a set of agent stations connected via phone lines to mainframe systems in New York, which ran the SABRE software and did all the processing. This dumb-terminal architecture dominated until the 1980s, when organizations began deploying PCs to run early word processors and spreadsheets.
Initially, dumb terminals and PCs coexisted side by side. However, the ascendancy of PCs led to a client-server architecture in which PCs served as “thick” clients: application software was installed on each machine, and much of the processing happened locally. The PCs communicated with the server as needed to retrieve data or to offload larger data-processing jobs. Throughout the 1990s, enterprise systems were deployed using this type of client-server architecture, and some of these applications live on in organizations even today, usually as aging legacy applications awaiting retirement.
Image source: ERP Focus
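To make the thick-client idea concrete, here is a minimal sketch in Python. The server URL, endpoint, and field names are hypothetical, invented purely for illustration; the point is the division of labor, with the central server storing and serving raw records while the PC-side application does the filtering, calculation, and formatting locally.

```python
# A minimal "thick client" sketch. The endpoint and field names below are
# hypothetical; the point is the division of labor between client and server.
import json
from urllib.request import urlopen

SERVER_URL = "http://erp.example.com/api/orders"  # hypothetical central server

def fetch_orders():
    """Retrieve raw order records from the central server (data storage)."""
    with urlopen(SERVER_URL) as response:
        return json.loads(response.read())

def report_open_orders(orders):
    """Processing done locally on the PC: filter, total, and format."""
    open_orders = [o for o in orders if o.get("status") == "open"]
    total = sum(o.get("amount", 0.0) for o in open_orders)
    print(f"{len(open_orders)} open orders, totaling {total:,.2f}")

if __name__ == "__main__":
    report_open_orders(fetch_orders())
```

In a real 1990s deployment the client would more likely have queried a relational database over a local network rather than a web endpoint, but the shape is the same: the data lives on the server, and the application logic runs on each desktop.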
As the 1990s came to a close, a new technical development was to change computing forever: the Internet.