A Brief History of Enterprise Computing — Part 3

Heidi Adkisson
Mar 7, 2023

In Part 1 of this three-part series on the history of enterprise computing, I focused on early developments (1880–1970).

In Part 2, I focused on major advancements in computing (1970–1995).

In Part 3, I focus on the modern computing environment (1995–present), including:

  • The internet
  • The world wide web
  • Software as a service (SaaS)
  • Smartphones and bring your own device (BYOD) culture
  • The consumerization of IT
  • Internet of Things (IoT)
  • Artificial Intelligence (AI)
  • Mainframes today

The internet

Organizations had been networking computers together since the late 1950s, when AT&T released the first commercial modem, allowing digital data to be transmitted over regular telephone lines. The telephone system uses circuit switching, where two nodes on the network establish a dedicated communications channel (circuit) for the duration of the communication session.

In the 1960s, researchers began exploring other, more efficient ways of connecting computers. ARPA (the Defense Department’s Advanced Research Projects Agency) became particularly interested in this research and funded the development of ARPANET, which used packet switching rather than circuit switching. Packet switching does not require a dedicated channel: instead, messages are broken into smaller data packets that are each routed independently along whatever path is most efficient (packets need not all follow the same route). Once all the packets arrive at the destination, they are reassembled into the original message. ARPANET launched as an experimental network in 1969, connecting research institutions so that they could more easily share information.
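
To make the idea concrete, here is a minimal sketch in Python of what packet switching does conceptually: a message is split into numbered packets, the packets may arrive in any order, and the sequence numbers let the receiver reassemble the original. The packet size and message here are invented for illustration; real protocols such as TCP/IP add addressing, error checking, and retransmission on top of this basic idea.

```python
# Conceptual illustration of packet switching (not a real network protocol).
import random

PACKET_SIZE = 16  # bytes per packet; an arbitrary size chosen for illustration

def to_packets(message: bytes) -> list[tuple[int, bytes]]:
    """Split a message into (sequence_number, chunk) packets."""
    return [
        (seq, message[i:i + PACKET_SIZE])
        for seq, i in enumerate(range(0, len(message), PACKET_SIZE))
    ]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Sort packets by sequence number and reconstruct the original message."""
    return b"".join(chunk for _, chunk in sorted(packets))

if __name__ == "__main__":
    original = b"Packets need not all follow the same route across the network."
    packets = to_packets(original)
    random.shuffle(packets)  # simulate packets arriving out of order
    assert reassemble(packets) == original
    print(f"{len(packets)} packets reassembled into the original message")
```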

A map showing the location of ARPANET access points (concentrated on the west and east coasts).
ARPANET access points (1970s)

Image source: Wikipedia

By the late 1970s, ARPANET had matured beyond its experimental state, and additional networks soon emerged, including NSFNET (the National Science Foundation Network, launched in 1986). The problem was that these networks could not talk to each other. As a result, networks eventually converged on a single protocol suite: TCP/IP (Transmission Control Protocol/Internet Protocol), which continues to undergird the internet. NSFNET eventually became the backbone of the internet, and ARPANET was officially decommissioned in 1990.

The pre-web internet was a text-based world — the domain of primarily technical users relying on functionality such as Usenet for sharing information, FTP for sharing files, IRC (Internet Relay Chat) for real-time communication, and email. The world wide web forever changed the internet by allowing documents to be shared in an interconnected way.

The world wide web

Development of the world wide web began in 1989, when Tim Berners-Lee and colleagues at CERN (an international scientific organization based in Geneva, Switzerland) were looking for a better way to share data, news, and documents over the internet. Berners-Lee is credited with developing the underlying technologies to accomplish this goal: a web server, a web browser, and a document markup language called Hypertext Markup Language (HTML). Web pages formatted in HTML had the then-unique characteristic of combining text, graphics, and hyperlinks. In 1993, Marc Andreessen developed the Mosaic web browser, which was easy to use and could be installed on PCs by average, everyday users.

The Mosaic web browser displaying an early web page.
Mosaic web browser (1994)

Image source: Wikipedia

Complete histories of the web are widely available, but suffice it to say that adoption of the web as a global, interconnected network presented previously unimagined possibilities. The technology running the web steadily improved. In the mid-2000s, the emergence of “Web 2.0” moved the web beyond static pages toward a more interactive and responsive experience, enabling the development of web-based applications.

Software as a service (SaaS)

Web-based applications led to a new way of delivering software to organizations: Software as a Service (SaaS). SaaS is, in many ways, a “back to the future” return to centralized computing: applications are centrally hosted and accessed by users through a web browser functioning as a thin client.
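
As a rough sketch of that thin-client model, the toy Python/Flask service below keeps all application logic and data on the server; the user’s browser simply requests a URL and renders the HTML that comes back. Flask and the account data are my own illustrative choices here, not how any particular SaaS vendor builds its product.

```python
# A toy centrally hosted web application: the SaaS model in miniature.
# All logic and data live on the server; the browser acts as a thin client.
# Requires Flask (pip install flask).
from flask import Flask

app = Flask(__name__)

# Illustrative data; a real SaaS product would store this in a database.
ACCOUNTS = [
    {"name": "Acme Corp", "status": "Active"},
    {"name": "Globex", "status": "Trial"},
]

@app.route("/accounts")
def list_accounts():
    # The server renders the view; the browser only displays it.
    rows = "".join(f"<li>{a['name']} ({a['status']})</li>" for a in ACCOUNTS)
    return f"<h1>Accounts</h1><ul>{rows}</ul>"

if __name__ == "__main__":
    # Any user with a browser can reach http://localhost:5000/accounts
    # with no software installed on their machine beyond the browser itself.
    app.run(port=5000)
```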

Salesforce was a first-mover in SaaS, providing the type of Customer Relationship Management (CRM) system that previously was only available to organizations as a large, complex, “on-premises” installation. A significant advantage of SaaS is that the software is instantly available to users — no physical installation of software is required by customers. Other enterprise software vendors soon followed with SaaS offerings, and today nearly all vendors provide their products in some form of SaaS.

An “end of software” mascot surrounded by people pretending to protest traditional enterprise software.
Salesforce’s famous “end of software” campaign promoted the advantages of SaaS.

Image source: Business Insider

SaaS architecture also challenged the sales model for enterprise applications. Traditionally, enterprise software vendors sold their systems to corporate IT executives, who were usually far removed from the people who would actually use the application. As a result, vendors had little or no incentive to focus on user experience or actual user needs, an arrangement that earned many enterprise applications a deservedly terrible reputation among users. SaaS start-ups took a different approach, focusing on simplicity, ease of use, and trials offered directly to users. The goal was to build bottom-up momentum in an organization that would ultimately pressure (or otherwise convince) IT executives to adopt the product. Alternatively, these systems were sometimes purchased and operated outside corporate IT’s control and visibility, essentially creating a “shadow IT” within the organization. Today, most large organizations operate with some level of shadow IT (though it is typically discouraged).

Smartphones and bring your own device (BYOD) culture

The early to mid-2000s were a time of seismic change in enterprise computing. In addition to the rise of SaaS, smartphones provided functionality previously only available from a PC, including access to the web and email. The introduction of the iPhone in 2007 provided a landmark touch-based mobile user experience.

As smartphones gained traction with consumers, those same consumers began bringing their phones into the workplace and using them there, a trend that corporate IT departments eventually had to embrace.

Some organizations let employees use their personal phones for work, allowing them to connect their devices to corporate networks. This “bring your own device” (BYOD) approach is popular with employees and has cost and efficiency advantages for the organization, but it also poses security risks. Other organizations issued work-dedicated smartphones to their employees, keeping the devices under direct control. BYOD management systems (often called mobile device management, or MDM) provide a middle ground: employees can use their personal devices, but the organization can lock or wipe misplaced devices and manage work-related apps.

Today, mobile devices and enterprise mobile apps are ubiquitous in corporate computing.

Consumerization of IT

A significant trend emerging around 2010 was the consumerization of IT.

Historically, IT departments had an iron grip on corporate computing environments. With the twin developments of SaaS solutions marketed directly to users and employees using their personal mobile devices for work, IT no longer has the control it once had.

Mobile apps and newer SaaS solutions also raised employee expectations for the systems they had to use on the job. Younger employees in particular were often shocked and dismayed by the experience traditional enterprise applications had to offer. Vendors of enterprise systems were put on notice: they needed to appeal to the users of their systems, not just the purchasing executives.

Internet of Things (IoT)

Internet of Things (IoT) is a term used to describe “smart” network-connected devices. The term, however, is a bit of a misnomer because IoT devices can operate over any network, not just the public internet. IoT includes many familiar consumer products such as smartwatches (wearables) and climate control systems (home automation). In the enterprise, IoT devices include remote sensors and monitoring devices found in healthcare, agriculture, manufacturing, retail, and industrial settings (among others).
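
In practice, many of these devices do little more than take a reading and send it to a collection service over the network. The Python sketch below illustrates the pattern with an invented sensor and endpoint URL; real deployments typically use purpose-built protocols such as MQTT and add authentication, buffering, and device management.

```python
# Sketch of the basic IoT pattern: read a sensor, send the reading upstream.
# The sensor values and endpoint URL below are invented for illustration.
import json
import random
import time
import urllib.request

ENDPOINT = "https://example.com/iot/readings"  # hypothetical collection endpoint

def read_temperature_c() -> float:
    """Stand-in for reading real sensor hardware."""
    return round(20.0 + random.uniform(-2.0, 2.0), 2)

def send_reading(device_id: str, value: float) -> None:
    """POST one JSON-encoded reading to the collection endpoint."""
    payload = json.dumps({
        "device_id": device_id,
        "temperature_c": value,
        "timestamp": int(time.time()),
    }).encode("utf-8")
    request = urllib.request.Request(
        ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        response.read()  # discard the acknowledgement body

if __name__ == "__main__":
    send_reading("meter-001", read_temperature_c())
```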

A diagram showing how a smart meter at a home communicates over a network to deliver data both to the utility and to consumers.
Utilities’ advanced metering infrastructure (AMI) relies on IoT devices (smart meters)

Image source: City of Lenoir, NC

Artificial Intelligence (AI)

No technical development since the internet is changing enterprise computing more than artificial intelligence (AI). In simplest terms, AI uses computer science and large data sets to enable the kind of problem-solving generally associated with human intelligence.

AI capabilities are embedded in a wide range of enterprise systems. Some applications are consumer-facing (such as e-commerce recommendation engines and customer service chatbots), while others focus on internal operations. For example, AI may analyze vast volumes of log data to surface and automatically resolve impending problems in a data center.
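
As a deliberately simplified illustration of that last use case, the sketch below flags any hour whose error count sits far outside the recent norm. The counts and threshold are made up, and production AIOps tools replace this kind of statistical baseline with far more sophisticated machine-learned models.

```python
# A simple stand-in for AI-driven log analysis: flag hours whose error
# counts deviate sharply from the recent baseline (a z-score test).
from statistics import mean, stdev

def find_anomalies(error_counts: list[int], threshold: float = 2.5) -> list[int]:
    """Return the indices of hours whose error count is a statistical outlier."""
    baseline_mean = mean(error_counts)
    baseline_stdev = stdev(error_counts)
    if baseline_stdev == 0:
        return []
    return [
        hour for hour, count in enumerate(error_counts)
        if abs(count - baseline_mean) / baseline_stdev > threshold
    ]

if __name__ == "__main__":
    # Hourly error counts from a hypothetical data-center log pipeline.
    hourly_errors = [12, 9, 14, 11, 10, 13, 240, 12, 11]
    print("Anomalous hours:", find_anomalies(hourly_errors))  # -> [6]
```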

Mainframes today

The history of enterprise computing begins with the mainframe. However, lest you think mainframes are dinosaurs of the past, the reality is that many large organizations still run mainframes as part of their technical infrastructure, and modern mainframes are still being produced and sold.

A promotional photo of IBM’s z16 mainframe computer.
IBM z16 Mainframe (2023)

Image source: IBM

The redundant engineering of mainframes makes them highly reliable and secure, more so than other server options. They are especially suited to mission-critical applications where downtime is unacceptable. Banking, insurance, healthcare, government, and aviation all continue to rely on mainframes.

Mainframes, however, have a dark side: they can maintain decades of backward compatibility with older applications. With slight modifications, IBM’s z-series mainframes can still run applications originally written for System/360 (introduced in 1964). This depth of backward compatibility allows organizations to keep running applications long after they should have been retired and replaced.

Recall that in December 2022, Southwest Airlines experienced a crew scheduling meltdown, requiring the cancelation of over 16,000 flights over five days. The disaster was attributed to crew scheduling software that Southwest should have modernized years ago.

Conclusion

In this brief history of enterprise computing, I’ve tried to draw a line from the earliest methods to automate computation to today’s complex technical environment. For a product manager or designer working in the enterprise, it’s helpful to understand this history and the core ideas that undergird what we know today. It’s also essential to constantly look forward, as new developments, particularly AI, will shape enterprise systems and how users interact with them.


Heidi Adkisson

Principal UX Designer • Crafting better enterprise experiences since 1988