The History of Cloud Computing

Before the advent of cloud computing, the prevalent model was client/server computing, in which software applications, data, and controls were stored centrally on servers. Users had to connect to the server to access specific data or run programs.

Subsequently, distributed computing emerged, connecting computers in a network to share resources as required. It laid the foundation for the development of cloud computing concepts.

Invention of Cloud Computing:

While J.C.R. Licklider made significant contributions to the development of computer networking and had a vision for connecting people and data worldwide, he did not specifically invent cloud computing as we know it today. The concept of cloud computing evolved over time, and multiple individuals and organizations played a role in its development. Licklider’s work on ARPANET and his ideas about networked computing certainly influenced the evolution of cloud computing, but he cannot be solely credited with its invention.

In 1961, John McCarthy proposed the idea of selling computing as a utility, like water or electricity, during a speech at MIT. However, despite interest in the concept, the technology was not yet ready for implementation.

Over time, technology advanced, and in 1999 Salesforce pioneered the delivery of applications to users through a simple website, effectively realizing the vision of computing as a utility for enterprises over the Internet.

In 2002, Amazon introduced Amazon Web Services, offering services such as storage, computation, and human intelligence. However, it was with the launch of Elastic Compute Cloud (EC2) in 2006 that a truly accessible commercial cloud service became available to all.

In 2009, Google Apps entered the market, providing cloud computing enterprise applications. Major players like Microsoft, Oracle, and HP joined the cloud computing evolution, with Microsoft launching Windows Azure in 2009. Today, cloud computing has become mainstream.

Why Was Cloud Computing Invented?

Cloud computing was invented to address the need for efficient, shared computer resources. In 1963, the Defense Advanced Research Projects Agency (DARPA, then known simply as ARPA) funded a project focused on developing technology that would enable multiple users to access and use a computer simultaneously. The project relied on large computers and magnetic tape for memory, and it can be seen as a precursor to cloud computing: it acted as a centralized system that allowed up to three people to connect and share resources.

J.C.R. Licklider expanded upon this concept through his work on ARPANET, an early precursor of the internet that came online in 1969. Licklider had envisioned an interconnected global network, which he called the Intergalactic Computer Network, where individuals from around the world could access information and connect through computers. Although the term "cloud computing" had not yet been coined, Licklider's vision laid the groundwork for the interconnected, accessible computing infrastructure we now associate with the cloud.

The term “virtualization” emerged in the 1970s, describing the creation of virtual machines that mimic fully functional computer systems. The increased use of virtual computers in the 1990s, along with businesses offering virtual services, played a significant role in the development of the cloud computing infrastructure.

Overall, cloud computing was developed to facilitate efficient resource sharing, global connectivity and the scalability of computer systems to meet the demands of an interconnected world.
