The Era of Cloud Computing: An Interactive Timeline
The term “cloud computing” may be commonly uttered in businesses all over the world, but its complex history is less well known. Our interactive timeline explores the history of cloud computing (and its future!).
Considering the technology has only achieved mainstream adoption in the last decade or so, it may come as a surprise that the origins of the cloud stretch back much further. Far from being a recent gimmick, cloud computing can trace its roots to the 1940s and 50s, developing alongside the personal computer and the World Wide Web.
If you have ever accessed a file or application that was not stored locally on your smartphone, tablet or PC, then you’ve benefited from the cloud. But while this technology is ubiquitous today, it is worth remembering that computers have not always been widely available. It may sound obvious, but for cloud computing to achieve popularity, computers themselves first had to enter the mainstream.
The birth of modern computing
It wasn’t until 1948 that the first computer capable of storing a program in electronic memory was built by researchers at the University of Manchester. The “Manchester Baby,” as it was dubbed, was cumbersome at 17 feet long and limited in its functionality, but it played a pivotal role in the history of computers and software.
Moving into the 1950s, the concept of “time sharing” bears more than a passing resemblance to cloud computing, despite the latter term not being coined for another 40 years. In the early days of computing, machines were large and expensive, so any downtime was a huge waste of resources. Time sharing, a concept developed by academics and researchers, allocated each user a slice of a machine’s processing time to ensure computers were used more efficiently. That same idea of sharing resources lies at the heart of cloud computing today.
Hardware advances meant that computers became smaller and more affordable over time: the Apple II, launched in 1977, went on to sell between five and six million units over its lifetime. The launch of the first Windows operating system in 1985 also helped demonstrate the many benefits computers could offer both consumers and businesses, but for cloud computing to take off, network technology had to achieve a similar level of progress.
Networks go global
This began in the early 1960s with the development of “packet switching,” which allowed more than two people to use the same network at any given time. This in turn led to the creation of the ARPANET in 1969, which connected multiple computers together and became the first large-scale packet-switching network.
When this was followed by the work of Tim Berners-Lee in 1991, the World Wide Web was born and the foundations for cloud computing were in place. The bursting of the dot-com bubble and the subsequent growth of digital businesses helped accelerate cloud adoption until it became the multi-billion dollar industry of today.
The future of cloud computing
The likes of Amazon, Google, Microsoft and Netflix all rely on cloud computing to deliver their services, and with many of its benefits still untapped, the future remains bright for cloud computing.
To find out more about the era of cloud computing, head over to our Interactive Timeline.