When we think of the cloud, we usually refer only to the modern concept. After all, it's only over the past decade or so that cloud computing developed into the giant, irreplaceable and all-powerful phenomenon we talk about today. However, it's always interesting to recall where and how it all started.
The concepts behind the cloud, however, have existed for many years, and can in fact be traced back as far as the 1950s and mainframe computing. In those early days, mainframe computers were huge machines, far too expensive to buy and maintain for every single employee. Nor did every employee need access to one at all times, as they do today. So most organizations purchased just one or two machines and implemented "time-sharing" schedules, which allowed multiple users to access the central mainframe from connected stations. These stations were known as "dumb terminals" because they provided no processing power of their own. Even so, this type of shared computational power is the basic, underlying prerequisite of cloud computing.
In the mid-1960s, a major advance came when American computer scientist J.C.R. Licklider conceptualized an interconnected system of computers. In 1969, "Lick", as he is often known, helped develop a very primitive version of the Internet, the Advanced Research Projects Agency Network (ARPANET). ARPANET was the first network that allowed digital resources to be shared among computers that were not in the same physical location. Lick's vision was of a world where everyone would be interconnected by means of computers and able to access information from anywhere. Sound familiar? Of course it does – it's the Internet as we know it, and a necessity for accessing all the benefits the cloud provides.

Over the decades that followed, many further developments in cloud technology came into existence. In 1972, for example, IBM released an operating system (OS) simply called VM, for Virtual Machine. Virtualization is the technique of creating a virtual computer that acts just like a real one, with a fully operational OS of its own. The concept evolved with the Internet, and businesses began offering "virtual" private networks as a rentable service, eventually leading to the development of modern cloud computing infrastructure in the 1990s.
Also in the 1990s, telecommunications companies began offering virtualized private networks, which provided the same quality of service as their dedicated point-to-point data connections at a reduced cost. Instead of building out physical infrastructure so that more users could have their own connections, telecommunications companies could now give users shared access to the same physical infrastructure.
In the early 2000s, Amazon Web Services (AWS) emerged, and in 2006 Amazon launched Elastic Compute Cloud (EC2), allowing companies and individuals to rent virtual computers on which they could run their own programs and applications. In the same year, Google launched its Google Docs service, allowing users to save, edit and transfer documents in the cloud.
In 2007, IBM, Google, and several universities joined forces to develop a server farm for research projects. It was also the year that Netflix launched its video streaming service, using the cloud to stream movies and other video content into the homes and onto the computers of thousands (and eventually millions) of subscribers worldwide. The crucial question now, though, is what the future of cloud computing holds. This post lists a few trends you can expect to see from cloud-based computing in the near future.
Security and Compliance Will Be Critical
As more and more businesses migrate their services and functions to the cloud, security and regulatory compliance will become an increasing concern. Hackers go where the data goes, and as more data is pushed to the cloud, so too will hackers follow, trying to breach systems and steal valuable databases to sell.
In May 2018, the EU's General Data Protection Regulation (GDPR) came into effect, which of course has implications for all global enterprises. Cloud computing compliance under the GDPR is not easy, and many organizations are not prepared: a recent survey from Commvault revealed that only 12% of global IT organizations understand how GDPR will affect their cloud services. And GDPR is likely only the beginning – as governments around the globe start recognizing the risks, cloud computing will no doubt become highly regulated.
The Internet of Things Will Become More Prevalent

Next year, you can expect the "Internet of Things" (IoT) to become more prevalent: more of your home devices are bound to be connected to the internet. For instance, your fridge or dishwasher might have the capability to connect online, meaning you'll be able to check how full your fridge is while you're shopping, or get a cup of coffee brewing before you get home. Companies will be able to upload data from these devices to a cloud-based server and analyze it using machine learning, then use that information to deliver better products to their customers.
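As a minimal sketch of that device-to-cloud pipeline (the device name, telemetry fields, and restock threshold here are illustrative assumptions, not a real vendor API), a connected fridge might serialize its readings as JSON, and a cloud-side service might decode the stream and run a simple analysis on it:

```python
import json
from statistics import mean

# Hypothetical telemetry a connected fridge might report.
# Field names and values are assumptions for illustration only.
readings = [
    {"device_id": "fridge-01", "temp_c": 4.1, "items": 12},
    {"device_id": "fridge-01", "temp_c": 4.4, "items": 11},
    {"device_id": "fridge-01", "temp_c": 3.9, "items": 9},
]

# Device side: serialize each reading as JSON, as it would be
# sent over a transport such as HTTP or MQTT.
payloads = [json.dumps(r) for r in readings]

# Cloud side: deserialize the stream and run a simple aggregate
# analysis (a stand-in for the machine-learning step described above).
decoded = [json.loads(p) for p in payloads]
avg_temp = mean(r["temp_c"] for r in decoded)
latest_items = decoded[-1]["items"]

print(f"average temperature: {avg_temp:.1f} C")
print(f"items remaining: {latest_items}")
if latest_items < 10:
    print("restock suggestion: running low")
```

In a real deployment the serialization and analysis would of course live on different machines, but the shape of the flow (device serializes, cloud aggregates, insight flows back to the customer) is the same.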
Bigger Storage Capacity
As more and more companies adopt cloud-based technology, demand keeps increasing. As a result, companies will run more data centers, so you can expect to enjoy more storage space in the future. Furthermore, our capacity to store data is only going to improve: in the not-too-distant past, a hard drive that could store a few gigabytes of data was impressive, while now you can get drives that hold multiple terabytes for less than $100. Larger storage capacity means big data will become more widespread across various industries.
Open Source Is the Future of Cloud Computing

The cloud computing industry is moving down a path of innovation and cooperation, so many organizations are looking at adopting an open-source cloud computing service for their business. An open-source cloud is a service built with software or technology that anyone can customize; put simply, it lets businesses tailor the infrastructure to their specific needs. With an open-source cloud computing platform, businesses see numerous benefits: they can quickly scale their cloud infrastructure, adding features is much simpler than with a closed-source platform, and there are fewer security concerns. The tech industry is embracing a collaborative work environment, and opting for an open-source cloud computing service seems to be the right direction for new businesses or ones that are scaling. This is why many experts claim that open source is the future of cloud computing.
It’s safe to say that the next decade of cloud computing will be just as eventful as the last. CIOs, CTOs and the organizations they work for will face increasing challenges not only to remain competitive in this ever-changing cloud computing environment, but also to stay on the right side of both existing regulations and new ones as they emerge.