This article is the first in a series devoted to the cloud technologies market and the role of fog computing for its participants.

Today, the opportunities presented by cloud services and data storage are transforming business processes, just as the introduction of computers once did. But one challenge remains unresolved for business: how to scale cloud technology without costs scaling just as fast.

As IT workloads grow, cloud services become more expensive and consume a substantial share of business revenue: the more computing power a company needs, the more it pays to rent cloud infrastructure.

The high cost of cloud providers’ services reflects the vast sums they have spent building data centers and network infrastructure and purchasing server equipment.

Modern cloud technologies have two key aspects: technical and economic. The technical aspect is that whereas all computing was once done locally on machines within a corporate network, today this work can be “outsourced” to data centers. Where users once installed Microsoft Office on every computer, they now also have access to cloud services such as Google Docs and Office 365. These services run remotely and do not depend on the local machine: all operations are carried out in the browser or on remote servers.

From an economic point of view, the fundamental transformation is a shift, in practically every sphere, from an ownership model to a temporary-use model. The most obvious example today is car-sharing, but the same process is underway in IT: where products were once sold for a one-time payment, the model is now based on a monthly subscription. In other words, CAPEX – one-time capital expenditure on the purchase of software – is being squeezed out by OPEX, monthly operational costs.
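The trade-off between the two models can be sketched with a toy calculation. All figures below are hypothetical, chosen only to illustrate the break-even logic, not drawn from any real price list:

```python
# Illustrative CAPEX vs. OPEX comparison (all figures are hypothetical).
# CAPEX model: one-time purchase of a software license or server.
# OPEX model: a monthly subscription for the equivalent capability.

def breakeven_month(capex: float, monthly_opex: float) -> int:
    """Return the first month in which cumulative subscription
    spending exceeds the one-time purchase price."""
    month = 0
    spent = 0.0
    while spent <= capex:
        month += 1
        spent += monthly_opex
    return month

# Example: a $12,000 one-time license vs. a $400/month subscription.
print(breakeven_month(12_000, 400))  # → 31 (about two and a half years)
```

The point of the exercise is that the subscription only becomes more expensive in total after the break-even month, and in exchange the business keeps the option to scale up, scale down or cancel at any time.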

The model for use of cloud services

In B2B, this is reflected in the relationship between the supplier of cloud solutions and the business. With traditional on-premise solutions, the company buys servers and installs them in its office or in a colocation data center. The business incurs a large one-time expenditure for a fixed volume of resources, part of which will sit unused. If a company plans to increase its server workload tenfold within a year, it buys all the necessary equipment immediately, and for some time that capacity stands idle. If the company instead connects to a cloud provider, it can change its tariff plan at any time, scaling up or down as needed. Pay-as-you-go use of servers thus gives business more flexibility than outright ownership of computing equipment.

More importantly, the cloud provider can use its hardware for much longer: the lifespan of this equipment can be around 10 years, because it is reassigned to different tasks over time, which extends its service life. The lifespan of hardware in a typical office, by contrast, is three to five years, after which it is outdated and no longer relevant.

There are three basic categories for the use of cloud technologies: IaaS, PaaS and SaaS.

IaaS (Infrastructure as a Service): These are services at the infrastructure level, essentially an alternative to infrastructure of one’s own. In this case the customer rents a cloud server from the provider.

PaaS (Platform as a Service): This is a server that comes with minimal preconfigured software – databases, for example, or services for a particular type of processing. In this case the user does not need to worry about scaling or backups: they receive pre-configured solutions.

SaaS (Software as a Service): This is a full-fledged service, the best example of which is Google Docs.

Suppliers of cloud services

More than half the market is made up of three companies: Amazon Web Services, Microsoft Azure and Google Cloud Platform. On a global level, they have no serious competitors on the market for cloud servers and data storage.

Competitors are appearing primarily in regional markets. The data centers of the global cloud platforms do not cover the whole world: if you need a fast connection in China, for instance, you have to choose from regional cloud suppliers. The need for regional solutions can also arise from laws on the storage of personal data, such as the GDPR in the European Union, which came into force on May 25, 2018.

There are also companies on the market that provide businesses with dedicated physical servers, such as SoftLayer, OVH, LeaseWeb and Hetzner. But strange as it may seem, the consolidation we are seeing in the cloud market is not taking place in the bare-metal market, probably because it is impossible to offer an innovative layer on top that would give such a service serious advantages.

We can pick out one more category of cloud suppliers: companies that provide inexpensive virtual machines, such as Digital Ocean and Vultr. These companies offer virtual machines for $3 to $5 per month, but this is a very small market with low margins. Digital Ocean runs 1.5 to 2 million simultaneous virtual machines at an average price of up to $10 per month each. By the standards of the cloud market, whose total volume grew by 29 percent to $117 billion in 2017, this is very little. Of that $117 billion, around $25 billion is IaaS, around $17 billion is PaaS, and all the rest is SaaS. Some 60 percent of the market is in the U.S., although in 2016 this figure was slightly higher, at 62 percent – which means the cloud market is now beginning to develop actively in other countries, too.
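A rough back-of-envelope calculation, using only the VM counts and prices quoted above, shows why this segment counts as very small next to the overall market:

```python
# Back-of-envelope estimate using only the article's own figures:
# 1.5-2 million running VMs at up to $10/month vs. a $117B cloud market.

vms_low, vms_high = 1_500_000, 2_000_000
avg_monthly_price = 10          # upper bound from the text, USD per VM
market_2017 = 117e9             # total cloud market in 2017, USD

annual_low = vms_low * avg_monthly_price * 12
annual_high = vms_high * avg_monthly_price * 12

print(f"${annual_low / 1e9:.2f}B - ${annual_high / 1e9:.2f}B per year")
print(f"at most {annual_high / market_2017:.2%} of the market")
```

Even at the upper bound, this is on the order of a quarter of a billion dollars a year, a fraction of a percent of total cloud spending.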

The world is still only halfway through, if not at the very beginning of, the switch from a CAPEX to an OPEX economic model, at least in terms of moving computing infrastructure from private computers and in-house company networks to cloud servers. While in the U.S. this transition has already taken place in 60 to 70 percent of companies, in most European countries the figure stands at 10 to 20 percent. In developing countries such as China, only the most technologically advanced telecom companies have started the process.

This all demonstrates that a very high percentage of the existing IT load could still be transferred to the cloud. Furthermore, it is worth noting that IT is the only sphere of the economy showing stable year-on-year growth. In April 2018, Gartner analysts published research on the global cloud resources market. In 2017 the sphere grew by more than 30 percent, driven by IaaS infrastructure services and software provided as SaaS: total consumer and corporate spending on public cloud services reached $153.5 billion, up from $118 billion in 2016. Within that, sales of IaaS solutions grew from $25.3 billion to $30 billion, while the SaaS and PaaS segments grew from $38.6 billion and $7.2 billion to $60.2 billion and $11.9 billion, respectively.
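A quick calculation over the year-on-year pairs quoted above shows the growth rates they imply (a minimal sketch; the figures are taken directly from the text):

```python
# 2016 -> 2017 revenue pairs (USD billions), as quoted from Gartner.
segments = {
    "total": (118.0, 153.5),
    "IaaS": (25.3, 30.0),
    "SaaS": (38.6, 60.2),
    "PaaS": (7.2, 11.9),
}

for name, (y2016, y2017) in segments.items():
    growth = (y2017 - y2016) / y2016
    print(f"{name}: {growth:.1%} growth")  # total comes out at ~30.1%
```

The total works out to roughly 30 percent, matching the “more than 30 percent” headline figure, with SaaS and PaaS growing considerably faster than IaaS.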

According to our reckoning, Amazon has no fewer than 1 million servers. There are also a large number of bare-metal providers, each with 150,000 to 300,000 servers. But this quantity of hardware cannot be compared with the number of personal computers in the hands of ordinary users, or with the amount of hardware actively used for mining. And here we come to the question of how efficiently all of it is used.

The SONM platform offers the spare capacity of hardware that is already running – PCs, servers and mining farms – for fog computing. It was created to make effective use of this vast infrastructure: it is a marketplace where users can “sell” the computing resources of their devices. Under this approach, consumers do not spend money on new equipment and data centers but use the aggregated computing power of PC and server owners to solve their computing tasks. The fog computing model will thus become an alternative to the servers of Amazon and Google, and at a local level will solve problems raised by legislation on data storage.

Oleg Lyubimov is chief operations officer at SONM, a member of the OpenFog Consortium.
