Edge computing on the brink of a new decade – expectations, forecasts and hopes
Keeping up with current trends in the IT industry, particularly at the enterprise level, means a daily struggle with sophisticated buzzwords. An important skill is separating the wheat from the chaff and identifying, within even the broadest and catchiest term, what actually matters for a particular business. After the cloud-first period, edge computing is being mentioned more and more often. It is worth knowing what the term actually means and what the prospects for this approach are on the brink of the next decade.
What is edge computing today?
Looking at the trends of the last five decades, one may hypothesise that the client-server relationship follows a sine wave. To simplify, we can divide that history into a number of phases, the first being the era of central mainframes. Later came the ability to connect remotely to centralised infrastructure through thin clients and various access terminals. Only then did the order emerge that for many years seemed obvious and the only one possible, with computers available directly to individual users – the PC era.
Here, however, the paths began to branch out (at least seemingly) – while consumer electronics pursued miniaturisation and mobility in the form of smartphones, enterprise infrastructure once again began to centralise, taking the form of a public cloud provided by just a few corporations. Yet if we think about it carefully, both today's PCs and smartphones act as thin clients – resource-limited consumer devices that would be of little use without access to Google, Microsoft or Amazon services delivered via the cloud.
Edge computing continues this sine wave, now with decreasing amplitude. In the edge computing era the boundaries between external and local infrastructure become blurred and the distance between server and client shrinks (also literally, physically), while the centralisation characteristic of the cloud era is largely retained. Computation performed on locally available resources once again plays a larger role, which matters most wherever there is no room for latency.
[Figure: a damped sine wave, illustrating the oscillation between centralised and local computing]
This is reflected in the hardware offerings of the biggest companies. Google sells Coral single-board computers on which machine learning can run almost entirely locally rather than in the Mountain View cloud. Apple employs dedicated on-device AI chips and processes biometric data and other authentication models solely on the iPhone. Nor can one omit Amazon, which has worked on its own silicon for Echo speakers to let the Alexa assistant act locally. The flagship examples of edge computing, however, are autonomous and semi-autonomous driving systems, which use local hardware (such as the Full Self-Driving Computer produced for Tesla by Samsung or the Drive AGX Orin board developed by Nvidia) to process enormous amounts of environmental data continuously and must react to it immediately.
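To make "almost entirely locally" concrete, here is a minimal sketch of on-device inference on a Coral board using the tflite_runtime library and its Edge TPU delegate. It assumes the Edge TPU runtime is installed; the model file name is a placeholder, and the dummy input merely stands in for a real camera frame.

```python
# A minimal sketch of local ("edge") inference on a Coral board.
# Assumptions: Edge TPU runtime installed, a compiled *_edgetpu.tflite
# model available; the file name below is a placeholder.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="model_edgetpu.tflite",                          # placeholder model
    experimental_delegates=[load_delegate("libedgetpu.so.1")],  # run on the Edge TPU
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Dummy input in place of a real camera frame, just to show the call flow.
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()                        # inference happens on the device itself
scores = interpreter.get_tensor(out["index"])
print("top class:", int(np.argmax(scores)))
```

The point of the sketch is the call flow: no request ever leaves the device, so the inference latency is bounded by the local accelerator rather than by the network.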
Under such conditions, relying on external cloud infrastructure is impossible. The latency of a connection to a data centre even a few dozen kilometres away could decide someone's life or death. Another advantage of edge computing that cannot be overlooked is increased security. This approach should not be treated as opposition to the cloud, but rather as computing power delivered closer to the location where it is needed. On the other hand, service providers who make substantial money on the cloud are not eager to give up their position and must balance serving the need for local infrastructure against continually growing their portfolios.
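To put the latency argument in perspective, here is a back-of-the-envelope sketch; the round-trip times and vehicle speed are illustrative assumptions, not measurements.

```python
# How far does a vehicle travel while waiting for a decision?
# All numbers below are illustrative assumptions.
speed_kmh = 100                       # assumed vehicle speed
speed_ms = speed_kmh * 1000 / 3600    # ~27.8 m/s

scenarios = {
    "on-board edge computer": 0.005,    # ~5 ms assumed local reaction budget
    "nearby edge data centre": 0.030,   # ~30 ms assumed network round trip
    "distant cloud region": 0.150,      # ~150 ms assumed network round trip
}

for name, latency_s in scenarios.items():
    distance_m = speed_ms * latency_s
    print(f"{name:>24}: {latency_s * 1000:5.0f} ms -> {distance_m:5.2f} m travelled")
```

Even under these rough assumptions, a cloud round trip can cost several metres of travel before the vehicle reacts, which is exactly the margin edge computing is meant to eliminate.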
The most striking example is Azure Sphere, an offering still in development that combines hardware (a custom microcontroller), local software (a dedicated Linux distribution) and an external cloud service provided by Microsoft. The earlier comparison to a damped sine wave is therefore no exaggeration – this is not a return to a situation where the entire platform sits on the client side, but a scenario in which the server is no longer something external: it moves into homes and businesses while never losing its connection to the vendor's infrastructure.
What will the edge computing of tomorrow look like?
Actual implementations are obviously still at an early stage, so the main source of information about the prospects of edge computing is analyst research. The trend was noted, among others, by the authors of the Forrester Analytics Global Business Technographics Mobility Survey, according to which the hybrid market referred to as edge cloud services will grow by 50% this year compared with 2019. Interestingly, the Forrester analysts assume that customers will most often assemble these services from products of various vendors, which may somewhat diversify the current market, tightly concentrated around just a few cloud service providers.