Computing Architecture—Edge or Cloud?

A brief look at the fast-growing, ever-evolving field of edge computing, which has the power to revolutionise computational capabilities around the globe.

What is Edge Computing?

Edge computing essentially deploys data and applications at the network ‘edge’, closer to end users. It is a distributed information technology architecture in which client data is processed at the network’s periphery, as close to the originating source as possible. As a result, computation and data storage sit closer to the devices where the data is gathered, rather than in a central location that can be thousands of miles away. This cuts latency, making applications far more responsive.

We know that data is the lifeline of modern business of every sort. It provides valuable insights and supports critical processes and operations. Today’s businesses are buried in an ocean of data that routinely needs to be accessed, and they depend heavily on IoT devices and sensors that operate from remote locations through a single server, leaving the system open to vulnerabilities.

Edge Computing and Cloud Computing

Edge computing is based on a distributed paradigm, while cloud computing is built on a centralised data centre. In simpler terms, edge computing moves some portion of storage and compute resources out of the central data centre and closer to the source of the data itself. Rather than transmitting raw data to a central facility for processing and analysis, the work is performed where the data is generated, whether that is a retail store, a factory floor, a sprawling utility or a smart city.
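
To make the contrast concrete, here is a minimal Python sketch, with entirely hypothetical function names, of the same toy workload placed in the cloud versus at the edge. The analysis is identical; only where it runs, and therefore what crosses the network, changes.

from statistics import mean

def analyse(readings):
    """Toy workload: reduce raw sensor readings to one summary value."""
    return mean(readings)

def cloud_pipeline(readings):
    # Cloud model: ship every raw reading over the WAN, process centrally.
    payload = readings                  # the full raw data crosses the network
    return analyse(payload)

def edge_pipeline(readings):
    # Edge model: process beside the source, transmit only the small result.
    summary = analyse(readings)         # computed on the local device
    return summary                      # a single number crosses the network

samples = [21.4, 21.9, 22.1, 21.7]
print(cloud_pipeline(samples), edge_pipeline(samples))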

Statistically, cloud computing has been a hit with big and small companies alike. More than 28 per cent of an organisation’s total IT budget is now set aside for cloud computing, and 70 per cent of organisations have at least one application in the cloud. Studies show a similar pattern of reliance on the cloud as the primary focus across companies. However, some experts now describe cloud computing’s growth as uncertain and slowing, and believe the cloud has reached the end of its run at the top.

According to a Massachusetts Institute of Technology study, the global edge computing market will grow at a compound annual rate of more than 37 per cent between 2020 and 2027. Edge computing will take its place in a spectrum of computing and communications technologies as another place where system architects can put workloads: neither on-premises nor in the cloud, but at the edge.
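
As a rough illustration of what that growth rate implies, assuming the figure compounds over the seven years from 2020 to 2027, the market would grow roughly ninefold:

# Compound growth arithmetic: (1 + rate) raised to the number of years.
cagr, years = 0.37, 7
multiple = (1 + cagr) ** years
print(f"Market multiple over {years} years: about {multiple:.1f}x")  # ~9.1x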

Is the shift worth it?

Let’s visualise an example. A Siemens trade show presentation gives a practical application of edge computing for predictive maintenance on the shop floor: a robot responsible for catching electronic components as they fall off the assembly line and placing them back in packaging for shipment. If the robot breaks down, the products fall on the floor and the line has to be stopped for emergency maintenance.

A device with more edge computing power than the one in use can gather sensory data from the robots. This data is used to predict future malfunctions, such as the sudden failure of a robot’s suction gripper, so maintenance can be scheduled well ahead of time to prevent expensive damage. The app on the edge device feeds data into even more powerful machine learning systems in the cloud, which improve the predictive algorithms and push updates back to the edge device.
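
Below is a hedged Python sketch of what the edge side of such a setup might look like. The sensor, window size and threshold are illustrative assumptions, not details of Siemens’ actual system.

from collections import deque
from statistics import mean

WINDOW = 50            # recent suction-pressure samples kept on the device
DROP_THRESHOLD = 0.85  # flag if pressure falls below 85% of the running mean

recent = deque(maxlen=WINDOW)

def schedule_maintenance(pressure):
    # In a real deployment this would raise a work order and forward the
    # anomaly to the cloud ML system that refines the predictive model.
    print(f"Gripper pressure anomaly ({pressure:.2f}): schedule maintenance")

def on_sensor_reading(pressure):
    """Called for each suction-gripper pressure sample on the edge device."""
    if len(recent) == WINDOW and pressure < DROP_THRESHOLD * mean(recent):
        schedule_maintenance(pressure)  # act locally, before the failure
    recent.append(pressure)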

Edge computing was developed as an effective way of dealing with the enormous volumes of data created by the growth of IoT devices, which connect to the internet to receive data from and deliver data to the cloud. It helps where connectivity is unreliable or bandwidth is restricted, as on ships at sea, remote farms or in deserts. Because the computing work is done on-site, edge computing reduces the amount of data that has to be transmitted, lowering bandwidth requirements.
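
As a minimal sketch of that on-site reduction, assume a hypothetical device that samples a sensor once per second but only needs to upload a per-minute summary; sixty raw readings collapse into three numbers before anything leaves the device:

from statistics import mean

def summarise(minute_of_samples):
    """Collapse 60 raw readings into one small record for the uplink."""
    return {
        "min": min(minute_of_samples),
        "max": max(minute_of_samples),
        "mean": round(mean(minute_of_samples), 2),
    }

raw = [20.0 + i * 0.01 for i in range(60)]  # one minute of raw samples
print(summarise(raw))                       # only this summary is uploaded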

Because less data is transferred, edge computing also helps keep data within the bounds of prevailing data sovereignty laws. Any data going over the network back to the cloud or data centre can be secured through encryption, and the edge deployment itself can be hardened against hackers and other malicious activity, even where security on the IoT devices themselves remains limited.
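
For the in-transit encryption, a minimal sketch using the third-party Python cryptography package might look like the following. Key management is deliberately elided; in practice the key would be provisioned to the device securely, not generated on each run.

from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()         # provisioned out-of-band in practice
cipher = Fernet(key)

payload = b'{"device": "edge-01", "mean_temp": 21.8}'
token = cipher.encrypt(payload)     # only ciphertext crosses the network

# The cloud side, holding the same key, recovers the original payload.
assert cipher.decrypt(token) == payload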

Overall, edge computing can lower dependence on the cloud and, as a result, improve data processing speed. Many modern IoT devices already have processing power and storage available, and the move to edge processing makes it possible to use these devices to their fullest potential.

What is the future?

There has undoubtedly been a shift towards edge computing in recent times, but it isn’t the only solution. Cloud computing remains a viable option for specific challenges faced by IT organisations and vendors, and in some instances the two are used in tandem for a more comprehensive solution. Pushing everything to the edge is not a wise decision either, which is why public cloud providers have started combining IoT strategies and other technology stacks with edge computing.

Functions are best handled by splitting the work. Computing between the end device and local network resources can be done at the edge, while big data applications, which aggregate data from everywhere and need to be run through analytics and machine learning algorithms, can stay in the cloud. It isn’t an either-or debate, nor are the different kinds of computing really competitors; it depends on the organisation’s needs and capacity. To make this hybrid solution work, one needs to understand the workload and weigh it against the costs to find an optimal, feasible split.
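
One hedged way to express that split is a simple placement rule; the job fields and the two destinations below are illustrative assumptions, not a real framework’s API.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    latency_sensitive: bool  # must respond near the device?
    needs_fleet_data: bool   # requires data aggregated from many sites?

def place(job):
    if job.latency_sensitive and not job.needs_fleet_data:
        return "edge"   # run between the end device and local resources
    return "cloud"      # aggregate analytics and model training stay central

print(place(Job("gripper-anomaly-check", True, False)))         # edge
print(place(Job("fleet-failure-model-training", False, True)))  # cloud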

Written By: Garima Kejriwal