Why is Everyone Talking about Edge Computing?

Edge computing has been a hot topic for many IT companies over the last year. Technology vendors and cloud providers are releasing products related to edge computing, and it comes up in keynotes at major conferences and in the press. The purpose of this post is to describe what edge computing is and why it is getting so much interest.

There is no strict definition of what constitutes edge computing, and I've seen a few different explanations. Broadly, it refers to computing that, for one reason or another, happens outside cloud providers or large data centers. The term could therefore apply to small data centers or to tiny, rugged servers located out in the wild (in cupboards, wind turbines, cars, trains, etc.). Although the term is new, the concept isn't: running servers outside the cloud or other large centralized data centers is pretty much the norm for many enterprises with remote or branch offices. The question is why everyone is excited about it now. People never used to be particularly enthusiastic about ROBO (remote office/branch office) infrastructure, after all.

The main reason edge computing is becoming important is the rise of the Internet of Things (IoT), along with other new technologies such as machine learning, augmented reality, and autonomous vehicles (wow, that's a lot of buzzwords). As IoT seems to be the most important factor, let's cover that first.

IoT refers to objects connected to the internet and communicating with other objects. These objects could be almost anything: a machine in a factory, a till in a shop, a security camera, and so on. The idea is that a huge number of objects will be connected to the internet in the future, and that most businesses (no matter the vertical) will use, or even have to use, IoT to gain a competitive advantage.

Businesses would use IoT to collect and analyze data, and this is where machine learning comes in. At a very high level, machine learning works by finding patterns in large amounts of data, usually by training neural networks. You need a lot of data and fast processors to get useful results, and the more data you have, the better. It would not be practical to send all of this data to a central cloud, and analyzing it close to where it is collected requires edge computing.
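To make the "analyze it close to where it is collected" point concrete, here is a minimal Python sketch (my own illustration, not from any particular product) of an edge node that aggregates raw sensor readings locally and only ships a small summary upstream. The names SensorReading, summarize, and send_to_cloud are hypothetical.

```python
# A minimal sketch of the idea, not a reference implementation: instead of
# shipping every raw sensor reading to a central cloud, an edge node keeps
# the raw data local and only forwards a compact summary. All names here
# (SensorReading, summarize, send_to_cloud) are hypothetical.
from dataclasses import dataclass
from statistics import mean
from typing import List

@dataclass
class SensorReading:
    sensor_id: str
    value: float  # e.g. temperature in degrees Celsius

def summarize(readings: List[SensorReading]) -> dict:
    """Aggregate a batch of raw readings into a small summary record."""
    values = [r.value for r in readings]
    return {
        "count": len(values),
        "mean": mean(values),
        "min": min(values),
        "max": max(values),
    }

def send_to_cloud(summary: dict) -> None:
    # Placeholder for an upload over a constrained or unreliable link.
    print("uploading summary:", summary)

if __name__ == "__main__":
    batch = [SensorReading("t-001", v) for v in (21.3, 21.6, 22.0, 21.8)]
    send_to_cloud(summarize(batch))  # a few bytes instead of the full batch
```

The design point is simply that the bandwidth-heavy and latency-sensitive work happens next to the sensors, while the cloud only sees condensed results.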

Although IoT and machine learning are often cited as the main reasons why edge computing will become important, there are other, related reasons, such as:

  • Critical decisions: If you have an application that controls something critical (medical equipment, industrial machinery, etc.), you can't risk hosting it in a central data center or cloud and relying on an internet connection for the system to function. The only solution is to run the application at the edge.

  • Network latency: A lot of applications that control or affect things in the physical world (think autonomous vehicles, augmented reality, etc.) require very low latency, and there wouldn't be enough time to wait for a response from a cloud or central data center. We can't break the laws of physics, so the only solution is to use edge computing (see the back-of-the-envelope calculation after this list).

  • Data protection/privacy: Some data can't leave specific regions for compliance or data protection reasons. It would, therefore, have to be stored and processed in a particular location.
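
As promised above, here is a rough back-of-the-envelope calculation of the latency point. The numbers are my own illustrative assumptions, not from the article: light in optical fibre travels at roughly two thirds of its vacuum speed, so distance alone puts a floor on round-trip time before any routing, queuing, or processing delay is added.

```python
# A back-of-the-envelope illustration of the latency argument. Assumptions:
# signals in optical fibre propagate at roughly two thirds of the speed of
# light, and we ignore routing, queuing, and processing delays entirely,
# so these figures are lower bounds.
SPEED_OF_LIGHT_KM_S = 300_000   # vacuum, approximate
FIBRE_FACTOR = 2 / 3            # typical refractive-index penalty

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time over fibre for a given one-way distance."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR)
    return 2 * one_way_s * 1000  # there and back, in milliseconds

if __name__ == "__main__":
    for km in (10, 500, 2000, 8000):  # nearby edge site vs. regional vs. distant cloud
        print(f"{km:>5} km -> at least {min_round_trip_ms(km):5.1f} ms round trip")
```

Even under these generous assumptions, a data center 2,000 km away costs at least ~20 ms per round trip, which is already too slow for many control loops; a nearby edge site keeps the physics-imposed floor well under a millisecond.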

Due to the growth of the technologies mentioned above, the expectation is that a lot (possibly most) of future computing capacity growth will happen at the edge. Note that this doesn't mean cloud computing won't grow; it undoubtedly will. The point is that worldwide computing capacity is predicted to increase enormously, and if a significant portion (or most) of that growth lands at the edge, edge computing is set to become an enormous market. Hence the massive interest from IT businesses.

Another interesting effect of edge computing is that we appear to be moving away from a centralized computing model again. Enterprise IT went from centralized in the mainframe days of the 1960s and 70s to distributed client-server computing from the 1980s through the 2000s. A lot of people probably thought the move to centralized IT in the cloud era was final, but it now looks like we are heading back towards a decentralized model, driven by technologies such as IoT and machine learning. Let's see what happens, but it is a pretty exciting development.

