
The Internet of Things demands Edge and Fog Computing models

The rise of connected devices is now driving demand for new edge architectures. The way we currently process data no longer works, owing to the volume, velocity, and variety of data continuously generated by sensors and connected devices. How these three attributes of data play out varies from industry to industry.

Let's look at some of the numbers from Gartner.
In 2014, Gartner predicted that 26 billion devices would be connected to the internet by 2020. That is a staggering number; even if we discount it by 30 percent because the prediction is a few years old, the total would still be around 19 to 20 billion connected devices. In February 2017, Gartner predicted there would be 8.5 billion devices connected to the internet by the end of 2017.


The edge computing paradigm:

Due to the large number of connected devices and their intercommunication requirements, a truly decentralized and distributed architecture is on the rise. These architectures need to process data locally to make quick, real-time decisions. This requirement has given birth to the edge computing paradigm. Moving computing power close to edge devices opens up many possibilities and brings many advantages. For example, since data from the sensors is processed at the edge, there is no round-trip latency, as there would be in the current cloud computing model. It is not just IoT but also artificial intelligence and machine learning that are driving demand for edge computing architectures. Machine learning models can be trained in the cloud and deployed at the edge for real-time inference. The primary objective of bringing the processing capabilities of the cloud to edge devices is to reduce latency in decision making.

Quick decision making can literally be life-saving in use cases such as driverless cars. Edge computing could also solve many of the issues facing infrastructure and robotics.
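The latency argument above can be sketched in a few lines of code. This is a minimal, illustrative simulation: the latency figures, the `decide` threshold, and the function names are all assumptions made up for this sketch, not measurements or a real API.

```python
import time

# Illustrative latency assumptions (not measurements): a network round trip
# to a distant cloud region vs. local inference on the edge device itself.
CLOUD_ROUND_TRIP_MS = 120
EDGE_INFERENCE_MS = 5

def decide(reading):
    """A trivial stand-in for a trained model: brake on a high sensor value."""
    return "BRAKE" if reading > 0.8 else "OK"

def decide_at_cloud(reading):
    """Ship the sensor reading to the cloud and wait for the round trip."""
    time.sleep(CLOUD_ROUND_TRIP_MS / 1000)
    return decide(reading)

def decide_at_edge(reading):
    """Run the cloud-trained model locally; no network round trip."""
    time.sleep(EDGE_INFERENCE_MS / 1000)
    return decide(reading)

# Both paths reach the same decision; only the latency differs.
start = time.perf_counter()
decide_at_edge(0.9)
edge_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
decide_at_cloud(0.9)
cloud_ms = (time.perf_counter() - start) * 1000
```

For a braking decision in a driverless car, the difference between those two paths is exactly the round-trip latency the edge model eliminates.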

Fog computing paradigm:

Fog computing, a term coined by Cisco, is essentially an extension of the edge computing paradigm. It means that data storage, network communication, and computing are placed near the edge devices so that latency is reduced. A quick analogy to fog computing in current practice is a CDN, or Content Delivery Network.

The diagram below depicts Cisco's view of the fog computing architecture:

CISCO fog computing architecture



CDN nodes cache static or recently accessed content on nodes closer to the user accessing it, thereby reducing overall latency.
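The caching behavior described above can be sketched as a tiny edge cache. This is a hedged illustration of the CDN analogy, not a real CDN implementation; the class name, the origin callable, and the content format are all hypothetical.

```python
class EdgeCache:
    """A minimal edge-node cache: serve locally when possible,
    otherwise fetch from the origin server and remember the result."""

    def __init__(self, fetch_from_origin):
        self._fetch = fetch_from_origin  # callable: key -> content
        self._store = {}
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self._store:
            self.hits += 1          # served from the edge node: low latency
        else:
            self.misses += 1        # full trip to the origin server
            self._store[key] = self._fetch(key)
        return self._store[key]

# Hypothetical origin server: any callable returning content for a key.
def fetch_from_origin(key):
    return f"content for {key}"

cache = EdgeCache(fetch_from_origin)
cache.get("/index.html")   # miss: fetched from the origin
cache.get("/index.html")   # hit: served from the edge node
```

The first request pays the full trip to the origin; every later request for the same content is answered locally, which is the latency win fog computing generalizes from static content to storage, networking, and compute.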

The OpenFog Consortium was formed to define and promote fog computing. The consortium was founded by ARM, Cisco, Dell, Intel, Microsoft, and Princeton University in November 2015. It seeks to create an architecture and approach for fog, edge, and distributed computing. Through a series of working groups, the consortium is developing an OpenFog architecture, addressing security issues, and planning industry testbeds.

Consortium membership is open to all organizations and individuals with an interest in planning and developing the architecture to support the next generation of IoT.

While implementing fog or edge computing looks complex, it is necessary to support the growing demand from IoT sensors and devices that the cloud model cannot cope with. These models bring benefits such as reduced latency, geographic focus, improved security, and relief from network bandwidth limits.

Just as cloud computing gave rise to new business models and drove innovation for a long time, edge computing will lead to innovative business models, applications, and services not yet imagined.

By Mandar Pise

Opinions expressed by techsutram contributors are their own.
