Lynne F. Canavan, Executive Co-Chair of the OpenFog Consortium, swung by to clear the air about fog computing and how it will revolutionise future technology as we know it.
Can you tell us the fundamental differences between Fog and edge computing?
Fog and edge are often used interchangeably, but there are key differences between these complementary approaches. Fog bridges the continuum from cloud to things and provides the missing link in deciding what data needs to be pushed to the cloud and what should be analyzed locally – at the edge. In the IT and IoT food chain, we describe it as: edge is to fog as apple is to fruit. Edge is often optimized for a certain type of network, while fog extends the edge with support for north-south, east-west and diagonal connectivity, including interfaces between edge and cloud. Fog extends the traditional cloud-based computing model, allowing implementations of the architecture to reside in multiple layers of a network’s topology. These extensions retain all the benefits of cloud computing, such as containerization, virtualization, orchestration, manageability, and efficiency.
Can you give us a brief overview of the OpenFog Reference Architecture?
The OpenFog Reference Architecture (OpenFog RA) is a medium- to high-level view of system architectures for fog nodes and networks. Created by a broad ecosystem of OpenFog members – university researchers, industry architects, experts, etc. – it is intended to help business leaders, software developers, silicon architects and system designers create and maintain the hardware, software and system elements necessary for fog computing.
The OpenFog RA is driven by a set of core principles called pillars. These pillars represent the key attributes that a system must embody to satisfy the OpenFog definition of a horizontal, system-level architecture that distributes computing, storage, control, and networking functions closer to the data source (users, things, etc.) along the cloud-to-thing continuum. The eight pillars are Security, Scalability, Open, Autonomy, RAS (Reliability, Availability, Serviceability), Agility, Hierarchy, and Programmability. The OpenFog RA also contains selected use cases in early adoption scenarios for fog computing. The 162-page document can be downloaded at www.openfogconsortium.com/ra
What are some applications that take advantage of Fog computing?
Applications which benefit the most from fog have the need for SCALE: Security, Cognition, Agility, Latency and Efficiency:
- Security: additional protections to ensure safe, trusted transactions
- Cognition: awareness of client-centric objectives to enable autonomy
- Agility: rapid innovation and affordable scaling under a common infrastructure
- Latency: real-time processing and cyber-physical system control
- Efficiency: dynamic pooling of unused local resources from participating end-user devices
The ideal use cases require intelligence near the device or edge where sub-millisecond latency is critical, are run in geographically dispersed areas where connectivity can be interrupted, and/or the applications create terabytes of data that are not practical to stream to the cloud and back. Some of these applications include autonomous vehicles, drones, connected factory, surveillance, emergency response, energy exploration, smart grid, virtual reality, and connected agriculture, to name a few.
How will Fog architecture support the roll out of 5G?
As a computing and networking architecture, fog distributes the services of computation, communication, control and storage closer to the edge, the access network and users, enabling key applications in wireless 5G. Through a fog-based architecture, services can be orchestrated on demand and adapted at runtime depending on contextual conditions to minimize latency and allow for high mobility, high scalability and real-time execution. Fog extends the cloud to the network edge, allowing for load balancing and flexible mobility support.
Will there be a dominant standard for edge computing?
Thirty-four years ago, TCP/IP established a common way to assemble the network of networks. The standard approach rapidly enabled the development of the internet. It’s important to see what wasn’t created: there wasn’t an internet for wireless, or an internet for business, or an internet for consumers – it was one internet. We believe that similarly, in order to enable IoT, 5G and other digitally enabled innovation, there needs to be one set of standards developed to allow for interoperability, scalability and security.
How does OpenFog work with other associations like ETSI MEC towards a common goal?
All of the consortia have unique missions, and many share a common goal: generating requirements for standards that enable the technical advancements driving innovation. At OpenFog, we recognize the contributions of other associations. Our goal is to minimize market confusion and to re-use and leverage work that makes sense to progress the industry. For example, ETSI MEC has done some terrific work with its APIs for edge computing interoperability, which has applicability for the OpenFog Reference Architecture.
Would you say OpenFog is best suited to a specific industry?
Fog computing can be applied to many industries. Fog computing and networking is a horizontal architecture that works across multiple industries – anywhere there is a need for SCALE. That said, certain industries need fog more urgently: transportation, connected factories, drones, smart cities and energy are among those furthest along in adopting fog.
How can OpenFog solve advanced IoT deployment challenges?
IoT is the ‘killer app’ for fog computing. Many IoT applications need fog in order to work efficiently and effectively. One of the reasons we haven’t seen more IoT deployments is the overwhelming amount of data that needs to be processed and analyzed – it can be impractical to send all the data to the cloud and back, all day, every day. Think of a wind farm on a remote hillside. Each wind turbine generates terabytes of data every day. Sending that data to the cloud just doesn’t work: it’s too expensive to transmit, network connectivity may not always be available, and it’s too slow – by the time the data can be analyzed, there could be a serious malfunction. In a fog architecture, the data is captured, analyzed and acted upon by the local fog node, which can send information to the cloud on an as-needed basis and alert on or resolve incidents much more quickly. Another example is a smart city, where municipal networks may carry sensitive traffic and citizen data, as well as operate life-critical systems such as emergency response. Fog computing addresses privacy and system security by processing the data more locally, reducing the amount of digital traffic going to the cloud and back. IoT has tremendous promise, but it won’t really gain traction until the challenges of latency, data transmission costs, privacy and security, and remote connectivity are resolved. Fog is the distributed architecture to make IoT work.
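The wind-farm scenario above can be sketched in a few lines of code: the fog node summarizes each batch of sensor readings locally and forwards only significant events upstream, instead of streaming every reading to the cloud. This is a minimal, hypothetical sketch – the threshold value, function names and the print-based "cloud" stand-in are all assumptions, not part of the OpenFog RA.

```python
# Minimal sketch of the fog-node pattern: analyze locally,
# forward to the cloud only on an as-needed basis.

VIBRATION_LIMIT = 4.0  # assumed alert threshold (e.g. mm/s RMS)

def process_locally(readings):
    """Summarize one batch of turbine vibration readings at the fog node."""
    avg = sum(readings) / len(readings)
    peak = max(readings)
    return {"avg": avg, "peak": peak, "alert": peak > VIBRATION_LIMIT}

def forward_to_cloud(summary):
    # Placeholder: a real fog node would publish over MQTT/HTTPS here.
    print(f"cloud <- {summary}")

def fog_node_step(readings):
    summary = process_locally(readings)
    if summary["alert"]:
        forward_to_cloud(summary)  # act immediately on anomalies
    return summary                 # routine data stays local

normal = fog_node_step([1.2, 1.5, 1.1])  # no alert: nothing sent upstream
faulty = fog_node_step([1.2, 5.6, 1.4])  # peak exceeds limit: forwarded
```

Only the faulty batch triggers upstream traffic, which is the point of the architecture: bandwidth and latency costs are paid only when the cloud actually needs to know.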
Where is Fog computing moving over the next few years?
There is a big uptick in interest in fog computing, as more organizations recognize that it meets the needs of IoT, 5G and embedded AI. Over the next few years, standards will emerge and there will be more full-scale deployments across many industries.
Along with its pre-conference workshop, OpenFog will be featured in various panels and presentations on current industry topics. More details can be found in the brochure below: