Landing page for the Open Manufacturing Platform on GitHub
The Internet of Things (IoT) has begun to be widely embraced in manufacturing. Across sectors, from automobiles to semiconductors to baked goods, clothing, and more, there is virtually nothing manufactured that can’t benefit from analysis of the data collected by the sensors that make up the Industrial Internet of Things. And the growth trajectory is staggering: Statista forecasts that by 2025, IoT connections will produce almost 80 zettabytes of data. It seems natural that the ability to gather and analyze this much data will improve processes and drive productivity, but IoT has limits.
The typical architecture of an IoT system consists of any number of local sensors that send data to the cloud, where it is processed using advanced algorithms; insights are sent back to machine or human users, and the data is then stored in the cloud. If you have had the opportunity to tour a semiconductor manufacturing facility, for example, you know that the time it takes to send manufacturing data back and forth to the cloud can mean tens or even hundreds of thousands of dollars in products that don’t pass inspection. In today’s manufacturing environment, tolerance for latency is extremely low. In addition, the costs associated with transporting, processing, and storing data in the cloud are extremely high.
Edge computing is a broad term used to describe IoT data processing onsite or as close as possible to where the data is generated. In the manufacturing context, edge computing describes a system of decentralized edge nodes, which reside near the physical origin of the data. Edge nodes must be able to run arbitrary containers and are managed centrally. An edge node connects to both the cloud and production asset levels and can temporarily run offline. Edge computing brings the following benefits to IoT in a manufacturing setting:
Low-latency processing: If data doesn’t have to be sent to the cloud to be processed, microseconds can be shaved off quality-, security-, or safety-related notifications, decisions, and corrective actions. Picture a scenario where trained machine learning (ML) algorithms are applied to data at the edge, insights are generated, and immediate actions are taken where the data is produced rather than relying on distant, centralized cloud resources (a minimal sketch of this pattern appears after this list). In high-precision manufacturing, robotics and process automation require AI located on premises to ensure real-time responsiveness. Of course, this doesn’t negate the benefit of processing in the cloud where appropriate. For example, connected machines and sensors provide new insights into predictive maintenance and energy efficiency across disparate geographic locations when they are connected via a centralized cloud architecture. The point is that some applications require local, low-latency processing, and some benefit from processing in the cloud. A common reference use case is helpful in understanding when and how edge computing is beneficial.
Less use of bandwidth: Along with the cost of processing and storing data in the cloud, there is a cost to sending volumes of data back and forth to the cloud. In terms of both communications service provider fees and slower network speeds, data transport is costly, especially at the volumes produced by manufacturing companies. While useful data will likely be sent to the cloud and stored there eventually, processing at the edge reduces the total amount of data sent because unnecessary data is filtered out during preprocessing at the edge (the aggregation sketch below illustrates the idea). And when data is processed at the edge, it can remain onsite temporarily and then be transported to the cloud for long-term storage during off-peak times, when data transmission rates are lower and bandwidth is more available.
Security and data sovereignty: Some types of data should not leave the security of your own network. Personal customer data, employee data, intellectual property, and trade secrets are all data types that may need to stay securely within your own corporate campus. When data is processed locally, it doesn’t need to be sent over the public Internet or stored offsite. This helps increase security and makes compliance with data sovereignty laws easier.
Equipment connections: Conversely, there are times when you want to connect devices to the cloud, but the devices themselves cannot connect directly without a gateway on your network. Edge equipment provides a path to the cloud or external networks for equipment such as cameras or sensors that may not be Internet-ready. If these devices are connected within your own network, edge equipment can act as a gateway to the Internet and cloud infrastructure so that data from your devices can be aggregated and processed at a centralized data center or in the cloud (see the gateway sketch below).
Buffering and process decoupling: Edge equipment can buffer data locally in case of external network outages. This way, data is not lost if there is an Internet outage, and closed-loop operations can be fully executed on the local edge device without dependencies on the broader network infrastructure (one possible store-and-forward pattern is sketched below). In short, if the Internet goes down and you cannot access external processing resources, your manufacturing operations can continue.
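To make a few of these benefits concrete, the short Python sketches below are purely illustrative; the thresholds, device names, endpoints, and helper functions in them are assumptions made for the examples, not part of the OMP whitepaper. First, the low-latency case: a stand-in “model” (here just a fixed vibration threshold) is evaluated right where the data is produced, so the corrective action does not wait on a cloud round trip.

```python
# Hypothetical edge-side inference loop: a fixed vibration threshold stands in
# for a trained ML model deployed to the edge node, and the corrective action
# runs locally instead of waiting on a cloud round trip.
import math
import random
import statistics
import time

VIBRATION_LIMIT_MM_S = 4.5  # assumed alarm threshold for this sketch


def read_vibration_window(samples: int = 50) -> list[float]:
    """Stand-in for reading a burst of vibration samples from a local sensor."""
    return [random.gauss(3.0, 1.0) for _ in range(samples)]


def stop_spindle() -> None:
    """Stand-in for the local, low-latency corrective action."""
    print("Corrective action taken at the edge: spindle stopped.")


for _ in range(10):
    window = read_vibration_window()
    rms = math.sqrt(statistics.fmean(v * v for v in window))  # local "inference"
    if rms > VIBRATION_LIMIT_MM_S:
        stop_spindle()
    time.sleep(0.1)  # wait for the next sampling window
```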
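Next, the bandwidth case: a minute of high-rate readings is aggregated at the edge into a compact summary record, and only that summary is sent on to the cloud. The sampling rate and field names are invented for the example.

```python
# Illustrative edge-side preprocessing: aggregate a high-rate signal into a
# per-minute summary so only a fraction of the raw volume leaves the site.
import json
import random
import statistics


def raw_samples(per_minute: int = 6000) -> list[float]:
    """Stand-in for one minute of high-rate temperature readings."""
    return [random.gauss(72.0, 0.8) for _ in range(per_minute)]


def summarize(samples: list[float]) -> dict:
    """Reduce a minute of raw readings to a compact record for cloud upload."""
    return {
        "count": len(samples),
        "mean": round(statistics.fmean(samples), 3),
        "min": round(min(samples), 3),
        "max": round(max(samples), 3),
    }


samples = raw_samples()
record = summarize(samples)
raw_bytes = len(json.dumps(samples))
summary_bytes = len(json.dumps(record))
print(f"raw: {raw_bytes} bytes, uploaded: {summary_bytes} bytes "
      f"({raw_bytes // summary_bytes}x reduction)")
```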
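For the equipment-connection case, a common pattern is an edge gateway that polls a device with no Internet connectivity of its own and forwards its readings to a cloud ingestion endpoint. Here read_local_sensor() and INGEST_URL are hypothetical placeholders for the real device protocol and the cloud provider’s ingestion API.

```python
# Illustrative edge gateway: poll a plant-network-only device and forward its
# readings to a cloud ingestion endpoint over HTTPS. The sensor stub and the
# endpoint URL are hypothetical.
import json
import random
import urllib.request

INGEST_URL = "https://cloud.example.com/ingest"  # hypothetical ingestion endpoint


def read_local_sensor() -> dict:
    """Stand-in for polling a camera or sensor on the plant network."""
    return {"device": "line-3-camera", "defect_score": round(random.random(), 3)}


def forward(reading: dict) -> None:
    """Act as the gateway hop: package the reading and push it to the cloud."""
    body = json.dumps(reading).encode("utf-8")
    request = urllib.request.Request(
        INGEST_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        response.read()


forward(read_local_sensor())
```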
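And for buffering and process decoupling, one possible store-and-forward approach keeps every reading in a local SQLite file until an upload succeeds; upload_to_cloud() is a stub that fails randomly to simulate an outage, and the schema and file name are assumptions for the example.

```python
# Illustrative store-and-forward buffer: readings go into a local SQLite file
# first and are deleted only after a successful upload, so nothing is lost
# during an Internet outage.
import json
import random
import sqlite3

db = sqlite3.connect("edge_buffer.db")
db.execute("CREATE TABLE IF NOT EXISTS buffer (id INTEGER PRIMARY KEY, payload TEXT)")


def upload_to_cloud(payload: str) -> bool:
    """Stand-in for the real upload; fails randomly to mimic a network outage."""
    return random.random() > 0.5


def record(reading: dict) -> None:
    """Always buffer locally first, so nothing is lost if the uplink is down."""
    db.execute("INSERT INTO buffer (payload) VALUES (?)", (json.dumps(reading),))
    db.commit()


def flush() -> None:
    """Try to drain the buffer; rows are removed only after a successful upload."""
    for row_id, payload in db.execute("SELECT id, payload FROM buffer").fetchall():
        if upload_to_cloud(payload):
            db.execute("DELETE FROM buffer WHERE id = ?", (row_id,))
            db.commit()


record({"sensor": "oven-7", "temp_c": 183.2})
flush()
```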
Download the Free Edge Computing Whitepaper
While edge computing is an excellent alternative for low-latency, low-bandwidth, secure computing, challenges remain. There is no consensus on a standardized definition and architecture for edge computing. For this reason, the Open Manufacturing Platform’s IoT Connectivity Working Group has produced the whitepaper Edge Computing in the Context of Open Manufacturing. This paper approaches the topic of edge computing from a manufacturing use case perspective with three different views: infrastructural, application, and operational. Based on this approach, an edge computing framework’s core characteristics and components are identified and described. The main contribution of this paper is to outline edge computing in a manufacturing setting and start moving the sector towards a common understanding.