Edge computing, heralded by many as the next frontier in industrial automation, isn’t exactly novel terrain. However, the ways in which advanced edge capabilities are being leveraged to innovate and optimize everything from in-field product performance to quality management on the plant floor are breaking new ground, helping industrial players boost their competitive position.
In conventional scenarios, edge deployments might consist of historians and local SCADA (supervisory control and data acquisition) systems, which are constrained by limited compute power and data storage and capable of orchestrating only simple analytics, visualizations, and basic real-time controls. These systems, many of which were designed decades ago, are also highly inflexible due to siloed, closed architectures that require specialized talent and custom code to extract relevant data.
Enter the modern version of edge computing, which packs exponentially more compute and storage horsepower, open APIs (application programming interfaces), support for diverse communication protocols, and connectivity to the cloud to boost flexibility and help manufacturers collect, integrate, and act on data generated on the factory floor or in the field for optimized performance. “The edge was first used as a stepping stone for data collection and some reporting with manual intervention of some sort required,” says Kelli Settell, senior hardware applications engineer at TMMI, a supplier of electrical control equipment and software. “The real change is the ability to start to effect change with data analytics—being able to push back in real time on equipment based on a cause and effect with less manual intervention.”
Spending trends
Manufacturers are catching on to the value of a reimagined edge, working in concert with cloud capabilities, to advance digital transformation. According to the IDC Worldwide Edge Spending Guide, global investment in edge computing is expected to hit $176 billion this year, an increase of 14.8% over 2021, and will continue to grow to $274 billion by 2025. Gartner sees edge computing emerging as a central data hub: While only 10% of enterprise data was generated and processed at the edge in 2018, Gartner expects that scenario to shift to nearly 75% by 2025.
IDC, which identified more than 150 use cases for edge computing across various industries in its report, called out manufacturing operations and production asset management as the enterprise deployments that will see the largest edge computing investments this year. The IDC report also projects that discrete and process manufacturers will dole out $33.6 billion for edge technologies this year, significantly higher than other sectors, including retail and professional services.
The uptick in edge and cloud deployments among industrial players was confirmed by a recent Automation World survey. As manufacturers scrambled to keep factories running and remote personnel productive during the pandemic, they invested in edge and cloud technologies to enhance performance monitoring, collaboration, quality assurance, and predictive maintenance applications. Sixty-two percent of companies responding to the Automation World survey said they were leveraging cloud technologies, and 55% confirmed deployment of edge technologies—both a significant bump over the prior year.
Real-world realities
Modern edge technologies are becoming an essential part of the landscape as companies look to enable connected systems as part of Industry 4.0 applications, including more widespread use of AI/ML (artificial intelligence/machine learning) and advanced analytics. A redefined edge is also instrumental to capitalizing on perceived benefits such as reduced storage costs, expanded interoperability, lower latency, and enhanced cybersecurity. “The more things are talking to each other, the more opportunity there is for unauthorized access,” says Josh Eastburn, director of technical marketing at Opto 22. “If the architecture constitutes different layers, securing the whole system is a bigger task. By simplifying the overall architecture with edge computing, you can consolidate into a fewer number of devices and build cybersecurity functions into the devices themselves.”
While edge computing gains traction, challenges lie ahead. As with any major initiative, inertia and change management are ongoing issues, as is finding a new breed of engineer who has crossover OT (operations technology) and IT competencies, says John Younes, co-founder and COO of Litmus, a provider of an industrial edge platform. For example, to fully leverage edge capabilities, organizations need someone who is as versed in OT communications protocols and visualization requirements as they are in IT competencies like cybersecurity and network protocols. “While we call it the edge, it’s actually the overlap between OT and IT,” says Michael Condon, senior product manager at Emerson. “The individual needs to be able to work in both domains and understand the challenges and requirements of both.”
As manufacturers strive to balance innovation with the need for efficiencies and cost reduction, edge-enabled automation is developing as a means of doing more with less. “The reality of the times, of doing more with fewer people, makes the case for edge computing,” says Settell, citing an example of maintenance personnel heading into the field armed with the knowledge of what equipment and tooling is required to make a fix. With added insights from edge computing, “instead of having to make five trips, they make one, which reduces the number of people needed as well as the number of trips,” she explains.
Following are highlights from four manufacturers using new edge technologies to drive new business outcomes and gain competitive advantages.
Condition-based monitoring
For a Tier 1 automotive supplier, making high-precision powder metallurgy components like power transmission gears and hydraulic components can be a complex process with plenty of opportunities for quality issues to occur. Determined to improve process controls and ultimately boost part quality, Stackpole International embarked on a plan to connect plant floor equipment and machinery to digitize and automate condition-based monitoring.
One critical process involved a grinding machine that no longer operated consistently, but without an investment in edge computing it was difficult to diagnose the root cause. “The equipment’s HMI didn’t have a lot of data stored and whatever was there was locked up and not being utilized or analyzed for good,” explains Jack Fung, principal engineer, engineering IT projects at Stackpole.
After some unsuccessful attempts at connectivity, Stackpole turned to the Litmus Edge Industrial IoT platform and was able to sync with the proprietary devices in a matter of hours using the pre-installed drivers. Now, instead of manually tracking quality through reports and labor-intensive calculations, Stackpole employs condition-based monitoring to reduce the time it takes to troubleshoot discrepancies, direct manual interventions, and return to optimal throughput while reducing scrap rates. “We now have faith that having all the data available will drive efficiencies and returns,” Fung says.
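The core idea of condition-based monitoring can be illustrated with a minimal sketch: compare a rolling window of sensor readings against a known-good baseline and flag the machine when it drifts out of band. The metric, window, and tolerance here are hypothetical assumptions for illustration, not details of Stackpole's actual deployment.

```python
# Minimal condition-based-monitoring sketch. The metric (e.g. spindle
# vibration), baseline value, and tolerance are assumptions for
# illustration; the real rules are not published in the article.
from statistics import mean

def out_of_condition(readings, baseline, tolerance=0.15):
    """Flag a machine when the average of recent readings drifts more
    than `tolerance` (as a fraction) away from the healthy baseline."""
    avg = mean(readings)
    return abs(avg - baseline) / baseline > tolerance

# A platform would evaluate this continuously against live data and,
# when it returns True, direct a targeted manual intervention.
```

In practice the edge platform evaluates rules like this continuously against live machine data, so discrepancies are caught as they emerge rather than reconstructed later from reports.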
AI-enabled inspection
When your product is high-quality building supplies such as siding, visual appeal is paramount. Manual inspection requires significant manpower and can only tell you so much about the root cause of any problem.
Enter the combination of edge capabilities and vision control for inspection and defect detection. Intrinsics Imaging and Opto 22 teamed up to help the building manufacturer drive automated quality control to ensure defective material never makes it to customers and to store that information for defect analysis to help reduce waste. Intrinsics Imaging’s technology bolsters generic IP cameras with smart capabilities by connecting them to an AI backend called Heijunka running on AWS (Amazon Web Services). Opto 22’s groov RIO edge I/O platform serves as the physical interface for the software. Together, the pair detect flaws in the building products with a two-second response time, publish a pass/fail indication to the edge modules via MQTT, and initiate physical relay outputs that drive an automated response.
Groov RIO’s support for the MQTT protocol was a critical factor in selecting an edge platform to translate the insights generated by the cloud-based vision control system into actions that would impact quality on the line. “We needed to go from the software knowing that something was wrong to triggering a specific action in the physical world, like pushing flawed material off the conveyor into a trash bin,” explains Eric Cheng, Intrinsics Imaging’s CTO. “The groov device enables that translation.”
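The translation step Cheng describes can be sketched as a small message handler: a pass/fail payload arrives over MQTT, and a fail result maps to a relay command. The topic payload shape, relay name, and function names below are assumptions for illustration; a real deployment would subscribe through an MQTT client (such as paho-mqtt) and drive the groov RIO's physical outputs.

```python
# Hypothetical sketch of the pass/fail-to-relay translation. Payload
# format and relay identifier are assumptions, not product details.
import json

EJECTOR_RELAY = "relay_0"  # assumed output that pushes flawed material off the line

def handle_inspection_message(payload: str) -> dict:
    """Translate a pass/fail inspection payload into a relay command."""
    msg = json.loads(payload)
    if msg.get("result") == "fail":
        # Defect detected: energize the ejector so the material is rejected.
        return {"output": EJECTOR_RELAY, "state": True}
    # Passing material: leave the relay de-energized.
    return {"output": EJECTOR_RELAY, "state": False}
```

In a live system this handler would be registered as the MQTT subscription callback, closing the loop from cloud-based vision analysis to a physical action on the line.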
IIoT-enabled compliance
Historically, the oil and gas industry isn’t seen as a leader in IT and automation technology deployments when it comes to production facility and in-field operations.
Venting from oil and water storage tanks is a primary source of emissions for oil and gas companies, leaving them with the option of flaring or capturing methane gas—both essential tactics for complying with EPA emissions regulations. Traditionally, countless hours were devoted to hand calculations to approximate emission levels, but these methods were not consistent enough for accurate EPA reporting. “Back-of-the-napkin math is good for getting a feel for how a facility is doing, but to monitor day-to-day, minute-to-minute, or even second-to-second compliance, there needs to be an advanced technician on site,” explains Adam Meyer, president of Lansera LLC, a company working with Phoenix Contact and ClearBlade to leverage IIoT (Industrial Internet of Things) and edge technology to automate continuous monitoring of gas tank emissions.
Using Phoenix Contact’s hardware and ClearBlade’s edge platform and Intelligent Asset application, Lansera created the Smart-Purge and Tank Eye products that enable users to monitor tank pressure and emissions and automatically flare vapor or capture it, ensuring the facility remains in compliance and avoids penalties or fines.
Predictive failure testing
Failure on the production line is not an option when you’re building large-scale diesel engines carrying a price tag from a few hundred thousand dollars to upwards of a couple of million.
Rolls-Royce Power Systems had amassed nearly a decade’s worth of data that could be used to identify potential engine part failures prior to testing. Working with Cisco and partner Delta Bravo, which specializes in predictive analytics, Rolls-Royce deployed an edge system that predicts potential part failures in real time in the test cell prior to production, a workflow that significantly reduces unplanned downtime, increases throughput and quality, and ensures a timely delivery schedule.
The success of the initiative hinged on guaranteeing that data was captured with the proper levels of granularity to support valid predictions as well as ensuring a process for continuous model iteration, explains Rick Oppedisano, Delta Bravo CEO. “Models need to be tuned and retrained over time,” he explains. “It’s about care and feeding; [it’s] not a set and forget situation.”
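The "care and feeding" Oppedisano describes is often implemented as a drift check: compare the model's recent prediction error against its error at deployment and flag it for retraining when performance degrades. The error metric, window, and drift factor below are generic assumptions for illustration, not Delta Bravo's actual criteria.

```python
# Hedged sketch of a model-drift trigger; the metric and the 1.5x drift
# factor are generic assumptions, not Delta Bravo's actual criteria.

def needs_retraining(recent_errors, baseline_error, drift_factor=1.5):
    """Flag a model for retraining when its recent average absolute
    prediction error exceeds the deployment-time baseline by more
    than `drift_factor`."""
    avg = sum(abs(e) for e in recent_errors) / len(recent_errors)
    return avg > drift_factor * baseline_error
```

Checks like this make retraining a scheduled, data-driven event rather than a reaction to missed predictions, which is the essence of the continuous model iteration the project required.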
Also crucial to Rolls-Royce’s edge environment was security, a requirement that gave Cisco’s networking and edge technologies an advantage while underscoring the importance of OT and IT collaboration, according to Dr. Kyle Hodges, senior control engineer for the engine manufacturer. “It all comes down to security, and IT has to be involved,” he says. “We’re doing things with data that we’ve never done before, and our business will be crippled if we don’t take advantage of network security options.”