We have all been watching the automation world undergo significant change as it responds to the disruptive digital technologies of cloud, Big Data, edge, the Industrial Internet of Things (IIoT), Artificial Intelligence and Machine Learning, wireless, low-cost computing, and even lower-cost smart sensors. Oh, and don’t forget the information technology (IT) and operational technology (OT) convergence either. So, as we start the new year of 2020, what does all this look like, how is it shaping up, and where is it going?
First, let’s define what we mean by the New OT Ecosystem. My background is the process industries, so I am going to look at it through that lens, though the differences with batch, hybrid, and discrete manufacturing are few. Using the ISA-95 reference architecture, what we now have in the picture are:
· Some new components (wireless, edge, and cloud);
· The compression of the layers, in particular layers two, three, and four; and
· New software offerings, like Digital Twins.
Simply put, this means that to take advantage of the capabilities of these new technologies, we can no longer engineer the traditional instrumentation, actuation, and control systems of levels one and two in a vacuum, air-gapped from level three and above. While we will continue to have routers and gateways, there is no more air gap. And just as we now have lots of IT within OT, there is no IT/OT organizational gap either. The best performing companies, those that LNS has identified as Industrial Transformation leaders, have addressed this organizational challenge using various collaborative models.
But there are still challenges to face and battles to fight. Let’s discuss four here. First, OT needs to step up or face having IT make many of the technology decisions that OT should be involved in making. LNS’s research indicates that IT takes the lead in choosing cloud, edge, and IIoT platforms, and, along with them, the advanced analytics applications. And isn’t it the plant personnel who have to adopt and use these new tools? Don’t be left out, OT. There are three roles that need filling to align properly with IT:
1. OT architect;
2. Data engineer; and
3. Cybersecurity specialist.
There is no equivalent of The Open Group Architecture Framework (TOGAF) for OT, but perhaps there should be; OT and IT can work together to define such standards. OT also needs data engineers: those who understand who needs what data, when, in what format, and with what timeliness and granularity to make operating decisions. OT’s data engineers are the natural counterparts to IT’s DataOps teams. And finally, cybersecurity remains a major challenge at all levels of the company, especially in OT, as most companies’ CISOs do not have an OT background and report to the CIO.
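To make the data engineer’s charter concrete, here is a minimal sketch, in Python, of the kind of data contract such a role might maintain. All of the field names and example values are hypothetical, invented purely to show that “what data, when, in what format, and with what timeliness and granularity” can be written down explicitly and agreed with IT.

from dataclasses import dataclass

@dataclass
class DataFeedSpec:
    """Hypothetical data contract for one OT data feed."""
    tag: str                # source tag, e.g., a historian or controller tag
    consumer: str           # who needs the data (role or application)
    unit: str               # engineering unit
    sample_period_s: float  # granularity: how often a value is produced
    max_latency_s: float    # timeliness: how stale a delivered value may be
    fmt: str                # delivery format, e.g., "OPC-UA" or "MQTT/JSON"

# Hypothetical example: a quality dashboard needs reactor temperature
# at one-second granularity, delivered within five seconds.
feed = DataFeedSpec(
    tag="Reactor1.Temperature",
    consumer="quality-dashboard",
    unit="degC",
    sample_period_s=1.0,
    max_latency_s=5.0,
    fmt="MQTT/JSON",
)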
The second challenge is the battle for open control systems. Who will win the race, the Open Process Automation Forum (OPAF) or OPC-UA? I refer to the contest between systems that can communicate with each other via OPC-UA, and systems whose components are themselves interoperable and interchangeable, which is OPAF’s aim. It’s too soon to tell how OPAF will turn out as it enters its pilot phase, but there’s no slowing down OPC-UA (a minimal sketch of OPC-UA communication follows the list below). Either way, it’s not hard to see that hardware will become more commoditized over time. The good news is that control systems are opening up with new capabilities, such as the ability to:
· Configure and test the system virtually without hardware;
· Virtualize controllers so that capacity can be scaled like cloud computing; and
· Run advanced software programs like those written in Python at the controller level.
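To make the communication side of this concrete, here is a minimal sketch of reading a single value over OPC-UA using the open-source python-opcua library. The endpoint address and node ID are hypothetical; in practice they would come from the target server’s configuration and address space.

from opcua import Client  # open-source python-opcua library

# Hypothetical endpoint and node ID, purely for illustration
client = Client("opc.tcp://192.168.0.10:4840")
try:
    client.connect()
    # Look up the node that holds the value of interest
    temp_node = client.get_node("ns=2;s=Reactor1.Temperature")
    value = temp_node.get_value()
    print(f"Reactor temperature: {value}")
finally:
    client.disconnect()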
The third challenge: now that we have all of these capabilities, where do we put them? In the cloud, at the edge, on-premises in an advanced computing platform, in the controllers, or even in the instruments themselves? Thus, architecting levels one and two cannot be done in isolation from the other levels. We also want to avoid the old architecture trap of writing special control scripts or code that can only run in a specific vendor’s controller or application device. A challenge will be, “Do we migrate this old code to a new ‘open controller,’ or move it up to the advanced computing platform, or even a higher level?” Of course, no matter which path one takes, there will be a cost, but operating companies must ask themselves, “What is the total cost of ownership? What gives us the most operating flexibility, agility, and time to value?” All good questions.
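As one illustration of keeping that flexibility, consider the sketch below: a trivial proportional-integral (PI) control step written in plain Python with no vendor-specific dependencies. This is not any vendor’s implementation, only a way of showing that logic written against open languages could, in principle, run in the cloud, at the edge, or in a controller that supports Python, leaving the placement decision open. The gains, timing, and values are illustrative.

def pi_step(setpoint, measurement, integral, kp=1.0, ki=0.1, dt=1.0):
    """One step of a PI control calculation; gains are illustrative."""
    error = setpoint - measurement
    integral += error * dt          # accumulate the integral term
    output = kp * error + ki * integral
    return output, integral

# Hypothetical usage: one control step against a temperature setpoint
out, integ = pi_step(setpoint=75.0, measurement=72.4, integral=0.0)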
And that leads us to the last challenge, and that is, “Who is running the show?” By this I mean the emerging dominance of software over hardware. Here, I don’t mean control system software; I refer to manufacturing execution and related management systems, which reside at level three and above. It strikes me that we often engineer plant systems starting with levels one and two, elaborated from the process flow diagram (PFD) and piping and instrumentation diagram (P&ID), and then add on the manufacturing execution system (MES) and manufacturing operations management (MOM) later. This is certainly the case for brownfield retrofits. But it has me thinking that maybe we should engineer from level three down, based on the capabilities that we want to operationalize. When plants are conceived, an operating philosophy is developed. But then we build from the bottom up and hope we build something that enables the philosophy. This doesn’t make sense to me, because with today’s digital engineering suites, we can design and build an intelligent plant from conception, one that post-handover rapidly achieves first-quartile operating status. This is also a reason why many of the larger players are developing suites of software to manage the entire asset lifecycle.
>>Joe Perino, [email protected], is a research analyst at LNS Research.