Moving Beyond the Digital Twin Proof-of-Concept Stage

March 28, 2023
The real-world operational benefits of digital twins are growing as more companies apply the technology. To ensure your success, make sure your data fundamentals are in place first.

From fulfilling basic palletizing operations to developing strategic approaches to supply chain optimization, digital twins are increasingly being used across industry for a range of applications.

Major reasons for this uptick in industrial digital twin use are the clear benefits the technology provides. As noted in our recent feature article on industrial digital twin applications, a Capgemini Research Institute survey showed that businesses using digital twins have seen, on average, a 15% uptick in metrics such as sales, turnaround time, and operational efficiency. This same survey also showed that companies using digital twins have seen improvements on the order of 25% for system performance and, in some cases, a 16% boost in sustainability.

Now, Capgemini Americas’ Michael Duffy, vice president of intelligent operations, and Rudy Klecka, solutions architect, say it’s time for companies still evaluating digital twin technologies to move beyond the proof-of-concept stage and begin deploying real-world applications of the technology.

“Many digital twins got stuck in proof-of-concept phases that relied on proprietary programs with their own ways of naming and storing data,” says Duffy. “As a result, many organizations in the sector haven’t yet realized a strong return on their initial digital twin investment.”


Data fundamentals

Key to effectively moving beyond digital twin proof-of-concept projects is to ensure you have the right fundamentals in place.

“Rather than try for an ambitious end-to-end, full-lifecycle digital twin, it’s better to start with one part of a process, role, or asset that could benefit the most from digital twin-enabled modeling and optimization, and then identify one stage of the lifecycle to model,” says Klecka. “For example, a digital twin for maintenance at a chemical plant that’s already operational will look very different from a plant design digital twin for a chemical plant that’s still in the planning stages.”

Once the scope of the digital twin is defined for your implementation, Duffy says the next fundamental is interoperability. “In fact, interoperability, which involves decoupling data from the software that uses it, is what matters most. That’s because the digital twin will need data from multiple sources, and while each software solution classifies and stores data in its own way, all the data needs to be standardized and consistent for the digital twin to use it.”
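Duffy’s point about decoupling data from the software that produces it can be illustrated with a minimal sketch. The field names and source formats below are hypothetical, not taken from any real historian or SCADA product: the idea is simply that each source gets a small adapter that maps its records onto one canonical schema the digital twin consumes.

```python
# Minimal sketch of data standardization for a digital twin feed.
# The source field names and the canonical schema are hypothetical
# illustrations, not a real product's data model.

# Two source systems storing the same reading under different names.
historian_record = {"TagName": "PT-101", "Val": 4.2, "TS": "2023-03-28T10:00:00Z"}
scada_record = {"tag": "PT-101", "value": 4.2, "timestamp": "2023-03-28T10:00:00Z"}

def from_historian(rec: dict) -> dict:
    """Map a historian record onto the canonical schema."""
    return {"tag": rec["TagName"], "value": rec["Val"], "time": rec["TS"]}

def from_scada(rec: dict) -> dict:
    """Map a SCADA record onto the canonical schema."""
    return {"tag": rec["tag"], "value": rec["value"], "time": rec["timestamp"]}

# Once both sources emit the same shape, the twin can consume either
# without knowing which software produced the reading.
standardized = [from_historian(historian_record), from_scada(scada_record)]
assert standardized[0] == standardized[1]
```

The adapters, not the twin, absorb each vendor’s quirks, which is what lets the data outlive any one software package.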

Duffy notes that industrial businesses, particularly those in the energy and utility sectors, face an additional challenge when it comes to standardizing data from operations technology devices.

“These devices have a consistent nomenclature based on their role and their location within the facility. For example, a pressure transmitter on a line, monitoring a particular process, may be the exact same model of transmitter that exists to monitor different processes in different parts of the plant. And the same array of transmitters, each with a different process to monitor, may exist at each of the organization’s plants around the world. That’s why the naming standard for those transmitters needs to be consistent within and across facilities, and it needs to factor in the transmitter’s process and location. If these data sources are labeled consistently, the data they produce will have a common name that makes storage and retrieval easier. That saves the organization from needing to constantly clean up data, which would impede the ability to scale.”
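The consistent nomenclature Duffy describes can be sketched as a tag-building convention that encodes site, process unit, instrument type, and sequence number. The segment names and their ordering below are a hypothetical illustration of the idea, not a published naming standard:

```python
# Sketch of a plant-wide tag naming convention. The segments and their
# order are illustrative, not quoted from any standard.

def build_tag(site: str, unit: str, instrument: str, seq: int) -> str:
    """Compose a globally unique, self-describing tag name."""
    return f"{site}-{unit}-{instrument}-{seq:03d}"

# The same model of pressure transmitter (PT), monitoring the same kind
# of process at two different plants, still gets a unique name with a
# consistent structure that encodes its role and location:
tag_a = build_tag("HOU", "CRK1", "PT", 7)   # hypothetical Houston plant
tag_b = build_tag("ROT", "CRK1", "PT", 7)   # hypothetical Rotterdam plant

print(tag_a)  # HOU-CRK1-PT-007
print(tag_b)  # ROT-CRK1-PT-007
```

Because every name is built the same way, data landing in storage can be filtered by site, process, or instrument type with simple string rules instead of per-plant cleanup.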

Standardized data formats and ontologies, such as ISO 14224, will help businesses access and use their data across design, operations, and maintenance.
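One way to apply a shared ontology is a lookup that resolves each site’s local equipment labels to a common equipment class. The local labels and class names below are illustrative placeholders, not codes quoted from ISO 14224:

```python
# Sketch of mapping site-specific equipment labels onto a shared taxonomy
# in the spirit of ISO 14224. Labels and class names are hypothetical.

LOCAL_TO_TAXONOMY = {
    "feed pump #3": "PUMP",
    "main compressor": "COMPRESSOR",
    "booster pump": "PUMP",
}

def classify(local_label: str) -> str:
    """Resolve a site-specific label to its shared equipment class."""
    return LOCAL_TO_TAXONOMY.get(local_label.lower(), "UNCLASSIFIED")

# Maintenance records from plants built decades apart, each with its own
# labeling habits, can now be pooled and compared by equipment class:
assert classify("Feed Pump #3") == "PUMP"
assert classify("mystery unit") == "UNCLASSIFIED"
```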

“Instead of pooling non-standardized data in data lakes, organizations can store and update standardized data in the cloud,” says Klecka. “Standardizing data also allows organizations to pool data from multiple facilities that were built years or decades apart, which use different technologies, for modeling and process management.”

How cloud technology can help

Klecka adds that cloud technology also helps provide digital continuity that wasn’t possible in the early days of digital twins.

“Digital-twin modeling and cloud-based process management, driven by standardized, unified data, can deliver a range of benefits. These include delivery optimization, emergency preparedness, and enhanced resiliency in the face of new climate challenges,” he says. “For example, half of energy and utilities companies are already using digital twins for emissions predictions to assist with planning and to track progress on greenhouse gas emissions reduction.”  

Duffy notes that digital twin-enabled models are also changing the employee experience for energy producers. “One oil and gas company is using digital twins of its platforms to allow engineers and other experts to do more than 4,000 hours of work onshore, rather than traveling to offshore platforms. Another producer has cut offshore hours by up to 50%, reducing the amount of time employees are exposed to risks on their platforms at sea.”

About the Author

David Greenfield, Editor in Chief

David Greenfield joined Automation World in June 2011. Bringing a wealth of industry knowledge and media experience to his position, David’s contributions can be found in AW’s print and online editions and custom projects. Earlier in his career, David was Editorial Director of Design News at UBM Electronics, and prior to joining UBM, he was Editorial Director of Control Engineering at Reed Business Information, where he also worked on Manufacturing Business Technology as Publisher.