If nothing else, the 2012 presidential election illuminated the clear edge that savvy use of big data can lend to a knowledgeable handler. For years, manufacturers have been working to master big data (collections of data sets so large and complex that they are difficult to process with on-hand database management tools), but with relevant, accessible data sets growing exponentially, it’s clear that current technologies are not keeping pace with all the business improvement possibilities that big data presents.
The range of information contributing to a manufacturer’s big data can include: products instrumented to capture data during their use that can then be fed back into improving current products or developing new ones; outside data sources such as government, academia and standards groups; and even social media for monitoring of current and prospective customers. To make any use of all this data for product or process improvement or problem correction, the user has to correlate all the bits of relevant data on hand, interpret them and then use the information to address the issue.
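As a rough illustration of that correlation step, the sketch below joins two hypothetical data sets — internal warranty claims and field-reported usage conditions — to see which failures cluster under which operating conditions. All names, fields, and thresholds here are invented for illustration; they are not from the CIMdata report.

```python
from collections import defaultdict

# Hypothetical internal data: warranty claims keyed by product serial number.
claims = [
    {"serial": "A100", "failure": "seal leak"},
    {"serial": "A101", "failure": "motor stall"},
    {"serial": "A102", "failure": "seal leak"},
]

# Hypothetical external/field data: usage conditions reported per unit.
usage = {
    "A100": {"ambient_temp_c": 48},
    "A101": {"ambient_temp_c": 22},
    "A102": {"ambient_temp_c": 51},
}

# Correlate: group each failure by the conditions the failed unit saw.
failures_by_condition = defaultdict(list)
for claim in claims:
    conditions = usage.get(claim["serial"], {})
    hot = conditions.get("ambient_temp_c", 0) > 40  # assumed 40 C design limit
    failures_by_condition["hot" if hot else "normal"].append(claim["failure"])

print(dict(failures_by_condition))
# → {'hot': ['seal leak', 'seal leak'], 'normal': ['motor stall']}
```

Trivial as it is, this is the pattern the report describes at scale: pull the relevant bits together, interpret the correlation (here, seal leaks cluster in hot environments), then feed that back into the product.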
Looking at how manufacturers can better leverage big data, CIMdata (an independent consulting firm focused on product lifecycle management, or PLM) has released a report titled “Product Lifecycle Management and the Data Deluge.”
In essence, the report notes that many functions of current product data management, product lifecycle management, and business intelligence software packages are not quite up to the task of helping manufacturers deal effectively with big data. The problem is not these tools’ ability to handle a manufacturer’s internal product and production data; it’s all the external data that is increasingly figuring into the process.
CIMdata contends that information discovery and visualization tools are well suited to supplement existing PLM tools in handling big data. These discovery and visualization tools reportedly combine search and business intelligence to “synthesize data into decision support techniques to enable ordinary business users to search, explore, link, and analyze any type of information.”
The question that remains, however, is this: Is big data something you should be looking into for your operation?
Ultimately, that’s for you to decide, but here’s a rundown of the benefits CIMdata claims the use of discovery and visualization tools can add to your PLM strategy:
• Web content mining for curating a one-stop resource for access to product-relevant data scattered across the Web. This data could include industry regulatory and standards information, open data sets from governmental and academic authorities, supplier catalogs, and part and component specifications and documentation.
• “Voice of the customer” analytics that follow and summarize customer discourse inside the enterprise (e.g., emails, phone calls, letters) as well as on social media. Search-and-semantics processing technology provides a capability to automatically collect social discourse about products and bring that knowledge into the PLM process.
• Real-time sensor data from products in use can provide insight into real usage conditions as well as actual product requirements, product failure information, customer needs and habits, and other real-world data.
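The last bullet — mining sensor data from products in use — can be sketched very simply. The example below flags fielded units whose recent vibration readings exceed an assumed design limit; the unit names, readings, sample window, and limit are all hypothetical placeholders, not anything specified by CIMdata or GE.

```python
import statistics

# Hypothetical telemetry: vibration readings (mm/s) streamed from fielded units.
readings = {
    "unit-01": [2.1, 2.3, 2.2, 2.4],
    "unit-02": [2.0, 4.8, 7.5, 9.1],  # trending upward: a possible wear signature
}

DESIGN_LIMIT = 4.5  # assumed spec limit, for illustration only


def flag_units(telemetry, limit):
    """Return units whose average over the last three samples exceeds the limit."""
    flagged = []
    for unit, values in telemetry.items():
        if statistics.mean(values[-3:]) > limit:
            flagged.append(unit)
    return flagged


print(flag_units(readings, DESIGN_LIMIT))
# → ['unit-02']
```

The same pattern, scaled up and fed back into PLM, is what turns raw field data into product failure information and actual (rather than assumed) product requirements.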
See CIMdata’s full white paper, which further explains the case for adding visualization and discovery tools to PLM. The paper also addresses related security issues, data curation, and data persistence.
Also, see Automation World’s coverage of GE Intelligent Platforms’ recent update to the company’s Proficy Historian product, designed to handle industrial big data. By combining Proficy Historian 5.0 and Proficy Historian Analysis, GE says, companies can contextualize, analyze, and visualize huge amounts of data and act on it to improve their operations.