Iteration required
The second transition in analytics language comes from recognizing that analytics is not a static end point or insight. I would get in trouble inserting a vendor's graphic that takes this approach, but there are dozens of examples. The approach suggests a hierarchy, a "better than" structure for analytics, which starts with descriptive analytics (i.e., writing reports), then diagnostic analytics (root cause analysis), then predictive analytics (e.g., predictive maintenance), and finally prescriptive analytics that tells the user what to do (see Figure 1). There may be other, intermediate steps included, but the point is that there is a fixed path to greater analytics sophistication.
This hierarchical view of analytics may make for compelling marketing materials ("Find Your Analytics Maturity!"), but it's simply not realistic. A realistic view of analytics is an iterative, looping, collaborative process in which an engineer starts with one analytics type, switches to another, goes back to the first, then does something else, or otherwise moves among analytics types to accommodate changes in plant priorities, raw material costs, formulas, and other factors.
Even in the most static environment, process engineers with experience in Lean or Six Sigma techniques know that good enough today won't be good enough tomorrow. Again, Gartner is leading the discussion on an iterative approach to analytics. Analytics aren't static and hierarchical; they are circular and subject to impact from new requirements.
Improved outcomes
The third transition in analytics is in the desired outcome and impact of analytics. The static view, summarized above, is characterized both by its hierarchy and by its fixed end point. The highest level, the brass ring, of this approach is "prescriptive" analytics, as if there were a way to define what the user should do given a certain set of data. This is simply not a realistic objective.
At any point in time there will be context known only to the user or subject matter expert, and only at the time of the analysis. This context must be considered when making the right decision to optimize production or business outcomes. If the analytic recommends shutting down the line when an asset is working but in need of maintenance, how does the decision get made in the context of production objectives and customer commitments? Only the process engineer or plant manager has the right business context to answer this type of question.
Therefore, the desired end point of analytics is not "tell me what to do" but instead "give me insights to inform my decision," based on the process engineer's ability to trade off among outcomes. Another critical aspect of this optimization focus is that insights must arrive in time to make a decision that affects the outcome. The unsatisfactory, and common, alternative occurs when the analytics take longer than the process to complete, with results delivered after the fact.
Analytics tools therefore need to be available as self-service, ad hoc solutions for plant personnel, with results presented in time to make a difference. The boring and banal "actionable insights" of two decades of automation vendor marketing must give way to a focus on empowering and supporting the process engineer's or subject matter expert's inclusion and perspective in the trade-offs required to optimize higher-level outcomes.