By Kyle Hable
July 31, 2020
- Manufacturing process optimization requires analytics based on edge-sourced data.
- The scientific method offers an optimal approach for implementing data gathering and analysis to deliver tangible improvements.
- Breaking the problem down by gathering important “little data” and then integrating it to create a big data solution is often the best way to obtain quick and quantifiable results.
Practical data-driven IIoT methods and analytics can speed up operational improvement cycle times
Most discrete part manufacturers are continually searching for ways to improve productivity. While some may follow an approach based on gut feel and instinct, it is more constructive to base operational improvement efforts on hard facts. These efforts are often hindered, however, by a lack of timely data sourced from field instruments, machines, and automation systems.
By the time a report reaches the C-suite, and a downturn in production or an increase in energy consumption is noted, it can be hard to trace the root cause. For some operations, the continuous improvement cycle may run annually or biannually, if at all, proceeding as a time-consuming top-down investigation. But what if these same operational teams had the tools needed to facilitate a more scientific approach?
The right automation hardware and software can support these efforts by taking advantage of Industrial Internet of Things (IIoT) devices and communication to support analytics at the edge or a centralized location. This article explores how edge automation concepts and digital transformation processes support the collection and analysis of data, enabling users to gain the insight necessary to reduce the cycle time of monitoring, analyzing, and improving discrete part manufacturing operations.
A scientific method
Production plants commonly consist of many different types of machinery, equipment, and supporting utilities. At a high level, operations personnel want to:
- improve throughput
- maintain quality
- reduce waste
- maximize uptime
- minimize power consumption.
Sometimes these actions are performed at a relatively microscale portion of machines or equipment. Other times they have a broader scope as part of a macroscale business optimization cycle.
These optimizations are made possible by following a procedural process of continuous improvement in an iterative fashion. An effective optimization process model, based on the digital transformation of industrial systems (figure 1), requires organizations to:
- gather data
- connect it within an architecture
- analyze it
- deploy solutions.
Some readers may notice that the cycle of continuous improvement has many similarities to the scientific method of research and learning. Typical principles of the scientific method are expressed as:
- observe and question
- research and hypothesize
- experiment and obtain data
- draw conclusions and report.
The scientific method is readily adapted to provide a complete framework for applying digital transformation methods and optimizing manufacturing operations. Each successful pass through the improvement cycle not only improves the manufacturing process itself but also refines the methodology and procedures used.
Digital transformation is an integral part of this improvement cycle, as it is part of an ongoing journey to digitalize the data needed to efficiently support these efforts.
Many “little data” sources
The information needed to assess these operational goals may flow through large enterprise software environments, including supervisory control and data acquisition (SCADA), manufacturing execution systems (MESs), and enterprise resource planning (ERP) systems. Some of the data may arrive in very manual formats, handwritten on forms or entered into spreadsheets.
Much of the interesting data comes from programmable logic controllers (PLCs) in the operational technology (OT) domain, including production-related values such as operating rates, part counts, and temperatures. Some data may be facilities information sourced over site information technology (IT) systems. Still more data is associated with asset management, such as wireless vibration readings and other parameters. Many times an out-of-the-box asset performance platform is used to consolidate, contextualize, and visualize such information. This can be useful for both machinery operators and entire manufacturing plants.
Such a large variety of “little data” sources complicates the format and timeliness of data availability. Users must consolidate the information, analyze it for useful results, and then apply changes. Then they repeat the cycle, as often and as quickly as practical, in a process of continuous improvement. As we will see, IIoT-based devices and methods can offer a way to streamline much of the procedure by creating a system where information flows more efficiently to speed up the overall improvement cycle.
Starting the cycle
A logical beginning is to define the objective of process improvement by asking, “How do we optimize operations?” and hypothesizing, “By tuning and adjusting our equipment and manufacturing processes.” To set the improvement cycle in motion, it is necessary to gather all the relevant “little data” so it can be aggregated into “big data,” and then analyzed (figure 2).
To be sure, the task can be overwhelming due to the large number of potential data points. Many implementers find that a valuable preliminary step is to perform an asset criticality analysis. Users evaluate the reliability, detectability, and consequences of equipment performance and failure in an unbiased way to identify the greatest pain points and determine which equipment should be addressed first. Essentially, this ensures that the “low-hanging fruit” of optimization efforts is harvested first, leading to early savings and building organizational enthusiasm for ongoing projects.
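One common way to rank criticality is an FMEA-style risk priority number. The following minimal sketch assumes that approach; the 1-10 rating scale, the multiplicative score, and the asset names are illustrative assumptions, not prescribed by any particular standard:

```python
# Hypothetical asset criticality ranking. Each asset is rated 1-10 for
# failure likelihood, detectability (10 = hardest to detect), and
# consequence severity; the product is an FMEA-style risk priority number.

def criticality(likelihood: int, detectability: int, consequence: int) -> int:
    """Return a risk priority number; higher means address the asset first."""
    return likelihood * detectability * consequence

# Illustrative ratings for three made-up assets.
assets = {
    "press_line_3": (8, 6, 9),
    "chiller_1": (4, 3, 7),
    "conveyor_12": (6, 8, 4),
}

# Sort assets so the highest-criticality equipment comes first.
ranked = sorted(assets, key=lambda a: criticality(*assets[a]), reverse=True)
print(ranked)
```

Whatever scoring scheme is used, the point is that the ranking is computed from agreed-upon ratings rather than opinion, which keeps the prioritization unbiased.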
With the most critical assets identified, users drill down into the target asset types with a consistent approach to gather the necessary data. This may be:
- production rates or failure indications from the control platform
- other sensed or analytical values
- equipment health indications, such as vibration or bearing temperatures.
Because of the many standalone automation platforms in most industrial plants and facilities—encompassing legacy systems and communication protocols—it is common for implementations to stumble over the specialized approaches needed to obtain little data from each source. Sometimes on the first pass it is only possible to use whatever data is already easily available.
Modern IIoT methods and products (figure 3) can help users get data out of isolated platforms in many ways:
- Edge devices: single points of data collection, often wireless, transmitting data to other edge solutions.
- Edge gateways: collect and forward OT, facilities, and asset management system data streams.
- Edge computing: computer-based products able to act as a gateway and perform additional storage and analytical tasks.
- Edge controllers: combine deterministic control like a PLC with general-purpose edge computing capabilities.
Edge devices, gateways, computing, and controllers can be added to existing systems as the need and budgets allow. New systems can be designed around edge computing and edge controllers from the beginning, so they are already positioned to obtain and process OT data and make the results available to higher-level systems.
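As an illustration of the gateway pattern described above, the following minimal Python sketch polls tags from an isolated controller and forwards them as timestamped JSON to a higher-level system. The `read_plc_tags` and `publish` functions are hypothetical stand-ins for a protocol-specific driver (for example, Modbus or OPC UA) and a transport such as MQTT:

```python
# Minimal edge gateway cycle: read tags from an isolated controller,
# wrap them with a timestamp and source identifier, and forward them.
import json
import time

def read_plc_tags():
    # Stand-in for a real protocol driver call; returns raw tag values.
    return {"part_count": 1412, "line_rate": 96.5, "motor_temp_c": 61.2}

def publish(topic: str, payload: str, sink: list) -> None:
    # Stand-in for an MQTT or HTTPS publish; here we append to a list.
    sink.append((topic, payload))

def gateway_cycle(sink: list) -> None:
    tags = read_plc_tags()
    message = json.dumps({"ts": time.time(), "source": "plc_7", "tags": tags})
    publish("site/line3/plc_7", message, sink)

sink = []
gateway_cycle(sink)
topic, payload = sink[0]
print(topic, json.loads(payload)["tags"]["part_count"])
```

In a real deployment, the same cycle would run continuously, and the gateway would buffer messages during network outages so no little data is lost.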
The activity of identifying and connecting with little data at the edge is rarely a one-time event. Indeed, at each iteration of the improvement cycle, users should evaluate any new needed data. This iteration process is necessary to continually build up the data models in support of deeper analysis.
From little data to big data
As the little data becomes available from all types of sources and edge devices, the next question is how to consolidate it, configure it into useful information, and make it available to the OT and IT sides of the business so users can easily access and work with it.
Edge gateways usually provide unidirectional flow of data up to supervisory systems, but they can support bidirectional data flow. Edge computing certainly supports bidirectional data flow but requires users to implement their own security provisions, such as firewalls to make PLCs less vulnerable to attack.
Edge controllers are the most comprehensive solution, suitable for integrating OT data with IT systems, and vice versa (figure 4). Because edge controllers combine deterministic control with general-purpose computing on one device with a built-in firewalled security layer, they have many advantages for edge analytics:
- directly access low-latency source data
- preprocess the data to remove undesirable characteristics
- perform any amount of edge analytics
- transport the data to IT systems using efficient and secure protocols
- enforce native firewall security between the general-purpose computing and the deterministic control
- locally loop results back into deterministic control.
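The preprocess-analyze-loop-back pattern above can be sketched as follows. This is a minimal illustration, assuming a vibration signal smoothed with a rolling mean and flagged against an arbitrary alarm limit; the window size, limit, and readings are all invented for the example:

```python
# Edge-analytics sketch: smooth a noisy signal (preprocessing), flag a
# threshold crossing (analytics), and return the flag so the deterministic
# control side can act on it locally.
from collections import deque

class EdgeAnalytics:
    def __init__(self, window: int = 5, limit: float = 7.0):
        self.samples = deque(maxlen=window)  # rolling window of readings
        self.limit = limit                   # illustrative alarm limit

    def update(self, reading: float) -> bool:
        """Ingest one reading; return True if the smoothed value exceeds the limit."""
        self.samples.append(reading)
        smoothed = sum(self.samples) / len(self.samples)
        return smoothed > self.limit

ea = EdgeAnalytics()
readings = [2.0, 2.2, 2.1, 9.5, 9.8, 9.9, 10.2]
flags = [ea.update(r) for r in readings]
print(flags)
```

Note that the rolling mean suppresses the single-sample spikes: the flag trips only after the smoothed value stays elevated, which is exactly the kind of preprocessing that removes undesirable characteristics before data reaches IT systems.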
With IIoT tools at their disposal, users can aggregate all of the little data into their own big data, hosted on site or in the cloud. Big data in the form of time-based historians and record-based databases is the foundation for detailed analytics.
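For example, aggregation into a time-based store might look like the following sketch, which uses SQLite as a stand-in for a plant historian or cloud time-series database; the schema, tags, and values are illustrative assumptions:

```python
# Aggregate little data from many sources into one time-based store so
# that analytics can query across assets rather than per isolated source.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE history (ts REAL, asset TEXT, tag TEXT, value REAL)")

# Records as they might arrive from edge gateways and controllers.
rows = [
    (1000.0, "press_line_3", "part_count", 1412),
    (1000.0, "press_line_3", "motor_temp_c", 61.2),
    (1060.0, "press_line_3", "part_count", 1431),
]
conn.executemany("INSERT INTO history VALUES (?, ?, ?, ?)", rows)

# Analytics now run against the aggregated store, not each source.
(count,) = conn.execute(
    "SELECT COUNT(*) FROM history WHERE tag = 'part_count'"
).fetchone()
print(count)  # 2
```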
Complete, and repeat, the cycle
With meaningful contextualized data in hand, and with analysis performed at edge controllers or in a central computing or cloud-based system, users can make informed decisions to improve their manufacturing processes. Trials can be run with varied inputs, and users can see results quickly. Based on this more complete information, they can identify new data points that may help with the analysis and buildup of operational models. At this point, they can decide whether to manually optimize their equipment based on the results or automatically apply optimizations.
After achieving initial success, the team will have proved a methodology they can repeat over and over, with increasing efficiency and speed, for applying the scientific method to process improvement.
Breaking it down
Large software projects bring to mind huge initiatives with significant spending over many years, frequently accompanied by cost and schedule overruns. However, there are better ways to find success.
In the software development world, agile frameworks such as Scrum break work into smaller increments and execute them in iterative, time-boxed “sprints.” This is a way to reach a larger goal by executing many smaller and more easily achievable steps.
Similar concepts can be applied to industrial improvement efforts founded on digital transformation. In this case, IT/OT integration projects are broken down to approachable little data tasks and built up into a big data integration solution. Each little data task can be evaluated and validated locally and economically before committing it to the greater big data role.
IIoT and analytics, implemented using edge automation products and practices, offer an evolutionary way of performing common business optimization efforts. A data-driven approach to manufacturing process improvement follows the same logic as traditional methods but is much faster and more efficient.
We want to hear from you! Please send us your comments and questions about this topic to InTechmagazine@isa.org.