1 April 2006
Software: A gentler step-up
By Carlos Moreno
The technology reproduces the process of a person learning how to perform a repetitive task better
Production improvements are often formulaic and yield to mathematical solutions. Such solutions can be structured well enough that generic software tools handle them, and those tools are easy to reapply to a variety of products and processes.
The three optimization approaches that can be applied online are First Principle Models, Sequential Empirical Optimization, and Neural Networks.
In addition, there is the older Design of Experiments, which cannot be applied online. These solutions share a common structure:
- There is a core piece of software that implements the solution for all applications.
- Each application requires the formulation of a Scope to drive the software.
- There are prediction models of all the outputs needed to evaluate performance, each as a function of all the inputs, both adjusted and uncontrolled.
- There is an optimization engine (a search mechanism over the prediction models) to find the combination of adjustment decisions that predicts the best performance in the region where the models are reasonably valid. Thus, the search is only as effective as the domain and quality of the models.
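The core structure above can be sketched in a few lines of Python. This is a toy illustration, not the article's actual software: the cost model, the grid search, and all names are assumptions made for the example.

```python
# Minimal sketch of the generic core: a prediction model plus a search
# engine that finds the best adjustment within the model's valid region.
# The model below is a made-up toy, standing in for a real one.

def predict_cost(adjust, uncontrolled):
    """Toy prediction model: cost as a function of one adjusted input
    (e.g. a setpoint) and one uncontrolled input (e.g. ambient humidity)."""
    return (adjust - 0.6 * uncontrolled) ** 2 + 0.1 * adjust

def optimize(uncontrolled, valid_range=(0.0, 10.0), steps=1000):
    """Search engine: grid search restricted to the region where the
    model is considered valid. The advice is only as good as the model."""
    lo, hi = valid_range
    candidates = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(candidates, key=lambda a: predict_cost(a, uncontrolled))

best = optimize(uncontrolled=5.0)  # best adjustment given current conditions
```

The restriction to `valid_range` is the point of the last bullet: a search outside the region where the model is trusted would produce advice the models cannot back up.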
First Principle Models: Experts in the process and in modeling create the models up front. The models are based on first-principles simulations (physical, chemical, etc.) and are complemented with some coefficients determined empirically. They are then validated empirically in the region of input combinations (adjusted and uncontrolled) of perceived interest.
This solution also applies to proposing standard operating procedure (SOP) values and designing control logic.
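A small sketch of the first-principle idea: the physics fixes the model's form, and one coefficient is fitted from plant data. The cooling-law example and the data points are invented for illustration only.

```python
# First-principle model with one empirically fitted coefficient:
# Newton's law of cooling, T(t) = T_env + (T0 - T_env) * exp(-k * t).
# The physics dictates the exponential form; k comes from plant data.
import math

T_env, T0 = 25.0, 100.0
data = [(1.0, 86.4), (2.0, 75.3), (3.0, 66.3)]  # (time, measured temp) - made up

# Fit k by least squares on the linearized form:
#   ln((T - T_env) / (T0 - T_env)) = -k * t
num = sum(t * -math.log((T - T_env) / (T0 - T_env)) for t, T in data)
den = sum(t * t for t, T in data)
k = num / den

def predict(t):
    """First-principle prediction, validated only near the fitted regime."""
    return T_env + (T0 - T_env) * math.exp(-k * t)
```

The empirical validation step the article describes corresponds to checking `predict` against fresh measurements in the input region of interest before trusting it for optimization.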
Sequential Empirical Optimization (SEO): Up front, it requires neither models nor past operating data. The user can include calculations of certain variables as a function of others, such as an efficiency calculation, a total cost or profit, etc.
Adjustments start at the current practice on the production floor, and the cycles of making new adjustments, collecting operating data, updating the models, and generating new advice converge rather quickly toward the optimum. The continually updated models, built from operating data, are empirical.
The technology conceptually reproduces the process of a person learning how to perform a repetitive task better as he or she gains more experience.
Neural Networks (NN): Again, experts in the process and in NNs create the empirical models in advance. They start by collecting experimental data with broad coverage of combinations of adjusted and uncontrolled inputs, and then fit the very versatile NN models to that data; the models are valid in the region covered by sufficient data.
The models cannot arise from historical data alone because it is unlikely that past data was collected around the optimum.
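A bare-bones sketch of the NN step, in pure Python. The architecture (one hidden layer of tanh units), the training data, and the learning settings are all illustrative assumptions; a real application would use far richer data and tooling.

```python
# Tiny one-hidden-layer network fitted to experimental data that covers
# the input region of interest (here, x in [-1, 1] with target y = x^2).
# Outside that covered region, its predictions are not to be trusted.
import math, random

random.seed(0)
data = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]  # broad coverage

H = 4                                           # hidden units
w1 = [random.uniform(-1, 1) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return sum(w2[j] * h[j] for j in range(H)) + b2, h

def loss():
    return sum((forward(x)[0] - y) ** 2 for x, y in data) / len(data)

lr = 0.05
initial = loss()
for _ in range(2000):                           # stochastic gradient descent
    for x, y in data:
        out, h = forward(x)
        err = out - y
        grad_w2 = [2 * err * h[j] for j in range(H)]
        grad_h = [2 * err * w2[j] * (1 - h[j] ** 2) for j in range(H)]
        b2 -= lr * 2 * err
        for j in range(H):
            w2[j] -= lr * grad_w2[j]
            w1[j] -= lr * grad_h[j] * x
            b1[j] -= lr * grad_h[j]
final = loss()
```

The fit improves only inside the data's coverage, which is the article's point: if past operation never visited the optimum, no amount of fitting to that history can reveal it.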
Design of Experiments (DOE): This, together with First Principle Models, is the oldest method used to achieve optimization.
Production runs with highly structured experimental adjustments take place to collect operating data, which is then fitted to a mathematical model, such as a quadratic equation. DOE has the advantage that when the experimental runs are orthogonally distributed, there is a clear understanding of the individual effects of inputs on outputs, which is very helpful for understanding the behavior of an unknown aspect of the process.
DOE, like other solutions that depend on creating up-front models, cannot find the optimum if it lies outside the region where the models are valid, that is, the region where production runs with experimental adjustments collected operating data. When that happens, a new DOE needs to run.
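The orthogonality advantage can be shown with a two-factor, two-level design. The coded runs and response values below are invented for illustration; the point is that orthogonal design columns let each input's effect be estimated independently.

```python
# Two-factor, two-level DOE with orthogonal coded runs (+1 / -1).
# Because the design columns are orthogonal, the main effect of each
# input is estimated cleanly, independent of the other input.
runs = [(-1, -1), (+1, -1), (-1, +1), (+1, +1)]   # coded adjustments (A, B)
y    = [10.0, 14.0, 12.0, 16.0]                    # measured outputs - made up

n = len(runs)
effect_a = sum(a * yi for (a, b), yi in zip(runs, y)) / (n / 2)
effect_b = sum(b * yi for (a, b), yi in zip(runs, y)) / (n / 2)
# effect_a: average change in output when input A moves from -1 to +1
```

A richer design (center points, more levels) would support fitting the quadratic model the article mentions, but the clean separation of effects comes from the same orthogonality shown here.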
About the Author
Carlos Moreno (email@example.com) is an ISA member and the CEO of Ultramax Corporation. He has a Ph.D. in industrial engineering. Moreno is a senior member of IIE (Institute of Industrial Engineers) and a Fellow of ASQ (American Society for Quality).