Design principles for complex process control
By Jean Vieille
I recently surveyed the on-site development of the control system for a large concentrated solar power station. This “first of a kind” facility offered many control challenges, such as balancing heat in 600-meter-long vaporizers, setting the dry-out point, and adjusting the mirrors’ position with centimeter precision on a target 50 meters away.
In this “research-oriented” (rather than “engineering-oriented”) context, the control system was initially designed on assumptions that were sometimes defied by the reality of the facility and its integration in the actual environment. The facility could not run without the constant help of operators and automation engineers and delivered poor efficiency. Some progress was made by successive adjustments to the control strategy, until things seemed to get worse and worse. Beyond the technical recommendations I made to reverse this tendency and get the control running properly, I found that the root causes of the difficulties were an overstretched organization trying to handle control engineering and commissioning concurrently, and a failure to respect certain principles, which I tried to work out to help the control team get back on track.
Designing control strategies is an exciting and challenging specialty within control engineering. The automation engineer of course needs a solid and diverse technical background that extends from physical measurement to actuation; from feedback control to model-based control; from combinational to sequential logic; from chemistry to thermodynamics; and to many computation, networking, and data management technologies. When it comes to designing a control strategy, this knowledge is essential, but not sufficient, to succeed in delivering a control system that makes the process efficient, safe, and reliable.
The following recommendations are based on lifelong experience in process control, facing the obvious “mistakes” I found during this survey. This is not a comprehensive review, but I hope it will benefit young control engineers. Those more experienced will probably be reminded of their early struggles.
This article is about common sense when applying simple mathematical control functions, algebra, and differential calculus, including the proportional-integral-derivative (PID) controllers readily available in any process controller. One might object that formal mathematical methods exist to handle complex process controls and eliminate old-fashioned PID-based control. On this particular project, we benefited from a powerful simulation platform and a sophisticated model-based control strategy. The simulator, operated by a separate team, could not keep up with process variance. Its developers recommended against using the model-based control: they realized that this theoretical work takes considerable effort to catch up with the process complexity and requires numerous, near-perfect measurements that proved problematic.
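For reference, the kind of PID function block discussed here can be sketched in a few lines of Python. The gains, output limits, and the first-order process in the usage example below are illustrative assumptions, not values from any real loop:

```python
class PID:
    """Minimal discrete PID with output clamping and simple anti-windup."""

    def __init__(self, kp, ki, kd, dt, out_min=0.0, out_max=100.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = (self.kp * error + self.ki * self.integral
               + self.kd * derivative)
        clamped = min(self.out_max, max(self.out_min, out))
        if clamped != out:
            # Anti-windup: do not accumulate integral while saturated.
            self.integral -= error * self.dt
        return clamped

# Hypothetical usage: drive a first-order process toward a set point of 10.
pid = PID(kp=2.0, ki=0.5, kd=0.0, dt=0.1)
pv = 0.0
for _ in range(600):
    u = pid.update(10.0, pv)
    pv += 0.1 * (u - pv)  # first-order process, unit gain, 1 s time constant
```

Everything that follows builds on this humble function block rather than on heavier mathematical machinery.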
A complex process is not a complicated process with a large number of measurements and actuators. It is complex because its variables are highly coupled and correlated. Typical examples are steam generators, distillation columns, and fiber extrusion. It is when this systemic complexity increases with nonlinearity and long, high-order time constants that control becomes particularly challenging. Though the following principles are reasonable in any circumstances, they apply specifically to such contexts, where ignoring them generally compromises the success of the control effort.
Robustness, inner stability preference, process symbiotic attitude
As much as possible, control should use the natural dynamics of the system; it should not try to tightly drive process values that are dependent on other variables that are already under control. Double control is the recipe for instability and unpredictable behavior.
Control engineers must first focus on gaining a deep understanding of the process behavior. For me, this includes staring at multitrend diagrams for hours until I get a sound explanation of an odd spike. It also keeps me from being overconfident that I have ever totally got it. The simple laws of physics produce puzzling outcomes when playing together in a bounded spatiotemporal facility.
Least use of controllers
Every controller increases the system order and the risk of instability—the fewer controllers, the better.
Least use of instrumentation
Instruments are the weakest part of control. They are subject to breakdown and errors—the fewer the instruments, the more robust and reliable the control.
Cascade control
Use cascade control in two circumstances:
- To improve the linearity of the process response. For example, a temperature controller gives a set point to a slave flow controller instead of directly driving the gas control valve.
- To split the time constant of a slow process. If the time constant of the main loop is not significantly reduced by the cascade, or if the slave loop degrades the linearity of the process (which happens), the cascade slave loop must be avoided.
As cascade control increases the number of controllers, decreasing the process stability, the default option is none.
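The first circumstance above can be sketched as two nested PI loops, with the temperature master writing the set point of a flow slave instead of driving the valve itself. The process model and tunings below are invented for illustration:

```python
def pi_step(state, kp, ki, dt, sp, pv, lo=0.0, hi=100.0):
    """One step of a clamped PI controller; `state` holds the integral term."""
    error = sp - pv
    state["i"] = min(hi, max(lo, state["i"] + ki * error * dt))
    return min(hi, max(lo, kp * error + state["i"]))

dt = 0.1
master = {"i": 0.0}  # temperature controller, outputs a flow set point
slave = {"i": 0.0}   # flow controller, outputs a valve position
flow, temp = 0.0, 20.0

for _ in range(2000):
    flow_sp = pi_step(master, kp=1.5, ki=0.2, dt=dt, sp=80.0, pv=temp)
    valve = pi_step(slave, kp=2.0, ki=1.0, dt=dt, sp=flow_sp, pv=flow)
    flow += dt / 0.5 * (valve - flow)               # fast valve-to-flow dynamics
    temp += dt / 10.0 * (20.0 + 0.8 * flow - temp)  # slow thermal dynamics
```

The slave loop linearizes and speeds up the inner flow response, so the master sees a simpler plant; if it did neither of those things, it would only be adding a controller.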
Override control
Override control allows several process variables and associated controllers to drive the same manipulated variable, depending on the process conditions (e.g., controlling the pressure during startup, then switching to flow control in steady state). The same can be done with a single controller by switching the process variable errors instead.
Override control may cause instability because of the nonlinearity provoked by the switchover. Preventing switchover bumps may be tricky.
Unless the process has the same dynamics for each process variable, separate controllers should be used to allow specific tuning for each.
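A minimal sketch of such an override scheme uses a low-signal selector between the two controllers, plus a back-calculation step to limit switchover bumps. The process gains, set points, tunings, and the back-calculation factor are all illustrative assumptions:

```python
def pi_step(state, kp, ki, dt, sp, pv, lo=0.0, hi=100.0):
    """One step of a clamped PI controller; `state` holds the integral term."""
    error = sp - pv
    state["i"] = min(hi, max(lo, state["i"] + ki * error * dt))
    return min(hi, max(lo, kp * error + state["i"]))

dt = 0.1
press_ctl = {"i": 0.0}  # pressure controller, governs during startup
flow_ctl = {"i": 0.0}   # flow controller, governs in steady state
pressure, flow = 0.0, 0.0

for _ in range(3000):
    u_press = pi_step(press_ctl, kp=4.0, ki=0.5, dt=dt, sp=5.0, pv=pressure)
    u_flow = pi_step(flow_ctl, kp=2.0, ki=1.0, dt=dt, sp=50.0, pv=flow)
    valve = min(u_press, u_flow)  # low select: the lower demand wins
    # Back-calculate the losing controller's integral toward the selected
    # output, so its demand tracks the valve and the switchover is bumpless.
    for ctl, u in ((press_ctl, u_press), (flow_ctl, u_flow)):
        if u > valve:
            ctl["i"] += 0.5 * (valve - u)
    pressure += dt / 2.0 * (0.08 * valve - pressure)  # bar
    flow += dt / 1.0 * (0.9 * valve - flow)           # t/h
```

Early on, the pressure controller demands the smaller valve opening and governs; once the flow approaches its set point, the flow controller's demand drops below it and takes over, while the pressure settles below its limit.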
Multiple input control
Multiple input control manipulates a single variable in an attempt to keep several controlled variables within appropriate ranges. For example, the boiler steam outlet valve position may be computed from temperatures at the economizer outlet, near the dry-out point, and at the first super-heater section; from the actual and target flow; from the pressure set point and measurement; and from diverse tuning parameters. Do not laugh; I have seen it!
This looks like an appealing approach for mathematically minded control engineers as long as each variable is considered independently. Because these variables are coupled (as they are in complex processes), the process behavior may become unpredictable and unstable. Such elegance must be carefully considered and justified.
Multiple controlled input
Multiple controlled input is the opposite of the previous type of control. The same controlled variable drives several control loops and manipulated variables. The superheat temperature of a once-through boiler might be controlled by adjusting the inlet water flow (for slow adjustment of the water inventory and dry-out point) and the generator’s power (for rapid adjustment of the temperature within the expected range).
Make sure that enough separation exists between the respective operating domains, and characterize the dynamics to prevent resonance.
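One classical way to realize this separation is midranging, offered here as an illustration rather than as the boiler's actual strategy: a fast PI trims the generator's power on the superheat temperature, while a much slower PI moves the feedwater flow to bring the power command back toward mid-range. All dynamics and tunings below are invented:

```python
def pi_step(state, kp, ki, dt, sp, pv, lo=0.0, hi=100.0):
    """One step of a clamped PI controller; `state` holds the integral term."""
    error = sp - pv
    state["i"] = min(hi, max(lo, state["i"] + ki * error * dt))
    return min(hi, max(lo, kp * error + state["i"]))

dt = 0.1
fast = {"i": 50.0}  # temperature -> generator power, starts at mid-range
slow = {"i": 50.0}  # power command -> feedwater flow
temp, power, water = 400.0, 50.0, 50.0

for _ in range(5000):
    # Fast loop: hold the superheat temperature by trimming the power.
    power = pi_step(fast, kp=2.0, ki=0.5, dt=dt, sp=420.0, pv=temp)
    # Slow loop: steer the power command back toward 50 percent by moving
    # the water inventory, preserving authority in both directions.
    water = pi_step(slow, kp=0.2, ki=0.02, dt=dt, sp=50.0, pv=power)
    # Invented static gains: more power raises the temperature, more water lowers it.
    temp += dt / 5.0 * (300.0 + 1.2 * power + 0.8 * (100.0 - water) - temp)
```

The order-of-magnitude gap between the two integral gains is what keeps the operating domains apart: the loops never fight over the same frequency band.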
Model-based/feedforward control
Process modeling and characterization allows engineers to calculate and implement transfer functions that can correct a disturbance by doing the correct math to compute a manipulated variable. This is an open loop (disturbance > math > manipulated variable > PROCESS > process variable), compared to a closed (i.e., PID) feedback loop (process variable > math > manipulated variable > PROCESS > process variable).
Model-based/feedforward control provides an inherently stable and fast response to disturbances. It anticipates process variable deviation by compensating for the disturbance effect before it occurs. This works only if the model is robust and stable enough for a globally positive impact on control performance. If these conditions are not met, control will suffer.
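A sketch of feedforward with feedback trim makes the point: the measured disturbance is compensated in open loop through a static model, and a PI loop corrects the residual model error. The model gain, loop tunings, and the deliberate model mismatch below are all assumptions:

```python
def pi_step(state, kp, ki, dt, sp, pv):
    """One step of an unclamped PI controller; `state` holds the integral term."""
    error = sp - pv
    state["i"] += ki * error * dt
    return kp * error + state["i"]

dt = 0.1
fb = {"i": 50.0}   # feedback trim, started at the equilibrium output
pv = 50.0
model_gain = 1.8   # assumed disturbance gain; the "true" process gain is 2.0
peak = 0.0         # largest deviation from set point after the disturbance

for step in range(3000):
    disturbance = 10.0 if step >= 1000 else 0.0
    u_ff = -model_gain * disturbance               # open-loop compensation
    u_fb = pi_step(fb, kp=1.0, ki=0.3, dt=dt, sp=50.0, pv=pv)
    pv += dt / 2.0 * (u_ff + u_fb + 2.0 * disturbance - pv)
    peak = max(peak, abs(pv - 50.0))
```

In this toy example, the disturbance is worth 20 units on the process variable; the imperfect model cancels 18 of them before they reach the process output, and the feedback trim removes the residual. A model that drifted the other way would instead feed the loop a disturbance of its own making.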
Negentropic, lean development
Entropy is a measure of disorder, a natural tendency of the physical world to degenerate over time. Entropy is linked to the future, not the past. Despite this apparently lethal fate, the world keeps progressing thanks to negentropy, the reverse-entropy power of properly handled information—one of the basic aspects of life.
Control development does not escape the rule: it naturally tends toward disorder by successive fixes that only address the symptoms of issues, not the root causes, and by not cleaning up useless code.
The progress of control development can be quantitatively tracked by the evolution of the code volume. Once all features have been provided, as soon as commissioning and optimization start, the code volume should decrease or, at worst, hold steady. Either an existing feature becomes simpler after an update, or a new feature replaces one of equal or greater complexity.
Lean thinking has been shaping mentalities in manufacturing operations for the past three decades; control engineers might follow this idea too. To summarize all the above, if a control strategy idea makes the system simpler, it is possibly heading in the right direction. When it makes it more complicated, the idea is probably wrong.