Improving analytics to meet process industry demands
Moving analyzers from the lab to a field-based shelter to the pipe will cut costs, reduce maintenance, and improve operations.
By Marcus Trygstad
Creating products from raw materials in pharmaceutical, petrochemical, and other process plants depends on intricate process systems executing numerous steps or unit operations. To ensure efficiency, quality, and safety, precise measurements must be taken during the various stages of production.
These measurements have typically been made using instruments and analyzers. These devices supply information to plant personnel in control rooms and laboratories. Instruments are used to measure basic parameters, such as pressure, temperature, flow, and level. Sophisticated analyzers make more complex measurements concerning the composition of one or more components in a liquid or gas chemical stream.
Both instrument and analyzer data are used as inputs to automation systems for the purpose of monitoring and controlling the process.
Instrument technology has been used for decades and is highly refined, enabling these devices to be directly installed in pipes and vessels, delivering reliable readings with minimal maintenance.
Analyzers for some basic measurements have advanced to the level of instruments in terms of ease of use and reliability. In most cases, analyzers and related sample systems remain quite complex in terms of installation, maintenance, and operations – but recent advancements are changing this paradigm.
Instruments generally measure basic intrinsic properties, such as temperature, pressure, or density, or extrinsic properties like level, mass, and flow. Generally, their response function is a single-variable (univariate) intensity reading, which may involve a zero and span calibration. This is the simplest form of quantitative analysis used in process plants because it measures a single variable and its attributes.
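A zero-and-span calibration of the kind mentioned above amounts to fitting a straight line through two reference points. The sketch below is illustrative only; the function name and raw-count values are assumptions, not taken from any particular transmitter:

```python
def calibrate_zero_span(raw_zero, raw_span, span_value):
    """Return a function mapping raw sensor counts to engineering units,
    given readings taken at a zero reference and a known span reference."""
    gain = span_value / (raw_span - raw_zero)
    return lambda raw: (raw - raw_zero) * gain

# Illustrative values: a pressure transmitter spanning 0-100 psi
to_psi = calibrate_zero_span(raw_zero=400, raw_span=2000, span_value=100.0)
print(to_psi(1200))  # midpoint of the raw range -> 50.0
```

In practice the transmitter performs this mapping internally; the keypad or remote configuration simply captures the two reference readings.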
Simple specialized online sensors are typically available in a sensor-transmitter configuration in which a sensor module interfaces directly to a process. Commonly, the sensor connects to a hardened, field-installed transmitter. The transmitter usually has a local display and an operator keypad to facilitate configuration, calibration, and diagnostics, and it may be physically separated from the sensor or integrated into it. Examples are simple sensors that measure conductivity, pH, hydrogen, and other parameters.
Advanced analyzers are much more complex, having a multivariate response function, such as a plot of intensity versus time, like a chromatogram from a gas chromatograph. The response function can also be a plot of intensity versus frequency/wavelength, such as a spectrum recorded by a Fourier Transform Infrared, Near-Infrared (NIR), or Raman spectrometer. Typically, such advanced analyzers require sample systems that condition and control characteristics of the sample stream delivered to the analyzer, such as temperature, pressure, flow, and particle loading.
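To illustrate the difference from a univariate reading, a multivariate response such as a chromatogram is typically reduced to a quantitative result by integrating the area under a component's peak. The sketch below uses simple trapezoid-rule integration over illustrative data; it is not any vendor's algorithm:

```python
def peak_area(times, intensities, baseline=0.0):
    """Integrate a detector trace (intensity vs. time) by the trapezoid
    rule; area above the baseline is proportional to component amount."""
    area = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        avg = ((intensities[i] - baseline) + (intensities[i - 1] - baseline)) / 2.0
        area += dt * avg
    return area

# Illustrative triangular peak rising to 10 and falling over 4 seconds
print(peak_area([0, 1, 2, 3, 4], [0, 5, 10, 5, 0]))  # -> 20.0
```

Real chromatographic software adds baseline tracking, peak detection, and calibration against standards, but the core reduction from a multivariate trace to a single number follows this pattern.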
Photometers and continuous emissions monitoring systems reside between simple and advanced analyzers, having a simple response function but often requiring some level of sample conditioning.
In decades past, analytical measurements were performed offline. This required taking a sample from the process, and then analyzing it in a laboratory or other location, causing significant delays and often degrading accuracy.
Over time, the process industries began to see the value of online analysis. In addition to saving time, online analysis allows better control of the process – improving quality and throughput, while reducing waste and cutting energy use. Using online analysis, abnormalities can be remedied through feedback and feed forward control, allowing corrections to be made much faster. These and other advantages of online analysis are listed in Table 1.
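As a rough illustration of how feedback and feedforward corrections combine, the sketch below applies a proportional feedback term to the measured error and a feedforward term to a measured upstream disturbance. The gains, variable names, and values are purely illustrative assumptions:

```python
def control_correction(setpoint, measurement, disturbance, kp=0.8, kf=0.5):
    """Combine feedback (acting on the measured error) with feedforward
    (pre-compensating a measured disturbance before it reaches the product)."""
    feedback = kp * (setpoint - measurement)   # react to observed error
    feedforward = -kf * disturbance            # counter the upstream change
    return feedback + feedforward

# Illustrative: product purity reads 94% against a 95% target while an
# online analyzer detects a +2-unit shift in feed composition.
print(control_correction(setpoint=95.0, measurement=94.0, disturbance=2.0))
```

The point of online analysis is that the `disturbance` term becomes available minutes or hours sooner than a laboratory result would allow, so the correction lands before off-spec product accumulates.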
Information supplied by analyzers about the properties or composition of unit feed streams, product streams, or within-unit process streams can also enable companies to meet federal and state regulations, maximize catalyst life, minimize unit maintenance, and more.
The laboratory origins of online process analyzers
Most online process analyzers on the market today originated in laboratories. They basically represent analytical lab technology that has been packaged to operate unattended in a process environment. This explains the common requirement for sample conditioning, similar to the work of technicians in the laboratory.
The laboratory paradigm often carries through to the environment where process analyzers are implemented, such as the shelter or shed in place of a laboratory, with internal environmental conditions often closely controlled. Like their lab counterparts, these process analyzers generally require a high level of care to maintain reliability and accuracy. Analyzer technicians working in shelters ensure the continued reliable operation of process analyzers by performing maintenance, calibration, and consumables replenishment.
A sample conditioning system (SCS) is typically used to divert, deliver, and return a small quantity of gas or liquid from the process stream to the analyzer. SCS operation is a key determinant of analyzer reliability, but these systems present their own challenges in terms of maintenance and operation.
Commonly, the analyzer must be packaged for strict area classification requirements to ensure, for example, that the analyzer system cannot serve as an ignition source. Furthermore, the sheds or shelters often need gas detectors to protect personnel by sensing dangerously low levels of oxygen – or the presence of hydrocarbons, hydrogen, or other explosive gases.
A modern SCS is often expensive and can be more complex to design, install, and operate than its associated analyzers. Instead of operators capturing samples and delivering them to the laboratory, sample probes inserted into a pipe, vessel, or reactor continuously obtain a representative process sample. Samples are delivered to analyzers and returned to the process via tubing ranging from a few meters to a few hundred meters in length.
Pumps are often required to provide the pressure necessary for the sample to flow. The sample transfer lines are referred to as “fast loops” because the sample flows quickly through them, minimizing the lag between removal through the sample probe and delivery to the analyzers. The sample lines are then routed back to the point from which the sample was removed to quickly re-inject the sample into the process.
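The lag a fast loop is designed to minimize can be estimated directly: it is the tubing volume divided by the volumetric flow rate. A back-of-the-envelope calculation with assumed tubing dimensions and flow:

```python
import math

def fast_loop_lag_s(tube_id_mm, length_m, flow_l_per_min):
    """Estimate sample transport lag as tubing volume / flow rate."""
    radius_m = (tube_id_mm / 1000.0) / 2.0
    volume_l = math.pi * radius_m ** 2 * length_m * 1000.0  # m^3 -> liters
    return volume_l / (flow_l_per_min / 60.0)               # seconds

# Illustrative: 6 mm ID tubing, 50 m run, 2 L/min fast-loop flow
print(round(fast_loop_lag_s(6.0, 50.0, 2.0), 1))  # roughly 42 seconds
```

Even this idealized number shows why long sample runs demand high loop flows, and why moving the analyzer to the pipe, eliminating the run entirely, is so attractive.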
Lines may need to be heat traced to keep the viscosity of liquid samples low, or to prevent condensation in the case of gas samples. Once at the location of the analyzers, samples may require filtering and heating or cooling before being sent through smaller lines to the analyzers.
For calibration and validation, valves in the SCS open and close to select and deliver to the analyzers the required standard or validated reference sample liquids or gases contained in cylinders or tanks. These procedures must be performed on a periodic basis to maintain accuracy and meet regulatory requirements.
The established paradigm for process analytics is based on packaging laboratory analyzer technologies for field use: installation in a shelter, with interfaces to the process through elaborate sampling systems that transport and condition samples for analysis. This paradigm delivers reliable measurements, but at a high initial investment and a high ongoing cost to calibrate and maintain the analyzers and the sampling infrastructure. Though it has served the industry well, many have asked whether a better way is available and, if so, what it is. Examining the old paradigm suggests three keys to improving the future of process analytics: standardized sampling systems, improved analyzers, and pipe-centric installation.
Standardized sampling systems
In 2000, the New Sampling/Sensor Initiative (NeSSI) took direct aim at sampling systems. Sponsored by the Center for Process Analysis and Control at the University of Washington, Seattle, NeSSI represents collaboration between users, vendors, and researchers. Its goal is to facilitate the acceptance and implementation of miniature, modular sample systems. This has resulted in the ISA-76 standard and promotion of smart analytical systems through close integration of flow, pressure, and temperature with analytical sensors.
Traditional analyzer installations require complex sampling systems with custom-designed tubing, gauges, and instruments. NeSSI-based sampling systems are much simpler and more standardized, and also have a much smaller footprint. Perhaps more important, however, is their improved reliability and corresponding reduction in total cost of ownership, facilitated through smart components that permit remote assessment of sampling system health.
As significant as these benefits are, the enduring legacy of NeSSI may be that it fostered the development of miniaturized sensors that down-mount onto NeSSI sampling substrates. Although not all current or future sensor technologies can or need to interface to the process through NeSSI substrates, the market can be expected to embrace those that do. Ultimately, the driver behind NeSSI and related sensor development is lowering the total cost of ownership through improved ease of implementation and greater reliability.
A common characteristic of the new generation of emerging analytical technologies is they are not merely automated incarnations of offline analyzers. Rather, they are designed from the ground up for implementation on, at, or near the process, employing process interfaces optimized for a given measurement application. As with NeSSI sampling systems, new sensor technologies incorporate diagnostic routines to automatically validate their proper function and to answer the most basic question asked by process engineers and operators, namely: “Can I trust this reading?”
These improvements also reduce the need for routine, in-person inspection, while identifying the progressive degradation of performance of system elements, which enables targeted preventative maintenance. In the event of unexpected component failure, information from smart analyzer systems can enable remote problem diagnosis, speeding and simplifying repair.
Smart, compact, reliable, self-validating analyzers and sampling systems certainly offer significant value over traditional technologies. However, they merely improve upon the old paradigm, whose predominant visible characteristic is shelter-centricity.
A revolution in process analytics will occur as that paradigm is progressively abandoned in favor of one wherein analyzers are installed at or near the sample tap. This pipe-centric approach eliminates the need for expensive shelters and minimizes or eliminates the associated SCS used for transporting process sample streams.
Remote validation of the proper function and operation of sensor and sampling systems is especially important in the pipe-centric paradigm, where analyzers sit far from any technician's routine rounds. Whether in or out of a shelter, advances in analyzer and sampling technology are critical for improving measurement reliability and reducing the total cost of ownership.
Improved sensor and sampling technology alone are not revolutionary, but they are the foundation of the new paradigm. Without greatly improved reliability and ease of maintenance, analyzer systems deployed at or near the sample tap, outside of shelters, would often fail through neglect. Migrating the measurement from shelter to pipe is therefore futile without sample interfaces that are reliable and smart, and those interfaces are pointless unless paired with sensor technologies that are correspondingly reliable and smart.
New analyzer technologies: specific and robust
New online analyzer technologies perform detection and measurement based on the specificity of the physico-chemical sensor toward the property or analyte of interest. Present-day examples include solid-state hydrogen sensors and laser-based technologies, such as spectrometers employing tunable diode lasers, quantum cascade lasers, and cavity ring-down techniques. The progenitor of such high-tech sensors is the pH electrode.
Though the analysis of the spectral data normally depends on sophisticated mathematics, the analyzer output is a measured response of the sensor to an analyte that is unique, selective, and direct. The result is a first-principles response based on fundamental physical and chemical principles. In most cases, this enables specialized online analyzers to be mounted at the pipe or vessel, minimizing or eliminating the need for an SCS.
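A classic example of such a first-principles response is the Beer-Lambert law used in absorption spectroscopy, in which absorbance relates linearly to analyte concentration. The sketch below uses purely illustrative coefficients:

```python
def concentration_from_absorbance(absorbance, molar_absorptivity, path_cm):
    """Beer-Lambert law: A = epsilon * c * l, solved for concentration c."""
    return absorbance / (molar_absorptivity * path_cm)

# Illustrative: A = 0.45 at an analyte-specific band,
# epsilon = 1500 L/(mol*cm), 1 cm optical path -> c = 3e-4 mol/L
print(concentration_from_absorbance(0.45, 1500.0, 1.0))
```

Because the relationship rests on a physical law rather than an empirical model, such a sensor needs no continual re-modeling against laboratory data, which is part of what makes pipe-mounted deployment practical.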
Specialized online analyzers are also usually much simpler and less costly than complex gas chromatograph (GC) and NIR analyzers in terms of purchase, installation, operation, and maintenance (Table 2). In the best case, these specialized online analyzers will evolve and become as easy to implement, operate, and maintain as standard pressure, level, flow, and temperature instruments.
By contrast, the selectivity of modern advanced GC and NIR analyzers is obtained through sophisticated sample handling in the case of a GC, or deduced through sophisticated mathematical modeling in the case of NIR analyzers.
The inherent selectivity of specialized online analyzers reduces the complexity and size of the sensor itself. It also often eliminates the need for an SCS, or at least substantially reduces its cost, size, and complexity.
The evolution of process analytics and related new technologies is blurring the distinctions between instruments and analyzers, in large part due to the progressive advancement of analyzers toward specificity/selectivity of their sensors in response to a target analyte. These improvements will lead to more widespread use of analyzers, with accompanying improvement in process plant operations.
ABOUT THE AUTHOR
Marcus Trygstad (firstname.lastname@example.org) is business development manager, Advanced Analytical Solutions, Yokogawa Corporation of America. Before joining Yokogawa, Trygstad held positions in product development and product management at Emerson, ABB, and Invensys, as well as with several small technology companies. He focuses on developing reliable online measurement strategies for refining and petrochemical processing by leveraging conventional technologies in unconventional ways. Although he has a strong orientation toward applied spectroscopy, Trygstad has also developed multivariable analytical strategies based on chromatographic and simple transmitter-sensor technologies. He is the principal author on eight patents and patent applications.