01 April 2003
An exacting challenge
New process allows for tighter, more accurate readings.
By J. R. Madden
While it may seem primitive and a far cry from today's technologically advanced equipment, the first true temperature measurement came when man put his hand to fire, felt the heat, and decided it would be wiser to keep his hand out of the inferno blazing in front of him. From that point on, temperature technology evolved, and the industry has credited contributions from such men as Galileo, Gabriel Fahrenheit, Anders Celsius, and Lord Kelvin.
Today, measuring temperature has come a long way and continues to improve to meet industry demands. Filled systems have led to electronic models; 8-bit circuit designs of the 1960s have given way to the 22-bit designs of the twenty-first century; the 3–15 pounds per square inch, gauge, pneumatic signals that transmitters once maintained have given way to standard 4–20 mA analog signals; and now digital communication protocols are making significant headway.
Close to perfection
Despite the considerable improvements and sophistication of the technologies employed today, engineers are still facing significant challenges to attain the most accurate readings. The source of their uncertainties is unfortunately not limited to one factor but is rather derived from almost every component of the temperature measurement process.
Temperature is one of the most common process measurements, but it is also one of the most misconstrued. Engineers must contend with input accuracy, electronic noise, cold junction compensation, and the many other factors that affect the precision of their temperature transmitters. It is also important not to overlook the inaccuracies caused by the sensors, for a temperature sensor is most likely the biggest source of inaccuracy and can ultimately determine the overall effectiveness of a temperature system.
When selecting a sensor, whether you are seeking a thermocouple, thermistor, or RTD, you should consult a myriad of experts to determine the best solution for your particular application and level of accuracy.
However, with accuracy as the primary goal, there are a few factors to consider.
Thermocouples are rugged, are simple to use, and offer a broad temperature range. However, a thermocouple's accuracy is inherently off by 4° to 8°F from the start. As its electrical and metallurgical properties change over its life, its accuracy decreases further. Harsh field environments and constant cycling between heating and cooling push the unit to the extremes of its range and accelerate the aging process. With few exceptions, a resistance temperature detector (RTD)—in particular, a four-wire RTD—will provide a much more accurate, stable temperature measurement than a thermocouple at temperatures up to 950°F.
|Sensor-to-transmitter trimming section|
|Trimming a transmitter allows a complete range to be monitored while placing measurement emphasis on a specific segment of the range that is most critical to the process.|
The other side
Inaccuracies from an RTD may come from lead wire imbalances caused by terminal block corrosion, connector corrosion, extension wire splices, thermal shock, and variations in lead length and wire gauge. The result can be significant: for every ohm of imbalance in an RTD's lead wires, as much as 2.5°C of measurement error may occur.
Neither a two- nor three-wire RTD will compensate for differences in lead resistance or detect corrosion, although a three-wire RTD will compensate for lead wire length if each lead is exactly the same resistance. A four-wire RTD, used properly with a temperature transmitter that accepts a true four-wire RTD input, will compensate for unequal lead lengths as well as lead resistance and corrosion buildup.
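As a rough sketch of why the wiring scheme matters, the snippet below converts uncompensated lead resistance into a temperature error for a nominal Pt100 element (about 0.385 ohm per °C near 0°C, per IEC 60751); a single ohm of imbalance works out to roughly 2.6°C, in line with the 2.5°C figure above. The specific lead resistances are illustrative assumptions, not values from any particular installation.

```python
# Sketch: approximate temperature errors from lead resistance for a
# nominal Pt100 RTD under 2-, 3-, and 4-wire measurement schemes.
# The sensitivity and lead resistances below are illustrative values.

PT100_SENS = 0.385  # ohms per degC near 0 degC (IEC 60751 nominal)

def temp_error(resistance_error_ohms):
    """Convert an uncompensated resistance error to a temperature error."""
    return resistance_error_ohms / PT100_SENS

def lead_errors(r_lead_a, r_lead_b):
    """Uncompensated resistance seen by each wiring scheme."""
    return {
        # 2-wire: both lead resistances add directly to the element.
        "2-wire": r_lead_a + r_lead_b,
        # 3-wire: compensation cancels the leads only if they match;
        # any imbalance between them remains as error.
        "3-wire": abs(r_lead_a - r_lead_b),
        # 4-wire (Kelvin): separate sense leads carry no measurement
        # current, so lead resistance drops out entirely.
        "4-wire": 0.0,
    }

for scheme, r_err in lead_errors(1.0, 1.5).items():
    print(f"{scheme}: {temp_error(r_err):.2f} degC error")
```

Run with two slightly mismatched one-ohm-class leads, the sketch shows the two-wire scheme absorbing the full lead resistance, the three-wire scheme absorbing only the mismatch, and the four-wire scheme none at all.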
Typically, a four-wire RTD yields its best measurements when the total of the RTD's resistance, corrosion, and wire resistance is less than 2,000 ohms. Costing about the same as a three-wire RTD, a four-wire RTD is an intelligent investment to increase the accuracy of the temperature system.
Compared with a thermocouple and an RTD, a thermistor demonstrates the largest resistance change with temperature. However, with this high level of sensitivity comes a significant loss of linearity. Thermistor curves have not been standardized to the degree that thermocouple and RTD curves have.
With cost and accuracy in mind, why even bother putting a temperature transmitter in the equation at all? One may argue that using extension wires to carry sensor signals to the control room would cost less and essentially eliminate the unnecessary errors that a transmitter might introduce.
In the past, high costs associated with transmitters may have been a justifiable reason to forgo them, but today's transmitters, even those fully loaded with advanced capabilities, are comparable in cost to direct wiring strategies. In fact, when you take into account the potential problems caused by inaccurate readings, wiring becomes an even more costly option than using temperature transmitters. Using a temperature transmitter will improve the accuracy of the entire system, providing added value and savings over time.
For example, the presence of electromagnetic interference is a reality in the majority of process facilities. With two-way radios, heating elements, static discharge, and even fluorescent lighting, the effect on a process ranges from a minor inconvenience to complete process failure.
Most temperature transmitters are equipped with advanced electromagnetic interference/radio frequency interference protection, as well as other components to guard against troublesome environmental factors. On the other hand, the wires used to transfer RTD or thermocouple signals in a direct wiring configuration may actually act as an antenna, attracting plant noise that can quickly degrade both the signals and the measurement accuracy.
|Temperature transmitters vs. trimming section|
|Sensor extension wires carry low-level (ohm or millivolt) signals generated by a field-mounted RTD or thermocouple.|
A temperature transmitter amplifies and conditions the primary sensor signal, then carries it over a twisted-pair wire to the control room.
Fact or fiction?
Assessing the level of accuracy from a vendor's spec sheets can be a trying experience. Vendors use numerous methods to present the information.
Just check out the fine print. Some use the term "linearity" interchangeably with "accuracy." Sometimes specifications come in terms of a particular temperature range, temperature reading, or measurement span. Others include linearity and repeatability in their results and make assumptions about the ambient operating conditions of the application. The problem is that buyers rely on these documents to ensure each instrument has undergone the proper design, testing, and manufacturing for optimum accuracy.
To establish the accuracy of the system, add the accuracy rating of the sensor to the accuracy rating of the transmitter. Temperature transmitters use ideal sensor curves to convert a sensor's signal to a temperature reading. Though sensors are designed to match the ideal curve, each unit, even a high-precision sensor, will vary slightly from the stated specification. Thus, error is inevitable. A user must consider all possible sources of error when determining accuracy. Errors from a sensor, coupled with errors from a transmitter, compound into a larger overall measurement error.
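The summing rule above can be sketched in a few lines. The transmitter rating comes from the figure quoted later in this article; the sensor rating is an illustrative assumption (it roughly matches an IEC 60751 Class B platinum element at 0°C), so treat the numbers as an example, not a specification.

```python
# Sketch of the worst-case system accuracy estimate described above:
# the component accuracy ratings simply add. Example figures are
# illustrative; real ratings come from vendor spec sheets.

def system_accuracy(sensor_err_degC, transmitter_err_degC):
    """Worst case: sensor and transmitter errors add directly."""
    return sensor_err_degC + transmitter_err_degC

# e.g. a +/-0.3 degC sensor with a +/-0.13 degC transmitter
print(system_accuracy(0.3, 0.13))  # worst case +/-0.43 degC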
A high-quality temperature transmitter is capable of delivering accuracy ratings to within ±0.23°F (±0.13°C) when used with a common platinum RTD over a 200° span. However, a transmitter can be matched precisely to a sensor using sensor-to-transmitter trimming to achieve a higher level of accuracy. This technique provides an effective way for the user to compensate for the sensor's output deviation from the ideal curve by calibrating the temperature transmitter and sensor together as a unit, rather than as separate components. As a result, engineers can take advantage of better performance, greater reliability, and reduced maintenance costs from their processes.
Simply stated, temperature calibration involves comparing the output of a temperature measurement system with a known precision standard. During the process, the system notes the differences and then makes modifications so the instrument reflects the correct temperature.
Using the sensor-to-transmitter trimming technique, a user immerses the sensor in a temperature bath and keeps it at a stabilized temperature throughout the entire process, performing the calibration under closely controlled ambient conditions.
For temperatures under 230°F (110°C), a portable fluid bath is commonly used. For higher temperatures, a fixed fluid bath or a dry block calibrator, with a certified standard probe as a reference point, is used.
The transmitter captures two input readings from the sensor, which represent the upper and lower temperature values, and then stores them in its nonvolatile memory. Using these values, the transmitter will compensate for deviations between the sensor's stated linearization curve and its actual measured values.
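The two-point compensation described above amounts to a simple linear correction between the stored trim points. The sketch below illustrates the idea in software; the raw readings and trim offsets are hypothetical, and a real transmitter applies the equivalent correction internally to resistance or millivolt inputs.

```python
# Sketch of a two-point sensor trim: record the sensor's actual output
# at two known bath temperatures, then map later raw readings onto the
# true scale by linear interpolation between the stored points.
# All numbers here are hypothetical.

def make_trim(raw_low, true_low, raw_high, true_high):
    """Return a correction function built from two trim points."""
    slope = (true_high - true_low) / (raw_high - raw_low)
    def corrected(raw):
        return true_low + slope * (raw - raw_low)
    return corrected

# Hypothetical trim: sensor read 0.4 degC high at the low bath point
# and 0.9 degC high at the high bath point.
trim = make_trim(raw_low=0.4, true_low=0.0, raw_high=100.9, true_high=100.0)
print(trim(50.6))  # roughly 49.95 degC
```

At the two trim points the correction is exact; in between, the deviation from the ideal curve is assumed to vary linearly, which is why trimming over the most critical segment of the range pays off.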
You can trim some temperature transmitters to respond to two data points within the selected zero and span measurement range. This allows you to monitor a complete range while placing measurement emphasis on a specific segment of the range that is most critical to the process.
The capabilities of smart transmitters have made digital outputs the preferred signal over analog. Obtaining data directly from a digital signal eliminates potential errors caused by digital-to-analog (D/A) converters. Ideally, if a distributed control system or programmable logic controller read the signal directly from the 20.5-bit analog-to-digital converter, no accuracy would be lost to a subsequent D/A output conversion.
Following the calibration process, the system generates a comprehensive report. This will detail the calibration results of the entire system, certify that the calibration measurements are traceable to an accredited metrology institute such as the National Institute of Standards and Technology, and verify that the method was conducted in a manner that adheres to the required calibration procedures established by the company and complies with industry standards. Using the sensor-to-transmitter trimming method, the typical accuracy for RTD systems is ±0.025°F (±0.014°C) and ±2°F (±1.1°C) for thermocouples.
Technology is constantly improving to respond to the growing needs of a plant. Though we are able to make temperature measurements faster and better than ever before, the challenge to achieve the most accurate measurement continues.
Accuracy is a complicated subject—the word itself is a misnomer. Accuracy is normally expressed as inaccuracy, meaning you won't find a manufacturer's spec sheet that boasts 99.9% accuracy for its temperature transmitter. Rather, a 0.1% error figure indicates the level of accuracy. In addition, there is no common terminology, so comparing one manufacturer's specifications with another's can be a difficult task. TT
Behind the byline
J. R. Madden has been an applications engineer at Moore Industries for more than 17 years. He has extensive experience in temperature, semiconductor, and pharmaceutical applications. His e-mail is JRMadden@miinet.com.