1 May 2005

Calibration calisthenics

Typically, you check calibration of an instrument at several points throughout the calibration range of the instrument. The calibration range is the region between the limits within which you measure, receive, or transmit a quantity. You express it by stating the lower and upper range values.

Zero and span values define the limits. The zero value is the lower end of the range. Span is the algebraic difference between the upper and lower range values. The calibration range may differ from the instrument range, which refers to the capability of the instrument. An electronic pressure transmitter may have a nameplate instrument range of 0-to-750 pounds per square inch, gauge (psig) and an output of 4-to-20 milliamps (mA). However, the engineer has determined the instrument should be calibrated for 0-to-300 psig = 4-to-20 mA. In this example, the zero input value is 0 psig, and the zero output value is 4 mA. The input span is 300 psig, and the output span is 16 mA.
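The zero and span values define a simple linear relationship between input and output. Here is a minimal sketch of that relationship using the example numbers above (0-to-300 psig in, 4-to-20 mA out); the function name and defaults are illustrative, not from the source.

```python
# Ideal linear input-to-output relationship for the example transmitter.
# Zero/span values are taken from the article's example; the function
# name is an assumption for illustration.

def expected_output_ma(pressure_psig,
                       zero_in=0.0, span_in=300.0,
                       zero_out=4.0, span_out=16.0):
    """Ideal transmitter output (mA) for a given input pressure (psig)."""
    return zero_out + (pressure_psig - zero_in) / span_in * span_out

# 0 psig -> 4 mA, 150 psig -> 12 mA, 300 psig -> 20 mA
print(expected_output_ma(0))    # 4.0
print(expected_output_ma(150))  # 12.0
print(expected_output_ma(300))  # 20.0
```

These ideal outputs are what you compare against the measured outputs at each data point during a calibration check.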

You should perform every calibration to a specified tolerance. People often use the terms tolerance and accuracy incorrectly. You'll find correct definitions in ISA's The Automation, Systems, and Instrumentation Dictionary. Accuracy is the ratio of the error to the full-scale output or the ratio of the error to the output, expressed in percent span or percent reading, respectively. Tolerance is the permissible deviation from a specified value. You can express tolerance in measurement units, percent of span, or percent of reading. But be aware of the subtle differences between these two terms. For the calibration requirements performed at your facility, you might specify the tolerance in measurement units: by specifying an actual value, you limit mistakes caused by calculating percentages of span or reading.
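Converting a tolerance stated as percent of span into output measurement units is a one-line calculation. A short sketch, using the example transmitter's 16 mA output span; the 0.5% tolerance figure is an assumed example, not from the source.

```python
# Convert a tolerance given as percent of span into output units (mA).
# The 16 mA span comes from the article's 4-to-20 mA example; the 0.5%
# tolerance is an assumed value for illustration.

span_ma = 16.0             # output span of a 4-to-20 mA transmitter
tolerance_pct_span = 0.5   # assumed tolerance: 0.5% of span

tolerance_ma = span_ma * tolerance_pct_span / 100.0
print(f"Tolerance: +/- {tolerance_ma:.2f} mA")  # Tolerance: +/- 0.08 mA
```

Stating the result directly as "+/- 0.08 mA" on the calibration data sheet avoids the percentage arithmetic in the field.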

The term accuracy ratio is used to describe the relationship between the accuracy of the test standard and the accuracy of the instrument under test. A good rule is to ensure an accuracy ratio of 4:1 when performing calibrations. This means the instrument or standard you're using should be four times more accurate than the instrument you're checking. Therefore, the test equipment you use to calibrate the process instrument should be four times more accurate than the process instrument, the laboratory standard you use to calibrate the field standard should be four times more accurate than the field standard, and so on.
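The 4:1 rule can be checked mechanically: divide the accuracy of the instrument under test by the accuracy of the standard and compare against 4. A hedged sketch; the function name and the numeric accuracies are assumed examples.

```python
# Check the accuracy-ratio rule: the standard's stated accuracy (as a
# percent figure) should be at least `required_ratio` times smaller
# than the instrument's. Names and example accuracies are assumptions.

def meets_accuracy_ratio(instrument_accuracy_pct, standard_accuracy_pct,
                         required_ratio=4.0):
    """True if the standard is at least `required_ratio` times more accurate."""
    return instrument_accuracy_pct / standard_accuracy_pct >= required_ratio

# A 0.25%-accurate transmitter checked with a 0.05% standard: ratio 5:1
print(meets_accuracy_ratio(0.25, 0.05))  # True
# The same transmitter checked with a 0.1% standard: ratio 2.5:1
print(meets_accuracy_ratio(0.25, 0.10))  # False
```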

Loop calibration vs. individual instrument calibration

An individual instrument calibration is a calibration you perform on only one instrument, with its input and output disconnected. Apply a known source to the input, and measure the output at various data points throughout the calibration range. Adjust the instrument, if necessary, and check calibration. Perform a loop calibration from the sensor to all loop indications with all the loop components connected. For example, you'd insert a temperature sensor connected to a temperature transmitter in a temperature bath/block. (Note: You'd either calibrate the bath/block or use a temperature standard in the bath/block for traceability.) Adjust the temperature of the bath/block to each data point required to perform the calibration. Record all local and remote indications. Also record the transmitter output. If all indications and transmitter output are within tolerance, the loop is within tolerance. If any loop component is not within tolerance, then perform a calibration on that instrument. Do not adjust a transmitter to correct a remote indication.
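The loop pass/fail logic above is simple to express in code: every indication and the transmitter output must agree with the applied test value within tolerance, and any out-of-tolerance component is singled out for individual calibration. A minimal sketch; all names, readings, and the tolerance are assumed examples.

```python
# Evaluate a loop calibration data point: every indication must be
# within tolerance of the applied value. All values are assumptions
# for illustration, not data from the source.

def loop_within_tolerance(applied, readings, tolerance):
    """Return (loop_ok, names of out-of-tolerance readings)."""
    failed = [name for name, value in readings.items()
              if abs(value - applied) > tolerance]
    return (not failed, failed)

# Bath/block set to 100.0 degF; assumed tolerance of +/- 0.5 degF
readings = {"local_indicator": 100.2,
            "remote_indicator": 100.9,   # out of tolerance
            "transmitter": 99.8}
ok, failed = loop_within_tolerance(100.0, readings, 0.5)
print(ok, failed)  # False ['remote_indicator']
```

Here only the remote indicator fails, so you'd calibrate that indicator individually rather than adjusting the transmitter.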

Bench calibration vs. field calibration

You'd perform a bench calibration in the shop on the bench, with power supplied from an external source if required. You may perform bench calibrations upon receiving new instruments, before installation. This verifies the instrument arrived undamaged. It also allows you to configure it in a more favorable environment. Some companies perform the periodic calibration on the bench. In this case, remove the process instrument from service, disconnect it, and take it to the shop for calibration. In some instances, install a spare in its place to minimize process downtime. You might send critical flow sensors out to a certified flow calibration facility. To prevent shutting the process down for several weeks, install a replacement flow sensor.

You'd perform field calibrations in situ, or in place, as installed. Don't remove the instrument you're calibrating from its installed location. You may perform field calibrations after installation to ensure proper connections and configuration. You're more likely to perform periodic calibrations in the field, in the environment in which the instrument operates. If you install the instrument in a harsh environment, you calibrate it for that environment. If you remove the instrument for a bench calibration and then return it, you might introduce some error due to the different ambient conditions and orientation.

SOURCE: Calibration: A Technician's Guide, by Mike Cable