Temperature measurement and calibration: what every instrument technician should know
Friday, 11 December, 2009
Temperature may be the most commonly measured physical parameter. Yet there have never been so many ways to measure it as there are today. With so many options it’s natural to have a few questions. How do I measure temperature? How accurate is my measurement? What temperature range is required? What type of device best measures temperature?
These are common questions when confronted with the need to measure temperature. A variety of devices may be used to measure it: liquid-in-glass (LIG) thermometers, thermocouples (TCs), thermistors, resistance temperature detectors (RTDs), platinum resistance thermometers (PRTs) and standard platinum resistance thermometers (SPRTs).
How do I measure temperature?
After a temperature sensor is inserted into the area to be measured, the reading takes time to stabilise. For the thermometer to stabilise at the correct temperature, the probe must be sufficiently immersed. Some thermometers require more immersion depth than others; most precision thermometers require 10 to 15 cm when inserted into a liquid or snug-fitting well, depending on the diameter of the probe. Best results in terms of accuracy and stabilisation time occur when the probe can be inserted into a stirred liquid, because air pockets between the probe and solid surfaces lead to longer stabilisation times and require more immersion than would be needed in a liquid. Specialised thermometers are needed for measuring surface temperatures and for situations where the probe cable will be exposed to extreme temperatures.
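A common industry rule of thumb - an assumption on my part, not a figure from this article - is that a precision probe should be immersed to roughly 15 to 20 probe diameters plus the length of the sensing element. A minimal sketch:

```python
def min_immersion_cm(probe_diameter_cm, element_length_cm, factor=15):
    """Rule-of-thumb minimum immersion depth for a precision probe.

    The 15-20 diameter factor is a common guideline and an assumption
    here, not a value taken from this article.
    """
    return factor * probe_diameter_cm + element_length_cm

# A 6.35 mm (1/4") probe with a 2.5 cm sensing element:
print(min_immersion_cm(0.635, 2.5))  # ~12 cm, consistent with the 10-15 cm guidance above
```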
Testing the energy performance of steam systems, cooling towers, heat exchangers, refrigeration systems, turbines, and internal and external combustion engines requires measuring differences between inlet and outlet temperatures. Sometimes these measurements have to be made from outside the pipe using thermocouples, thin-film sensors or infrared temperature measurements. However, the best accuracy will be achieved when a thermowell has been properly installed in both the inlet and outlet pipes so that a probe can be inserted and sufficiently immersed. Because pipe diameters are sometimes a limiting factor for immersion, the best location for a thermowell is at an elbow in the piping so that the probe can be inserted parallel to fluid flow with as much immersion depth as needed.
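To see why the inlet/outlet difference matters, consider the standard heat-duty relation Q = m·cp·ΔT (a textbook formula, not one stated in this article): any error in ΔT propagates directly into the energy figure. A minimal sketch with assumed example values:

```python
def heat_duty_kw(mass_flow_kg_s, cp_kj_per_kg_c, t_in_c, t_out_c):
    # Q = m_dot * cp * (T_out - T_in); standard heat-duty relation
    return mass_flow_kg_s * cp_kj_per_kg_c * (t_out_c - t_in_c)

# Assumed example: 10 kg/s of water (cp ~ 4.18 kJ/kg.C) heated from 60 to 80 C
print(heat_duty_kw(10.0, 4.18, 60.0, 80.0))  # 836 kW
# A 0.5 C error at each measurement point can shift delta-T by 1 C -> ~42 kW
print(heat_duty_kw(10.0, 4.18, 60.5, 79.5))  # 794.2 kW
```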
Calibration
Often devices that measure and display temperature need to be verified or calibrated against a reference thermometer. Accuracy is improved when the distance between the two thermometers is reduced, and best practice is to align the centres of the sensing elements of the reference thermometer and the device under test. Be aware that the location of the centre of the sensor depends on the sensor type and model.
A common method of calibrating temperature sensors is to remove them from where they are installed and place them in a dry-well calibrator or a micro-bath. These calibrators provide a stable temperature environment over a range of temperatures to compare the thermometer under test to the calibrator display or to a reference thermometer for more accuracy.
Alternatively, temperature sensors may be calibrated or verified without removing them from their installed location. Usually this is done by inserting a reference thermometer into a thermowell, immersion well or thermometer pocket installed next to the thermometer to be tested.
How much accuracy is needed?
Thermometers that are specified by design engineers for temperature monitoring or control should include accuracy data in their specifications. A design engineer, quality engineer or metrologist should also specify the calibration requirements. However, it is not uncommon for instrument technicians to receive a calibration job and little or no information about calibration requirements.
A common calibration strategy reduces the risk of wrong accept/reject decisions by keeping the uncertainty of the calibration standards to a low percentage of the accuracy of the thermometer under test. This percentage is usually described as a test uncertainty ratio (TUR). For example, the 4:1 TUR used by the military and other industries keeps the collective uncertainty of the calibration standards to 25% of the accuracy of the thermometer under test. For comparison, a TUR of 2:1 means that the uncertainty is 50% of the thermometer accuracy, and if the reference thermometer has the same accuracy as the thermometer under test, the TUR is 1:1. A 1:1 TUR is never recommended for calibration and would produce unreliable results.
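The TUR arithmetic itself is a single division; the sketch below merely formalises the ratio just described (the function name and example values are illustrative):

```python
def tur(dut_accuracy, standard_uncertainty):
    # test uncertainty ratio: DUT accuracy over the standard's uncertainty
    return dut_accuracy / standard_uncertainty

# A +/-0.1 C device checked against a +/-0.025 C standard gives a 4:1 TUR,
# i.e. the standard contributes 25% of the device's tolerance
ratio = tur(0.1, 0.025)
print(f"{ratio:.0f}:1 TUR; standard is {100 / ratio:.0f}% of DUT accuracy")
```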
With a more accurate calibration standard you can identify more truly out-of-tolerance field devices. Table 1 illustrates the expected frequency of mistakes at various TURs, based on a scenario in which about 950 of 1000 instruments are truly in tolerance. For example, if all 1000 are calibrated at a 2:1 TUR, we expect 925 to be found in tolerance (accepted), 12 of which are truly out of tolerance (false accepts). Of the 75 expected to be rejected, 41 are expected to be truly in tolerance (false rejects). A sketch of a simulation along these lines follows the table.
Table 1: Expected outcomes per 1000 instruments calibrated at various TURs.

TUR | Accepted | False accept | Rejected | False reject
1:1 | 843 | 17 | 157 | 128
2:1 | 925 | 12 | 75 | 41
3:1 | 941 | 9 | 59 | 22
4:1 | 947 | 8 | 53 | 15
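Counts like these can be approximated with a simple Monte Carlo simulation. The sketch below assumes normally distributed instrument errors and measurement noise, and treats the TUR as the ratio of the tolerance to the standard's expanded (k = 2) uncertainty - all modelling assumptions on my part, so the results only roughly track Table 1:

```python
import random
from statistics import NormalDist

def simulate(tur, n=1_000_000, in_tol=0.95, tolerance=1.0):
    # true instrument errors: normal, scaled so `in_tol` of them fall
    # inside +/-tolerance
    sigma_true = tolerance / NormalDist().inv_cdf(0.5 + in_tol / 2)
    sigma_std = tolerance / (2 * tur)  # standard's k=2 uncertainty = tol/TUR
    accepted = false_accept = rejected = false_reject = 0
    for _ in range(n):
        true_err = random.gauss(0.0, sigma_true)
        measured = true_err + random.gauss(0.0, sigma_std)
        if abs(measured) <= tolerance:
            accepted += 1
            if abs(true_err) > tolerance:
                false_accept += 1
        else:
            rejected += 1
            if abs(true_err) <= tolerance:
                false_reject += 1
    scale = 1000 / n  # express per 1000 instruments, as in Table 1
    return tuple(round(x * scale)
                 for x in (accepted, false_accept, rejected, false_reject))

for ratio in (1, 2, 3, 4):
    print(f"{ratio}:1", simulate(ratio))
```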
Thermometer probe types
There have never been as many temperature sensor (probe type) choices available as there are today, and selecting one can be time consuming and difficult without some help. The most important factors are temperature range, accuracy and cost. Table 2 illustrates the trade-offs among these factors for several thermometer types.
Table 2: Temperature range, accuracy and cost trade-offs for common thermometer types.

Sensor type | Temperature range | Accuracy | Cost
Noble-metal thermocouple (special tolerances) | R, S: -50 to 1760 °C | > ±0.6 °C | Med
Base-metal thermocouple (special tolerances) | B: 0 to 1280 °C | ±0.25% | Low
 | E: -270 to 1000 °C | > ±1 °C | Low
 | J: -210 to 1200 °C | > ±1.1 °C | Low
 | K: -270 to 1370 °C | > ±1.1 °C | Low
 | N: -270 to 1300 °C | > ±1.1 °C | Low
 | T: -270 to 400 °C | > ±0.5 °C | Low
PRTs and SPRTs | Industrial: -80 to 480 °C | ±0.05 to 0.1 °C | Low to Med
 | Reference: -200 to 660 °C | ±0.001 to 0.02 °C | Med to High
 | High temp: 0 to 1000 °C | ±0.01 to 0.02 °C | Med to High
Precision thermistors | 0 to 100 °C | ±0.002 °C | Med
Thermocouples
Thermocouples measure temperature by generating a small voltage proportional to the temperature difference between the junctions of two dissimilar metals. One junction (the measurement junction) is typically encased in a probe at the point of measurement; the other junction (the reference junction) is typically at the measuring instrument. The instrument measures two quantities - the voltage signal and the reference junction temperature - and from these computes the temperature at the measuring end of the probe. It is important to note that the voltage generated by the sensor depends not on the absolute temperature of the measurement junction, but on the temperature difference between the measurement junction and the reference junction (see Figure 1).
Thermocouple types are distinguished by the metals used in each leg of the thermocouple. Noble metal thermocouples all contain platinum in one leg of the thermocouple and include Type S, Type R, Au/Pt, and Pt/Pd. Base metal thermocouples include Type B, Type E, Type J, Type K, Type N, and Type T. These thermocouples come in two accuracy classes: standard limits of error and special limits of error. The special limits of error thermocouples are the most accurate.
Reference junction compensation is one of the most significant contributors to the accuracy of a thermocouple measurement. Thermocouple tables, such as those in NIST Monograph 175, are based on a reference junction temperature of 0 °C. An external reference junction held at 0 °C in an ice bath can satisfy this condition, but in practice thermocouple wire is usually connected directly to the thermocouple readout's binding posts at room temperature, so automatic reference junction compensation is needed to correct for the deviation from 0 °C.
In Figure 2, the thermocouple wire meets the copper wire at the binding posts of the meter, forming the reference junction (J). The temperature in the region surrounding the binding posts (TJ) is usually measured by a thermistor, and automatic reference junction compensation is accomplished by measuring the deviation of TJ from 0 °C and compensating for it digitally. The accuracy of this measurement has a significant impact on the accuracy of the overall temperature measurement.
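The compensation arithmetic works in the voltage domain: convert the reference junction temperature to its equivalent thermocouple EMF, add that to the measured voltage, then invert the thermocouple characteristic. The sketch below substitutes a crude linear approximation of a Type K couple (roughly 41 µV/°C near room temperature) for the NIST Monograph 175 polynomials, so it illustrates the algorithm rather than production-grade conversion:

```python
SEEBECK_UV_PER_C = 41.0  # rough Type K sensitivity; a linear stand-in
                         # for the NIST Monograph 175 polynomials

def emf_uv(t_c):
    # EMF the couple would produce with its reference junction at 0 C
    return SEEBECK_UV_PER_C * t_c

def compensated_temperature_c(v_measured_uv, t_ref_c):
    # add back the EMF "lost" because the reference junction sits at
    # t_ref_c instead of 0 C, then invert the (linear) characteristic
    total_uv = v_measured_uv + emf_uv(t_ref_c)
    return total_uv / SEEBECK_UV_PER_C

# 3075 uV measured with binding posts at 25 C -> about 100 C
print(compensated_temperature_c(3075.0, 25.0))
```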
Resistance-based temperature measurement
An RTD is a temperature sensing element that changes resistance with temperature. RTD sensing elements include coils of platinum wire (PRTs), nickel wire, copper wire, thin films and more. Another resistance-based sensor is the thermistor, which is made of semiconducting material. Figure 3 illustrates a simple 2-wire measurement circuit, where the sensing element is labelled RT and the lead wires have finite resistances labelled RL1 and RL2. When current passes through the sensor, the element warms slightly because of power dissipation: the greater the resistance or the current, the more power is dissipated (P = I²R). This self-heating is higher in air because heat does not flow away as efficiently as it would in a stirred fluid.
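Both effects in the circuit of Figure 3 are easy to quantify. The sketch below computes the dissipated power P = I²R and the 2-wire reading error caused by lead resistance; the example values are assumptions for illustration:

```python
def self_heating_mw(current_a, resistance_ohm):
    # P = I^2 * R, returned in milliwatts
    return current_a ** 2 * resistance_ohm * 1000.0

def two_wire_reading(rt_ohm, rl1_ohm, rl2_ohm):
    # a 2-wire measurement cannot separate the element from its leads
    return rt_ohm + rl1_ohm + rl2_ohm

# Assumed values: 1 mA through a 100-ohm PRT, 10 uA through a 10-kohm thermistor
print(self_heating_mw(1e-3, 100.0))      # 0.1 mW dissipated in the PRT
print(self_heating_mw(10e-6, 10_000.0))  # 0.001 mW in the thermistor
# 0.1-ohm leads inflate a 100-ohm reading by 0.2 ohm (~0.5 C for a Pt100)
print(two_wire_reading(100.0, 0.1, 0.1))
```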
Self-heating errors can be minimised by using the same excitation current used during calibration. Using the correct current is particularly important with thermistors, because their very large resistances cause greater self-heating.

Current reversal is another technique used in resistance measurements, eliminating errors associated with thermal EMFs - unwanted voltages in the measurement circuit caused by the same principle that produces a voltage in thermocouples. The measurement is made with the current flowing in one direction and then again with the current flowing in the other direction, and averaging the two results removes the thermoelectric EMF. This technique is used by many modern instruments and improves measurement stability over instruments that lack it.

Platinum resistance thermometers
A platinum resistance thermometer (PRT) element contains coils of highly pure platinum wire. The resistance of a PRT element varies more linearly with temperature than that of any other temperature sensor, and a standard platinum resistance thermometer (SPRT) is the most accurate temperature sensor available.
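PRT readouts commonly apply the current-reversal technique described above. A minimal sketch of the averaging, assuming hypothetical forward and reverse voltage readings:

```python
def resistance_with_reversal(v_forward, v_reverse, current_a):
    # V_fwd = I*R + V_emf and V_rev = -I*R + V_emf, so subtracting the two
    # readings cancels the thermal EMF and leaves 2*I*R
    return (v_forward - v_reverse) / (2 * current_a)

# Assumed readings: 1 mA excitation, 5 uV of thermal EMF in the circuit
v_emf, r_true, i = 5e-6, 100.0, 1e-3
print(resistance_with_reversal(i * r_true + v_emf, -i * r_true + v_emf, i))  # 100.0
```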
Temperature measurement with a PRT requires correlating the resistance of the sensing element with temperature using the correct equations and coefficients - examples include the ITS-90 equations, the Callendar-Van Dusen (CVD) equations and polynomial equations. Fortunately, most thermometer readouts support these equations, so the calculations are handled automatically. Best performance with PRTs is usually achieved with the ITS-90 equations, but older readouts and uncalibrated PRTs may use CVD equations.
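As an illustration of what the readout does internally, here is the CVD relation for a standard Pt100 using the IEC 60751 coefficients, and its inversion for temperatures at or above 0 °C (below 0 °C an additional C coefficient applies and a numeric solve is typically used); this is a sketch of the standard formula, not any particular readout's implementation:

```python
import math

# IEC 60751 coefficients for a standard 100-ohm platinum RTD (alpha = 0.00385)
A = 3.9083e-3
B = -5.775e-7
R0 = 100.0

def cvd_resistance(t_c):
    # Callendar-Van Dusen for t >= 0 C (the C term applies only below 0 C)
    return R0 * (1 + A * t_c + B * t_c ** 2)

def cvd_temperature(r_ohm):
    # invert the quadratic for temperatures at or above 0 C
    return (-A + math.sqrt(A * A - 4 * B * (1 - r_ohm / R0))) / (2 * B)

print(cvd_resistance(100.0))     # ~138.51 ohm at 100 C
print(cvd_temperature(138.51))   # ~100 C
```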
Thermistors
A thermistor element is made of semiconductor material and has an electrical resistance that varies nonlinearly with temperature. Thermistors are widely used because of their sensitivity, small size, ruggedness and low cost. Inexpensive thermistors are commonly used in electronics applications while precision thermistors are calibration standards rivalling the accuracy of SPRTs.
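Thermistor readouts typically linearise the sensor with the Steinhart-Hart equation, 1/T = A + B·ln R + C·(ln R)³. The equation itself is standard, but the coefficients below are typical textbook values for a 10 kΩ NTC thermistor, not values from this article:

```python
import math

# Typical Steinhart-Hart coefficients for a 10-kohm NTC thermistor (assumed)
A_SH = 1.129e-3
B_SH = 2.341e-4
C_SH = 8.775e-8

def thermistor_temp_c(r_ohm):
    # 1/T = A + B*ln(R) + C*ln(R)^3, with T in kelvin
    ln_r = math.log(r_ohm)
    t_kelvin = 1.0 / (A_SH + B_SH * ln_r + C_SH * ln_r ** 3)
    return t_kelvin - 273.15

print(round(thermistor_temp_c(10_000), 2))  # ~25 C with these coefficients
```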
Thermometer accuracy, repeatability and resolution
Two important components of accuracy are repeatability and resolution. They should be considered along with other factors affecting accuracy. Repeatability refers to the consistency of repeated measurements, and regular calibration is helpful for establishing instrument repeatability.
A digital thermometer should be chosen with sufficient resolution to achieve the desired accuracy. Resolution on a digital thermometer is often user selectable; however, resolution is not the same thing as accuracy - it is merely a limiting factor on accuracy. In a liquid-in-glass or dial thermometer, resolution may be the most important factor, apart from calibration, affecting accuracy.
Accuracy specifications can be structured in several ways. They are usually divided into ranges and may be given in base units of temperature, resistance or voltage. Simple specifications are either a variable or a fixed value; complex specifications combine both. With variable specifications, the allowed error grows as the magnitude of the reading grows - examples include percent of reading and parts per million (ppm). Fixed-value specifications, on the other hand, remain constant over a range - examples include percent of scale or span, and numeric constants.
You can convert specifications in base units of resistance or voltage to temperature, but the conversion depends on the sensitivity of the temperature sensor. For example, a change in temperature of 1 °C will result in a 0.4 Ω change in resistance for a 100 Ω PRT and result in a 0.1 Ω change for a 25 Ω SPRT, but it may cause a 1000 Ω change in a thermistor. Consequently, a meter with ±1 Ω accuracy will be most accurate for those sensors with the highest temperature sensitivity.
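Using the sensitivities just quoted, converting a resistance specification to temperature is a single division by the sensor's sensitivity dR/dT:

```python
def temp_error_c(meter_accuracy_ohm, sensitivity_ohm_per_c):
    # divide the resistance uncertainty by the sensor's sensitivity dR/dT
    return meter_accuracy_ohm / sensitivity_ohm_per_c

# Sensitivities quoted in the article (approximate)
print(temp_error_c(1.0, 0.4))     # 100-ohm PRT: +/-2.5 C
print(temp_error_c(1.0, 0.1))     # 25-ohm SPRT: +/-10 C
print(temp_error_c(1.0, 1000.0))  # thermistor:  +/-0.001 C
```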
Each of the thermometers listed in Table 2 requires a digital readout. The best digital readout is one that is designed specifically for temperature measurement. Table 3 lists some of the requirements for good electronic thermometer readouts.
Table 3: Requirements for electronic thermometer readouts.

Readout device | Requirements
Thermocouple readout | Good accuracy from -10 to 100 mV
 | Low noise floor
 | Very low thermal EMFs
 | Good reference junction compensation
PRT readout | Good accuracy from 0 to 400 Ω
 | Current reversal
 | Four-wire resistance measurement
 | 1 mA excitation current
Thermistor readout | Reasonable accuracy from 150 Ω to 500 Ω
 | Better accuracy required below 1000 Ω
 | 2-10 µA excitation current
Fluke Australia Pty Ltd
www.fluke.com.au