What are Infrared Temperature Sensors?

Table of Contents

Introduction
Measurement Principles
Theoretical Basis for IR Temperature Measurement
Infrared Thermometer Design and Construction
Two-Color Ratio Thermometry
Summary

Introduction

An infrared temperature sensor measures temperature by detecting the infrared energy emitted by all materials at temperatures above absolute zero (0 K). The most basic design includes a lens that focuses the infrared (IR) energy onto a detector, which converts the energy into an electrical signal that can be displayed in units of temperature once it is compensated for variation in ambient temperature. This configuration allows temperature to be measured from a distance, without any contact with the object being measured.

The infrared temperature sensor is therefore a practical tool for measuring temperature in conditions where probe-type sensors such as thermocouples cannot be used or fail to produce accurate data. Typical circumstances include objects surrounded by an electromagnetic field, as in induction heating; objects that are moving; objects contained in a vacuum or other controlled atmosphere; and applications that require a very fast response.

Designs for an infrared thermometer (IRT) have existed since the late nineteenth century; Charles A. Darling (1) featured several concepts by Féry in his book "Pyrometry," published in 1911. However, the technology needed to turn these concepts into viable measuring instruments did not become available until the 1930s. Since then the design has evolved considerably, and a large body of application and measurement expertise has built up. Today, the technique is popular and is used extensively in both research and industry.

In non-mathematical terms, this article describes the theory on which the measurement technology is based and how it is applied to the various application parameters facing the prospective user.

Measurement Principles

As stated before, all materials above absolute zero (0 K) emit IR energy. Infrared radiation occupies the part of the electromagnetic spectrum between radio waves and visible light, covering wavelengths from 0.7 micrometers to 1000 micrometers (microns), as shown in Figure 1. Within this band, only wavelengths from 0.7 to 20 microns are used for practical, routine temperature measurement. The reason is that the IR detectors presently available to industry are not sensitive enough to detect the very small amounts of energy available at wavelengths beyond 20 microns.

Figure 1. The infrared portion (0.7 to 1000 microns) of the electromagnetic spectrum.

Although IR radiation is invisible to the human eye, it is useful to imagine it as visible when considering the measurement principles and applications, because in many respects IR radiation behaves in the same manner as visible light. IR energy travels in straight lines from the source, and material surfaces in its path reflect and absorb it. With most solid objects that are opaque to the eye, part of the IR energy striking the surface is absorbed and part is reflected.

Of the energy absorbed by the object, a part is re-emitted and a part is reflected internally. The same applies to materials that are transparent to the eye, such as thin clear plastics, gases and glass, except that some of the IR energy also passes through the object. These phenomena are shown in Figure 2, and together they determine what is known as the emissivity of the material or object.
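The balance between these fractions can be sketched in a few lines of Python. This is an illustrative sketch, not from the original article: by Kirchhoff's law (discussed later), the absorbed fraction equals the emitted fraction at thermal equilibrium, so emissivity is what remains after reflection and transmission. The function name and example values are assumptions for illustration.

```python
def emissivity(reflectance: float, transmittance: float = 0.0) -> float:
    """Estimate emissivity from the reflected and transmitted fractions.

    The three fractions of incident IR energy must sum to 1; for an
    opaque material transmittance is 0, so emissivity = 1 - reflectance.
    """
    if not 0.0 <= reflectance + transmittance <= 1.0:
        raise ValueError("fractions must each be >= 0 and sum to at most 1")
    return 1.0 - reflectance - transmittance

# Opaque painted surface reflecting 10% of incident IR:
print(round(emissivity(0.10), 2))        # 0.9
# Glass sheet reflecting 8% and transmitting 5% in the band of interest:
print(round(emissivity(0.08, 0.05), 2))  # 0.87
```

The example values are hypothetical; real reflectance and transmittance vary with wavelength, which is why a sensor's spectral response matters.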

Figure 2. Radiative Heat Exchange.

A blackbody is a material that neither transmits nor reflects any IR energy; no such material exists in nature. For the sake of theoretical calculation, however, a true blackbody is assigned an emissivity of 1.0. The closest practical approximation to a blackbody emissivity of 1.0 is an IR-opaque, spherical cavity with a small tubular entry (Figure 3); the inner surface of such a sphere exhibits an emissivity of 0.998.

Figure 3. Emissivity.

Emissivity differs between materials and gases, so at a given temperature they emit IR at different intensities. The emissivity of a material or gas is a function of its molecular structure and surface characteristics. It is not usually a function of color, unless the source of the color is a substance entirely different from the material's main body. Metallic paints, which incorporate considerable amounts of aluminum, are a practical example: most paints possess much the same emissivity regardless of color, but aluminum has an entirely different emissivity, which alters the emissivity of metallic paints.

As with visible light, a more highly polished surface reflects more IR energy, so a material's surface characteristics also affect its emissivity. For temperature measurement this matters most with infrared-opaque materials that have an inherently low emissivity. The emissivity of a highly polished piece of stainless steel, for example, is much lower than that of the same piece with a rough, machined surface, because the machined grooves prevent much of the IR energy from being reflected.

Besides surface condition and molecular structure, a third factor influences the apparent emissivity of a gas or material: the wavelength sensitivity of the sensor, known as the sensor's spectral response. As mentioned before, only IR wavelengths between 0.7 and 20 microns are used for practical temperature measurement. Within this overall band, an individual sensor may operate in just a narrow portion, such as 4.8 to 5.2 microns or 0.78 to 1.06 microns, for reasons explained later.

Theoretical Basis for IR Temperature Measurement

Infrared temperature measurement is based on long-established, well-proven formulas. Most IRT users are unlikely ever to apply these formulas directly, but understanding them provides an appreciation of how certain variables interact and helps explain the foregoing text. The key formulas are as follows:

  1. Stefan-Boltzmann Law: The hotter an object becomes, the more infrared energy it emits.
  2. Kirchhoff's Law: When an object is at thermal equilibrium, the amount of absorption equals the amount of emission.
  3. Wien's Displacement Law: As the temperature increases, the wavelength at which the maximum amount of energy is emitted becomes shorter.
  4. Planck's Equation: Describes the relationship between spectral radiant energy, wavelength and temperature.
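The first and last of these laws can be evaluated numerically. The following Python sketch is not part of the original article; the constants are the standard CODATA values, and the function names are illustrative.

```python
import math

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
C1 = 3.741771852e-16     # first radiation constant, W m^2
C2 = 1.438776877e-2      # second radiation constant, m K

def total_exitance(temp_k: float, emissivity: float = 1.0) -> float:
    """Stefan-Boltzmann law: total radiated power per unit area, W/m^2."""
    return emissivity * SIGMA * temp_k ** 4

def spectral_exitance(wavelength_m: float, temp_k: float) -> float:
    """Planck's equation: blackbody spectral exitance at one wavelength."""
    return C1 / (wavelength_m ** 5
                 * (math.exp(C2 / (wavelength_m * temp_k)) - 1.0))

# Doubling the absolute temperature multiplies total emission by 2**4:
print(round(total_exitance(600.0) / total_exitance(300.0), 6))  # 16.0
```

The fourth-power dependence is why IR sensors have plenty of signal at high temperatures but struggle near ambient.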

Infrared Thermometer Design and Construction

A basic infrared thermometer (IRT) consists of a lens to collect the energy emitted by the target; an emissivity adjustment to match the IRT's calibration to the emitting characteristics of the object being measured; a detector to convert the energy to an electrical signal; and an ambient temperature compensation circuit to ensure that temperature variations inside the sensor, caused by ambient changes, are not transferred to the final output.

For many years, commercially available IRTs followed this basic concept. Although durable and adequate by the standards of the time, they had very limited application and, in retrospect, could not measure accurately under many conditions.

Figure 4. Infrared Temperature Measurement.

The modern IR temperature sensor is based on this concept, but is more technologically advanced to extend the scope of its application. The key differences are found in the selective filtering of the IR signal; use of a greater variety of detectors; provision of standard, final outputs such as 0-10 Vdc, 4-20 mA etc; and linearization and amplification of the detector output. A schematic representation of a typical modern IRT is illustrated in Figure 5.

The introduction of selective filtering of the incoming IR signal is perhaps the most significant advance in infrared thermometry, made possible by the availability of more sensitive detectors and more stable signal amplifiers. Whereas early IRTs required a broad spectral band of IR to achieve a workable detector output, contemporary IRTs routinely have spectral responses of just 1 micron. Narrow, selected spectral responses are needed either to see through some form of atmospheric or other interference in the sight path, or to measure a gas or other substance that is transparent to a broad spectral band of IR energy.

Figure 5. Modern Infrared Thermometer.

Some common examples of selective spectral responses are 8-14 microns, which avoids interference from atmospheric moisture over long sight paths; 7.9 microns, used for measuring some thin-film plastics; and 3.86 microns, which avoids interference from H2O and CO2 vapor in combustion gases and flames. The temperature range also dictates the choice between a longer or shorter wavelength spectral response because, in accordance with Wien's Displacement Law, the peak energy shifts towards shorter wavelengths as the temperature increases. This phenomenon is illustrated by the graph in Figure 6.
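The peak shift described above follows directly from Wien's Displacement Law. This short Python sketch (illustrative, not from the original article; the constant is the standard CODATA value) prints the peak wavelength for a few temperatures:

```python
WIEN_B = 2.897771955e-3  # Wien's displacement constant, m K

def peak_wavelength_microns(temp_k: float) -> float:
    """Wavelength of maximum spectral exitance for a blackbody at temp_k."""
    return WIEN_B / temp_k * 1e6  # convert metres to microns

# The peak moves to shorter wavelengths as temperature rises:
for temp_k in (300, 800, 1500, 3000):
    print(f"{temp_k:>5} K -> peak at {peak_wavelength_microns(temp_k):.2f} microns")
```

A room-temperature object peaks near 10 microns, which is why long-wavelength (e.g. 8-14 micron) sensors suit low-temperature targets, while hot metals are best measured at short wavelengths.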

Applications that do not need selective filtering for the reasons stated above benefit from a narrow spectral response as close to 0.7 microns as possible. This is because a material's effective emissivity is highest at short wavelengths, and the accuracy of sensors with short-wavelength spectral responses is less affected by changes in target surface emissivity.

Figure 6. Radiant exitance as a function of wavelength and temperature.

It will be apparent from the foregoing data that emissivity plays a key role in infrared temperature measurement. Accurate data will not be obtained unless the emissivity of the material being measured is known, and incorporated into the measurement. Two methods are available for acquiring a material’s emissivity:

a) by referring to published tables, or b) by comparing the infrared temperature sensor's reading with a concurrent measurement from a thermocouple or resistance thermometer, and adjusting the emissivity setting until the IRT reads the same temperature.
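Method (b) can also be expressed analytically. The sketch below is a simplified broadband model, not from the original article: it assumes a Stefan-Boltzmann (fourth-power) sensor, ignores reflected ambient radiation and band limits, and the function name and readings are illustrative.

```python
def emissivity_from_reference(t_apparent_k: float, t_contact_k: float) -> float:
    """Back out emissivity from a contact reference measurement.

    With the IRT's emissivity control set to 1.0, the apparent reading
    satisfies sigma * T_app**4 = eps * sigma * T_true**4, so
    eps = (T_app / T_true)**4.  Temperatures must be in kelvin.
    """
    return (t_apparent_k / t_contact_k) ** 4

# A thermocouple reads 600 K while the IRT (emissivity set to 1.0)
# reads an apparent 560 K; the implied emissivity setting is:
print(round(emissivity_from_reference(560.0, 600.0), 3))  # 0.759
```

In practice the adjustment is done on the instrument itself, but the example shows why a small reading gap implies a large emissivity correction: the fourth power amplifies the ratio.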

Fortunately, extensive published data is available from infrared temperature sensor manufacturers and some research organizations, so it is seldom necessary to experiment. As a rule of thumb, most opaque, non-metallic materials exhibit a high and stable emissivity in the 0.85 to 0.9 range, while most un-oxidized metallic materials exhibit a low to medium emissivity from 0.2 to 0.5. The exceptions are aluminum, gold and silver, whose emissivities are on the order of 0.02 to 0.04 and which are therefore difficult to measure with an IRT.

While the emissivity of the basic material being measured can almost always be established, a complication arises with materials whose emissivity changes with temperature, such as most metals, as well as silicon and high-purity, single-crystal ceramics. Some applications exhibiting this phenomenon can be solved using the two-color ratio method.

Two-Color Ratio Thermometry

Since emissivity is such an important factor in obtaining accurate temperature data from infrared thermometers, it is no surprise that efforts have been made to develop sensors that measure independently of this variable. The two-color ratio thermometer is the best known and most frequently applied of these designs.

This technique is similar to the infrared thermometers described thus far, except that it measures the ratio of infrared energy emitted by the material at two wavelengths, rather than the absolute energy at one wavelength or wave band. The term "color" in this context is slightly outdated but has not been superseded; it stems from the old practice of relating visible color to temperature, hence "color temperature."

The basis for the effectiveness of two-color thermometry is that any change in either the sight path between the material and the sensor, or in the emitting properties of the surface being measured, is "seen" identically by the two detectors; the ratio, and therefore the output of the sensor, does not change. A schematic representation of a simplified two-color thermometer is shown in Figure 7.

Figure 7. Two Color Thermometry (Ratio Thermometry).
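The ratio principle can be sketched numerically using the Wien approximation to Planck's law (valid at short wavelengths and moderate temperatures). This Python example is illustrative, not from the original article; the two wavelengths are hypothetical choices, and the constant is the standard CODATA value.

```python
import math

C2 = 1.438776877e-2  # second radiation constant, m K

def wien_exitance(lam_m: float, temp_k: float) -> float:
    """Wien approximation to the blackbody spectral exitance (to a
    constant factor, which cancels when a ratio is taken)."""
    return lam_m ** -5 * math.exp(-C2 / (lam_m * temp_k))

def ratio_temperature(ratio: float, lam1_m: float, lam2_m: float) -> float:
    """Invert the two-wavelength signal ratio for temperature, kelvin.

    From ln(ratio) = 5*ln(lam2/lam1) - (C2/T)*(1/lam1 - 1/lam2).
    """
    return (C2 * (1.0 / lam1_m - 1.0 / lam2_m)
            / (5.0 * math.log(lam2_m / lam1_m) - math.log(ratio)))

lam1, lam2 = 0.9e-6, 1.05e-6  # two nearby wavelengths (metres), hypothetical
t_true = 1500.0
signal_ratio = wien_exitance(lam1, t_true) / wien_exitance(lam2, t_true)
# A greybody emissivity would multiply both signals equally and cancel
# in the ratio, so the temperature is recovered without knowing it:
print(round(ratio_temperature(signal_ratio, lam1, lam2), 1))  # 1500.0
```

The cancellation in the ratio is exactly why the method tolerates unknown or drifting emissivity, and also why it fails when the two wavelengths are not attenuated equally (the non-greybody case discussed below).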

Under prescribed circumstances, the ratio method avoids inaccuracies resulting from obscuration in the sight path, from changing or unknown emissivity, and from targets that do not fill the field of view. It is therefore helpful in some difficult applications, among them cement kiln burning-zone temperature, fast induction heating of metals, and measurement through windows that become progressively obscured, for instance in the vacuum melting of metals. However, these dynamic changes must be "seen" identically by the sensor at the two wavelengths used for the ratio, and this is not always the case.

Further, not all materials change emissivity equally at two different wavelengths: those that do are known as "greybodies," and those that do not are known as "non-greybodies." In addition, not all forms of sight-path obscuration attenuate the two ratio wavelengths equally; particulates in the sight path whose size is close to one of the wavelengths in use will unbalance the ratio. Phenomena that are not dynamic, such as the "non-greybodyness" of a material, can be addressed by biasing the ratio, an adjustment called "slope."

The appropriate slope setting must, however, be determined experimentally. In spite of these limitations, the ratio method works well in several well-established applications, and in others it is the best available, if not the ideal, solution.

Summary

As a mature but still-evolving technology, the infrared temperature sensor has earned the respect of many institutions and industries. It is an essential method for many temperature measurement applications, and the method of choice for others. Provided the user understands the technology, considers all the relevant application parameters, and installs the equipment carefully, a successful application can be realized.

Careful installation means ensuring that the sensor operates within its stated environmental limits and that adequate measures are taken to keep the optics clean and the sight path free of obstructions. When choosing a manufacturer, the selection process should consider the availability of installation and protective accessories, as well as the extent to which these accessories allow the sensor to be removed and replaced quickly for maintenance. Followed carefully, these guidelines allow the contemporary infrared thermometer to operate more reliably than resistance thermometers or thermocouples in many applications.

This information has been sourced, reviewed and adapted from materials provided by Omega Engineering Ltd.

For more information on this source, please visit Omega Engineering Ltd.
