Everyone has an intuitive understanding of hot and cold as sensations, but defining and measuring temperature with thermometers has proved a far more difficult intellectual challenge. Historians are uncertain whether the ancient Greeks or Chinese measured temperature, so we’ll begin this history during the Renaissance.
The Temperature Measurement Challenge
Heat, or thermal energy, is the amount of energy contained in a body or substance; but unlike other physical properties such as mass or length, it has proved difficult to measure reliably. The general method of temperature sensing is indirect: let a sensor come into thermal equilibrium with the object, observe the effect that heat has on the sensor, and deduce the temperature from this effect – like the volumetric expansion of mercury with temperature.
Scales of measurement for temperature have also posed complications. In 1664, Robert Hooke was the first to propose using the freezing point of water as a universal zero calibration point. His contemporary Ole Roemer realized that two fixed points were needed to allow interpolation of temperatures between them; he chose Hooke’s freezing point and added the boiling point of water. This temperature scale was still imperfect, however, because it was clear that temperatures could go below freezing and above boiling.
Gay-Lussac and other empirical scientists worked on the ideal gas laws and explored a greater range of temperatures during the 19th century. While investigating the effect of temperature on a gas held at constant pressure, they noted that its volume rises by a fixed fraction of its volume at 0°C for each degree Celsius of warming. Since that fraction is 1/273.15, extrapolating to zero volume implies an absolute zero of temperature at minus 273.15°C.
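The extrapolation described above can be sketched in a few lines; the volume figures below are synthetic ideal-gas values chosen for illustration, not historical measurements.

```python
# Illustrative only: extrapolate constant-pressure gas data to absolute zero.
# Per Charles's law, a gas shrinks by 1/273.15 of its 0 °C volume per degree
# Celsius, so a straight line through (temperature, volume) points reaches
# V = 0 at about -273.15 °C.

def absolute_zero_celsius(t1, v1, t2, v2):
    """Extrapolate a volume-vs-temperature line to V = 0 (temperatures in °C)."""
    slope = (v2 - v1) / (t2 - t1)
    return t1 - v1 / slope

# Synthetic data: 1.0000 L at 0 °C grows to ~1.3661 L at 100 °C.
print(absolute_zero_celsius(0.0, 1.0000, 100.0, 1.3661))  # ≈ -273.15
```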
Temperature Sensors Based on Liquids and Bimetals
Alongside his pioneering work in astronomy, Galileo Galilei built one of the earliest thermometers, around 1592. His device seems to have operated on the contraction of air in a vessel as temperatures cooled; the resulting suction drew up a column of water, whose height was related to the extent of the cooling. With no fixed scale, and large errors introduced by varying air pressure, it was little more than a novelty.
Santorio Santorii in modern-day Italy sealed liquid inside a glass tube, observing how it moved up the tube as it expanded; the scale on the tube made it easier to determine the changes, but there were no precise units. Invented in 1612, this device was arguably the first modern thermometer.
Roemer and Fahrenheit later worked together and began manufacturing thermometers. The choice of the liquid whose expansion indicates temperature varied. Mercury is close to ideal – its volume expands nearly linearly across a very wide range of temperatures, so a consistent scale can be drawn – but it is toxic to humans. Alcohol was also used in early thermometers. In modern thermometers, the liquid has often been replaced by a non-toxic alternative; controlling the depth at which the bulb is immersed is important to prevent pressure effects from influencing the reading, and thermowells can be used to ensure good heat transfer to the bulb.
In the 19th century, the new innovation was the bimetallic temperature sensor, which bonds together two metal strips that expand at different rates. When the temperature changes, the strip bends, which can activate a thermostat or drive a gauge like those used in gas grills. The accuracy of these sensors is low – they can only measure to within a couple of degrees – but given how inexpensive bimetallic sensors are, they have found a wide range of applications.
[Figure: Burial plaque of Daniel Gabriel Fahrenheit]
Examples of sensors based on liquids and bimetals
Generally, classic fluid-expansion thermometers are classified as either mercury or organic-liquid types; some gas-filled versions are used for specific applications. Despite being 17th-century technology, they still have advantages: they require no electrical power, they remain stable over many measurement cycles, and – having no electrical parts – they are safe to use in explosive atmospheres.
Bimetallic Measurement Devices
The two bonded strips of a bimetallic thermometer expand at different rates; the resulting differential bending is mechanically linked to a pointer, translating it into a temperature reading.
The electrical revolution of the 19th century was a time of great development, and it was discovered that metals vary in electrical resistance and conductivity. In 1821, Thomas Johann Seebeck discovered that when two different metals are joined at both ends and the two junctions are held at different temperatures, a voltage is created – suggesting the possibility of a thermometer based on measuring that voltage. Peltier later discovered that this thermocouple effect is reversible and can be used for cooling.
That same year, Sir Humphry Davy demonstrated the functional dependence of electrical resistance on temperature in metals. Five years later, Becquerel proposed using a platinum-based thermocouple for temperature measurement, but it took until 1829 for Leopoldo Nobili to create a working device.
In 1932, C.H. Meyers invented a resistance temperature detector (RTD) that also used platinum; it measures the resistance of a platinum wire and is generally considered one of the most accurate types of temperature sensor. Unfortunately, wire-wound RTDs are fragile and unsuitable for many industrial applications; hence the development in recent years of thin-film versions, which are less accurate but more robust.
The 20th century also saw the invention of semiconductor temperature measuring devices. These are accurate, but, until recently, non-linear – which meant complicated calibration curves and potential for error across a range of temperatures.
Examples of sensors based on the thermoelectric effect
Thermocouples use two wires or strips of different metals joined at one end. Changes in the temperature at that junction induce a change in electromotive force, or voltage, between the other ends.
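A minimal readout sketch follows, using a linear approximation of roughly 41 µV/°C for a type K thermocouple near room temperature (real instruments use standard reference tables rather than a single coefficient; the function name is illustrative).

```python
# Rough thermocouple readout sketch: linear approximation, not NIST tables.
# A type K thermocouple produces roughly 41 µV per °C of difference between
# the hot (measuring) junction and the cold (reference) junction.

SEEBECK_UV_PER_C = 41.0  # approximate type K sensitivity near room temperature

def hot_junction_temp_c(measured_uv, cold_junction_c):
    """Convert measured EMF (µV) to hot-junction temperature (°C),
    adding back the reference temperature (cold-junction compensation)."""
    return measured_uv / SEEBECK_UV_PER_C + cold_junction_c

# 3075 µV measured with the reference junction at 25 °C:
print(hot_junction_temp_c(3075.0, 25.0))  # 100.0
```

Note the cold-junction compensation step: because the voltage depends on the temperature *difference*, the reference junction’s own temperature must be measured separately and added back.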
RTDs measure the change in resistance of a metal, which rises more or less linearly with temperature. They are among the most accurate temperature sensors and are broadly used in industry and academia.
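The resistance-temperature relation for a platinum RTD is standardized; the sketch below uses the IEC 60751 Callendar–Van Dusen coefficients for a Pt100 element at temperatures above 0 °C.

```python
# Pt100 RTD resistance from temperature, using IEC 60751
# Callendar-Van Dusen coefficients for t >= 0 °C:
#   R(t) = R0 * (1 + A*t + B*t^2)

R0 = 100.0        # ohms at 0 °C for a Pt100 element
A = 3.9083e-3     # per °C
B = -5.775e-7     # per °C^2

def pt100_resistance(t_c):
    """Resistance (ohms) of a Pt100 at t_c degrees Celsius, valid for t >= 0."""
    return R0 * (1 + A * t_c + B * t_c ** 2)

print(round(pt100_resistance(0.0), 2))    # 100.0 ohms at the ice point
print(round(pt100_resistance(100.0), 2))  # 138.51 ohms at 100 °C
```

The small quadratic term is why RTDs read “more or less” linearly: over 0–100 °C the curvature shifts the resistance by well under one percent.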
Thermistor probes are thermistor elements embedded in metal tubes. Because thermistors are non-linear, the instrument must linearize their response to map it onto a standard temperature scale.
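One common way to perform that linearization is the Steinhart–Hart equation; the coefficients below are textbook example values for a typical 10 kΩ NTC thermistor (a real probe’s datasheet supplies its own).

```python
import math

# Steinhart-Hart linearization for an NTC thermistor:
#   1/T = a + b*ln(R) + c*ln(R)^3   (T in kelvin)
# Example coefficients for a typical 10 kOhm NTC; real values come from
# the manufacturer's datasheet or a three-point calibration.

SH_A = 1.129148e-3
SH_B = 2.34125e-4
SH_C = 8.76741e-8

def thermistor_temp_c(resistance_ohms):
    """Temperature (°C) from thermistor resistance via Steinhart-Hart."""
    ln_r = math.log(resistance_ohms)
    kelvin = 1.0 / (SH_A + SH_B * ln_r + SH_C * ln_r ** 3)
    return kelvin - 273.15

print(round(thermistor_temp_c(10000.0), 1))  # 25.0 °C at the nominal resistance
```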
All objects above absolute zero glow, giving off electromagnetic radiation; we are familiar with this when objects glow white-hot, but at lower temperatures the radiation occurs at longer, infrared wavelengths. Around 1800, the English astronomer William Herschel was the first to recognize that this “dark”, infrared light causes heating. Working with his compatriot Melloni, Nobili later found a way to detect this radiated energy by connecting thermocouples in series to form a thermopile.
This led Samuel Langley to develop the bolometer, which measures the temperature of objects from their electromagnetic radiation. His bolometer used two platinum strips, one of them blackened, in a Wheatstone bridge arrangement: heating by infrared radiation caused a measurable change in the blackened strip’s resistance, and the Wheatstone bridge allows even small changes in resistance to be detected electrically.
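The bridge principle is simple enough to sketch directly; the resistor values below are arbitrary illustrations, not Langley’s actual components.

```python
# How a Wheatstone bridge turns a tiny resistance change into a voltage.
# With all four arms equal, the bridge is balanced and outputs 0 V; heating
# one strip of a bolometer unbalances it by a measurable amount.

def bridge_output(v_in, r1, r2, r3, r4):
    """Differential output voltage of a Wheatstone bridge.

    r1/r2 form one voltage divider, r3/r4 the other; the output is taken
    between the two divider midpoints.
    """
    return v_in * (r2 / (r1 + r2) - r4 / (r3 + r4))

# Balanced bridge: no output.
print(bridge_output(5.0, 100.0, 100.0, 100.0, 100.0))           # 0.0
# The blackened strip warms and its resistance rises by 1 ohm:
print(round(bridge_output(5.0, 100.0, 101.0, 100.0, 100.0), 4)) # 0.0124
```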
Bolometers are sensitive to infrared light across a wide range of wavelengths. Photon-detector devices have been developed since the 1940s, but they tend to respond only to limited wavelength bands; for example, lead sulphide detectors are sensitive to wavelengths up to 3 microns. The discovery of the HgCdTe alloy opened the door to detectors that can be tailored to specific wavelength bands.
Today, inexpensive infrared pyrometers are used widely, and thermal cameras are finding more applications as their prices drop.
Examples of sensors based on thermal radiation
Infrared Temperature Sensors
Infrared sensors are non-contact devices. They convert the radiated energy they collect into an electrical signal that can be displayed in units of temperature, after compensation for ambient-temperature variation.
Fahrenheit devised his own temperature scale for his thermometers, based on temperatures he could easily reproduce: he set the freezing point of salt water at 30 degrees and its boiling point 180 degrees higher. Later, the salt water was replaced by pure water – easier to obtain and define – which freezes at a slightly higher temperature, giving us freezing at 32 °F and boiling at 212 °F.
Anders Celsius proposed the 0-to-100 scale based on water’s freezing and boiling points; William Thomson, later Lord Kelvin, took absolute zero as the starting point of his refinement of the Celsius system. This yields the most scientific of the temperature scales – the Kelvin scale, which is universally used in scientific fields.
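The three scales described above are related by simple affine conversions:

```python
# Conversions between the Fahrenheit, Celsius, and Kelvin scales.

def f_to_c(f):
    """Fahrenheit to Celsius: 180 °F spans the same interval as 100 °C."""
    return (f - 32.0) * 5.0 / 9.0

def c_to_k(c):
    """Celsius to Kelvin: same degree size, zero shifted to absolute zero."""
    return c + 273.15

print(f_to_c(212.0))         # 100.0   — water boils
print(c_to_k(f_to_c(32.0)))  # 273.15  — water freezes
print(c_to_k(-273.15))       # 0.0     — absolute zero
```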
Today, temperature measurement scales are defined in a document titled the International Temperature Scale of 1990, or ITS-90 for short.
This information has been sourced, reviewed and adapted from materials provided by OMEGA Engineering Ltd.