Safety Challenges that Require Real Technology Solutions
Until now, SAE automation level 2 (partial automation) and level 3 (conditional automation) vehicles have not incorporated infrared (IR) or thermal imaging in their sensor suites. Tragic, high-profile accidents involving Tesla and Uber vehicles mean that sensor performance and safety are now scrutinized much more closely. While many test vehicles perform well under ideal test conditions, their performance must remain consistent under real-life driving conditions.
Figure 1. Thermal imagers use infrared energy to detect, classify, and measure temperature from a distance.
Thermal sensors can detect and classify people and animals in darkness, through most fog, and through sun glare at distances more than four times the illumination range of typical headlights.
Developers are taking the opportunity to integrate FLIR’s automotive development kit, the FLIR ADK™, into vehicles to add thermal imaging to their sensor suites. Thermal sensors perform well in conditions that challenge the other technologies in the sensor suite.
Sensing Minute Differences in Temperature
Thermal cameras can clearly distinguish inanimate objects, living bodies such as people and animals, and background clutter, making them an essential technology for detecting pedestrians. Thermal, or longwave infrared (LWIR), energy is transmitted, emitted, or reflected by everything on or near a roadway.
FLIR thermal imaging cameras are exceptionally sensitive, registering temperature differences as small as 0.05 °C. This sensitivity allows VGA thermal cameras (640 x 512 pixels) to clearly show almost everything in a scene, down to the centerline on a roadway.
A frame from a FLIR video recreation of the Uber accident in Tempe, Arizona (Figure 2) clearly reveals roadway surface details, such as painted markings, while detecting and classifying the pedestrian at more than double the “fast-reaction” stopping distance required for a human driver at 43 mph1 (126 feet, or 38.4 meters).
Figure 2. The FLIR ADK with VGA resolution can “see” precise details including roadway surface markings – day and night.
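The stopping-distance figure above can be sanity-checked with a basic kinematic model: total stopping distance is the distance covered during the driver’s reaction time plus the braking distance v²/(2a). A minimal sketch, where the reaction time and deceleration values are illustrative assumptions rather than figures from this article:

```python
def stopping_distance_m(speed_mph: float,
                        reaction_time_s: float = 0.75,
                        decel_mps2: float = 7.5) -> float:
    """Fast-reaction stopping distance: reaction phase plus braking phase.

    reaction_time_s and decel_mps2 are illustrative assumptions,
    not values taken from the article.
    """
    v = speed_mph * 0.44704                   # mph -> m/s
    reaction_dist = v * reaction_time_s       # distance covered before braking starts
    braking_dist = v ** 2 / (2 * decel_mps2)  # constant-deceleration braking distance
    return reaction_dist + braking_dist

print(f"{stopping_distance_m(43.0):.1f} m")
```

With these assumed values the model lands near the 38.4-meter figure quoted above: an attentive driver needs roughly 40 meters to stop from 43 mph, which is why detection range matters so much.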
“Seeing” Heat Through Fog Instead of Relying on Light
The 2016 AWARE (All Weather All Roads Enhanced) vision project assessed a camera suite that could improve vision in challenging-visibility conditions such as fog, night, snow, and rain. To determine which technologies produced the best all-weather vision, four different bands of the electromagnetic spectrum were tested:
Figure 3. Example images recorded in fog tunnel with thermal (LWIR), visible RGB, short-wave (SWIR), and near (NIR) cameras.2 Copyright SIA Vision 2016.
- Short-wave infrared (SWIR)
- LWIR, or thermal
- Visible RGB
- Near infrared (NIR)
The project measured pedestrian detection at different fog densities (Table 1) and reached the following three conclusions.2
- The LWIR camera was the only sensor that detected pedestrians in complete darkness. The LWIR camera was also more resilient to glare produced by oncoming headlamps in the fog.
- NIR, visible RGB, and SWIR cameras sometimes missed pedestrians who were hidden by headlamp glare.
- The visible camera had the lowest fog-piercing capability. The LWIR camera penetrated fog better than the NIR and SWIR cameras.
On Detection, Classification, and Fields of View
Figure 4. Thermal cameras require only 20 by 8 pixels to reliably classify an object.
Figure 5. The narrower the horizontal FOV, the farther a thermal camera can “see.”
The main performance metrics for advanced driver assist system (ADAS) and AV sensor suites are detection and classification. Detection tells the system that an object is ahead. Classification determines the class of object (person, bicycle, car, dog, other vehicle, etc.) and reports a confidence level for that classification.
In both thermal and visible-light imagers, the field of view (FOV) is the portion of a scene visible through the camera at a given position and orientation. The narrower the FOV, the farther the camera can see; a wider FOV provides a greater angle of view but cannot see as far.
Because FOV affects the distance at which a thermal camera can detect and classify an object, multiple cameras may be necessary: a wide-FOV sensor for city driving, and a narrow-FOV sensor to see far ahead of the vehicle on a rural highway.
Current artificial-intelligence-based classification systems typically require a target to fill 20 by 8 pixels to classify an object reliably (>90% confidence). To classify a human with reliable confidence, then, the human must be around 20 pixels tall, as shown in Figure 4. Table 2 lists classification distances for various thermal camera horizontal fields of view and shows that a FLIR ADK with a narrow-FOV lens configuration can classify a 6-foot-tall human at more than 600 feet (183 meters). Detection, which requires fewer pixels on an object, means that a 6-foot-tall human can be detected at over 200 meters with the FLIR ADK.
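The classification distances described above follow from simple pixel geometry: each pixel subtends roughly FOV/N radians (the instantaneous field of view, or IFOV), so a target of height h fills p pixels at a range of about h / (p · IFOV). A rough sketch of that relationship, where the specific lens FOVs are hypothetical examples rather than FLIR ADK specifications:

```python
import math

def classification_range_m(target_height_m: float,
                           pixels_required: int,
                           fov_deg: float,
                           pixels_across: int) -> float:
    """Approximate range at which a target spans `pixels_required` pixels.

    Uses small-angle geometry: one pixel subtends fov/pixels radians (IFOV).
    The lens values used below are illustrative, not FLIR ADK specifications.
    """
    ifov_rad = math.radians(fov_deg) / pixels_across
    return target_height_m / (pixels_required * ifov_rad)

human_height = 1.83  # a 6-foot person, in meters
# Classification needs ~20 pixels on target; compare hypothetical lenses
# on a 640-pixel-wide sensor.
for fov in (50.0, 32.0, 18.0):
    rng = classification_range_m(human_height, 20, fov, 640)
    print(f"{fov:>4.0f} deg HFOV: classify at ~{rng:.0f} m")
```

Narrower lenses trade coverage for range: under these assumptions, an 18° lens yields a classification range in the same neighborhood as the narrow-FOV figure quoted above, while a wide lens suited to city driving classifies at well under 100 meters.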
Better Situational Awareness Results in More Informed Decisions
Thermal imaging is a highly sensitive, passive sensing technology and can be a key enabler of safer AV and ADAS operation. Thermal sensors can detect and classify people and animals through sun glare, in darkness, and through most fog at distances more than four times farther than visible cameras can see or typical headlights illuminate.
FLIR thermal cameras complement the existing technologies in the sensor suite and help these systems make better, safer decisions based on improved situational awareness.
The images displayed may not be representative of the actual resolution of the camera shown. Images for illustrative purposes only.
2. Pinchon, N., Ibn-Khedher, M., Cassignol, O., Nicolas, A., Bernardin, F., et al. “All-weather vision for automotive safety: which spectral band?” SIA Vision 2016 – International Conference Night Drive Tests and Exhibition, Oct 2016, Paris, France. Société des Ingénieurs de l’Automobile (SIA), 7 p. <hal-01406023>
This information has been sourced, reviewed and adapted from materials provided by FLIR Cores and Components Group.
For more information on this source, please visit FLIR Cores and Components Group.