ADAS Sensing – Using Thermal Sensing in ADAS Systems

Advanced Driver Assistance Systems (ADAS) could transform the way we travel and transport goods by road, whilst simultaneously improving safety. ADAS can allow cars to move more efficiently and provide greater freedom of movement to all age groups. At the same time, ADAS can significantly reduce accident rates by mitigating the ever-growing threat of distracted driving and human error.

Currently, humans are the decision makers behind the wheel of a car, but with ADAS they may soon be assisted or even replaced. A suite of sensors may substitute for a human's ability to sense, matching and even augmenting our present capability to drive in all environments and in all types of weather.

While sensors continue to advance considerably, there is not yet a single sensor that can enable safe driving on its own. Therefore, an array of orthogonal and complementary sensors is selected for ADAS systems, which collectively provide important information and redundancy to ensure continuous safety and improve driving performance across all conditions.

The standard sensor suite contains ultrasound, LiDAR, radar, and visible cameras. This article discusses the value of integrating thermal cameras into the sensor suite because of their unique strength: detecting and classifying living things, including humans, in the cluttered driving environment.

Why Multiple Sensors in a Typical ADAS Environment?

Driving conditions for ADAS can differ considerably, and the selected set of sensors must work effectively in rural, city, and highway environments while performing equally well in any weather, day or night. This is difficult, as each driving environment brings its own challenges and requirements.

For autonomous driving, a set of sensors that detects dangers reliably is required, and there must also be built-in redundancy in the event of a sensor failure. No single technology covers all the aspects required for fully autonomous driving. Each sensor's output can be "fused" for decision making or used separately for detection within the vehicle's central computer or a central electronic control unit (ECU). Long-range radar, short- to medium-range radar, LiDAR, ultrasound, and visible cameras (Figure 1) can all form the elements of a typical sensor suite.
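As a rough illustration of the "fused" decision making mentioned above, the sketch below merges detections from different sensors that fall at roughly the same range, so that agreement between sensors raises the overall confidence. The `Detection` class, the field names, and the combination rule are illustrative assumptions for this article, not an actual ECU implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # e.g. "radar", "lidar", "thermal" (illustrative labels)
    range_m: float     # distance to the object in metres
    bearing_deg: float # direction to the object
    confidence: float  # sensor-reported confidence, 0..1

def fuse(detections, max_sep_m=2.0):
    """Group detections that lie close together in range and combine
    their confidences, so agreement between sensors raises the fused
    confidence of a track."""
    fused = []
    for d in sorted(detections, key=lambda d: d.range_m):
        for track in fused:
            if abs(track["range_m"] - d.range_m) < max_sep_m:
                track["sensors"].add(d.sensor)
                # Treat sensors as independent: combined confidence is
                # one minus the probability that every sensor missed.
                track["confidence"] = 1 - (1 - track["confidence"]) * (1 - d.confidence)
                break
        else:
            fused.append({"range_m": d.range_m,
                          "sensors": {d.sensor},
                          "confidence": d.confidence})
    return fused
```

Two sensors agreeing on a nearby object would thus yield a single track whose confidence exceeds either sensor's alone, which is the essence of fusion-based redundancy.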

Figure 1. Typical sensor suite plus FLIR Thermal IR imaging

While autonomous driving systems can use several methods, the main approach is to detect and then classify objects so that a course of action can be established for the vehicle to take. Objects in or around the road can be reliably detected by LiDAR and radar systems; however, a camera is still required to classify the detected objects and give confidence in situational awareness.
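The detect-then-classify flow described above can be sketched as a simple decision loop. The function name, the detection dictionaries, and the thresholds below are hypothetical placeholders chosen for illustration, not values from any real ADAS stack.

```python
def plan_action(detections, classify):
    """For each radar/LiDAR detection, run a camera-based classifier
    (any callable mapping a detection to a label such as "pedestrian"
    or "vehicle"), then choose a course of action from the label and
    the detection's geometry."""
    actions = []
    for det in detections:
        label = classify(det)
        if label == "pedestrian" and det["range_m"] < 30:
            actions.append("brake")       # vulnerable road user, close range
        elif label == "vehicle" and det["closing_speed_mps"] > 10:
            actions.append("slow")        # fast-closing vehicle ahead
        else:
            actions.append("monitor")     # keep tracking, no action yet
    return actions
```

The point of the sketch is the division of labour: range and closing speed come from the ranging sensor, while the action hinges on the classification that only a camera (visible or thermal) can supply.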

Both LiDAR and radar systems create a point-density cloud from the reflections they gather, and then measure the range and closing speed of an object; however, these systems cannot easily classify objects because they have a lower resolution than cameras.

In view of this lower resolution, and to optimize detection at varying ranges, manufacturers are likely to install several units: for example, a medium-range unit for emergency brake assist and a long-range radar for adaptive cruise control. Although LiDAR is similar to radar, it has a lower point density at longer ranges.

Figure 2. Data from a typical radar system

Figure 3. LiDAR can detect objects, but not classify them. Thermal cameras provide the data ADAS-equipped cars need to take the right actions

As shown in Figure 2, the radar system identifies objects that are close to the car (~15 m away) as well as objects located at a distance of 150 m. However, in order to classify objects such as a motorcycle or truck, the radar system has to cross-reference its data with the data provided by an additional sensor such as a visible or thermal camera. Moreover, the signal received from the large truck almost masks the signal received from the pedestrian. If a pedestrian is crossing between two cars, the potential risk cannot be identified due to little or no reflected signal.

LiDAR systems provide increased informational density, particularly when objects are close to the car itself (Figure 3). However, these systems cannot easily classify objects that are further away or when the car is moving at higher speeds. While these expensive, rotating units provide increased data rates, their long-term reliability is yet to be proved.

While solid-state units provide increased robustness at a potentially lower cost, they have fewer points on target, particularly when viewing objects from farther away. To generate the amount of data required to classify objects in a reliable (redundant) and cost-effective way, LiDAR or radar may need to be combined with the output of a visible camera for most daylight conditions, and a thermal camera for challenging daylight and nighttime conditions.

Download the White Paper for More Information

Working together, these systems can detect an object by the side of the road, but classification is an equally important part of any detection capability: the object could be a small, stationary tree, or a person who may suddenly start moving. Classification becomes more difficult in nighttime driving, poor lighting conditions, and bad weather, precisely the conditions in which thermal cameras excel. The uses and strengths of each sensor type are shown in Table 1.

Table 1. Detector Technologies and Application Summary

Benefits of Thermal in ADAS

While LiDAR and radar systems are used for ranging, object detection, and mapping, a camera is required for tasks such as traffic light detection, speed sign recognition, and object classification. Adding a thermal camera to a suite of ADAS sensors increases situational awareness and strengthens the overall driving system.

Figure 4. Thermal imagers let ADAS systems detect people, animals, and vehicles through dust, smog, and heavy smoke

Figure 5. ADAS systems with thermal imagers allow the vehicle to be aware of people outside the cone of its headlights at night.

Figure 6. Thermal imagers aren’t affected by sun glare or low sun angles, so they can still detect pedestrians when visible cameras would be blinded.

Key Benefits of Thermal

  • See clearly even in difficult lighting conditions – unaffected by smoke, darkness, sunlight glare
  • Ability to clearly see animals and people even in a cluttered environment – hot objects stand out
  • When used with a visible camera, there is redundancy and increased confidence in detection/classification – orthogonal detection
  • Reliability – proven by more than a decade of use in the automotive industry

Thermal sensors, or cameras, detect differences in the relative intensities of the infrared energy emitted or reflected by objects, which is independent of the amount of visible light available. Thermal imaging is more than just another kind of night vision: it sees heat, not light, and it does so consistently, 24 hours a day. Anything that produces heat can be seen with thermal imaging, and animals and people have distinctive heat signatures that stand out clearly to a thermal camera.

The law enforcement and military applications of infrared imaging have been known and proven for many years. Because thermal cameras can see through smoke, they have become an established tool for search and rescue and firefighting operations. Since their ability to see clearly is not tied to the amount of light available, thermal cameras can see even when looking directly into the sun. Even when the sun is at a blinding angle, or when there is smog or thick smoke in the air, a thermal camera can still distinguish objects clearly.

With recent technological developments, thermal imagers have become lighter, smaller, and more affordable for both consumer and commercial applications. This paves the way for a whole new range of applications, including ADAS, as thermal imagers can easily detect the heat differences that give away the presence of animals or people, seeing in complete darkness or through dust and smoke.


Thermal Detects People in a Wide Variety of Lighting Conditions

To successfully implement ADAS, it is essential to be aware of each vehicle’s direction, position, and speed in relation to other traffic. These factors can differ considerably based on the type of driving being done and on the particular roadway. In all driving conditions, the ability of thermal sensors to see heat, coupled with simple object-recognition algorithms, makes them uniquely suited to detecting and classifying animals and people in cluttered environments.
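As a toy illustration of why hot objects stand out against clutter, the sketch below flags pixels in a hypothetical thermal frame that are noticeably warmer than the background. Real pedestrian detection uses trained classifiers on the thermal image; the function name, temperature units, and thresholds here are assumptions made for the example only.

```python
def hot_regions(thermal_frame, background_c=15.0, delta_c=8.0):
    """Return (row, col) positions whose temperature (in degrees C)
    exceeds the assumed background by at least delta_c. Warm bodies
    such as people and animals stand out against cooler clutter."""
    return [(r, c)
            for r, row in enumerate(thermal_frame)
            for c, temp in enumerate(row)
            if temp - background_c >= delta_c]
```

Even this crude thresholding shows the appeal of the modality: the signal of interest (body heat) is separated from the clutter by physics, before any sophisticated recognition runs.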

Highway driving can include stop-and-go traffic as well as high speeds, so ADAS systems must be capable of detecting vehicles, other objects, and hazards at both long and short ranges. Rural driving can involve lower speeds, but a greater likelihood of animals and other natural hazards on the roadway.

There are more than one million deer strikes in the US each year, causing several billion dollars in damage. Night driving in rural areas depends heavily on headlights, since roads are usually poorly lit. Rural roads are also less predictably straight than highways, so maintaining wide situational awareness in poor lighting conditions is a real challenge.

City driving can be considered the most challenging, as it requires awareness not only of other cars but also of pedestrians, bicyclists, and traffic lights. Lighting conditions can also affect the ability to detect hazards. The oblique angle of the sun and the resulting shadows can effectively hide pedestrians during the day, while driving into oncoming headlights at night can likewise hinder the detection of road hazards and pedestrians.

System Redundancy

Redundancy is a key requirement when selecting the suite of sensors to be used in ADAS applications. While each sensor type brings its own strengths and weaknesses, the way they complement one another, emphasizing strengths and compensating for weaknesses in specific conditions, enables designers to develop a system robust enough to provide the complete situational awareness needed for autonomous driving.

By their very nature, many of the sensors in an ADAS system provide redundancy simply because they work in different areas of the electromagnetic spectrum. If dirt or mud blocks the LiDAR's receiving unit, the radar system can serve as a backup. If lighting conditions (nighttime, glare) cause problems for the visible camera, the thermal camera can serve as the backup. This establishes a truly redundant ADAS system.
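The backup pairings described above can be sketched as a simple selection rule: radar backs up LiDAR for ranging, and thermal backs up the visible camera for imaging. The `select_sources` function, the status flags, and the sensor names are illustrative assumptions rather than a real ECU interface.

```python
def select_sources(status):
    """Pick one usable ranging sensor and one usable imaging sensor
    from per-sensor health flags. Returns a (ranging, imaging) pair,
    or None when no safe pair is available and the system must degrade."""
    ranging = ("lidar" if status.get("lidar") == "ok"
               else "radar" if status.get("radar") == "ok"
               else None)
    imaging = ("visible" if status.get("visible") == "ok"
               else "thermal" if status.get("thermal") == "ok"
               else None)
    if ranging is None or imaging is None:
        return None  # no redundant pair left; fail safe rather than guess
    return ranging, imaging
```

Because the backups sit in different parts of the spectrum, a single environmental cause (mud, glare, darkness) is unlikely to disable both members of a pair at once, which is what makes the redundancy meaningful.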

Figure 7. Thermal sensors are not blinded by oncoming headlights and can detect and classify people and animals in total darkness

System Integration - Cost, Reliability and Packaging

Broader adoption of autonomous driving requires a solution that is technologically reliable, inexpensive, and capable of being easily incorporated into existing car designs. Radar, ultrasonic sensors, and visible cameras are already widely used; these technologies are trustworthy, and their economies of scale have made them affordable. They can also be placed behind the windscreen or within the bumpers of cars, allowing for sophisticated packaging solutions.

In the case of LiDAR, existing systems are usually not very reliable, are costly, and are difficult to integrate elegantly into a car. Because of LiDAR's great potential in ADAS applications, there are a number of ongoing efforts to create smaller solid-state units, which should increase reliability and lower costs.

Thermal cameras have been employed in automotive night-vision applications for animal and pedestrian detection for more than a decade. These cameras have proven extremely reliable, but the cost of the system, driven by the size of the detector and the relatively low volume of units produced, has relegated them to luxury models and brands. Recent technological developments have driven down the size and cost of the technology; one example is the new 12 µm sensor from FLIR used in the new Boson camera. Besides being cost-effective, it also allows for elegant packaging within the car.


Conclusion

Autonomous driving systems require a sensor suite that is robust, high performance, and redundant enough to deliver safety in all types of driving environments. The primary shortfall of current systems is their limited ability to detect and classify animals and people in challenging lighting conditions.

Thermal imagers are a proven technology, and for over 10 years they have enabled drivers to see up to four times farther than their high beams: night or day, through haze and smoke, and past the glare of oncoming headlights.

This same thermal imaging technology is considered the best sensor solution for detecting cyclists, pedestrians, and animals in cluttered environments, providing ADAS integrators with the critical information needed to make automated, accurate decisions. Additionally, recent reductions in cost and size make thermal imagers essential components of future ADAS systems.

This information has been sourced, reviewed and adapted from materials provided by FLIR Systems.

For more information on this source, please visit FLIR Systems.
