There is a lot of hype around the promise of fully autonomous vehicles, and that hype is justified. Many municipalities and states across the United States and around the world are experimenting with self-driving cars, including permitting self-driving vehicles on public roadways for package deliveries to residences and passenger rides.
Although they are exciting, these experiments are occurring within limited environments and situations, and the ability to make these vehicles truly autonomous anywhere, anytime, is still a huge technical challenge.
Beyond the technical hurdles lie societal and political challenges, which public officials and governments are only just starting to address as autonomous vehicle tests grow in both number and scope on public roads. On the technical side, locations that lack lane markings, feature uncommon infrastructure or environmental conditions, or remain unmapped still present a significant challenge to autonomous vehicle engineers and developers.
Systems must be built that work in unrestricted environments, including on unpaved roads. Autonomous vehicles of the future must be able to determine, not just predict, which places are appropriate for a moving vehicle, night or day, and in any weather condition.
Leveraging a Variety of Sensor Types
To achieve this technical feat, autonomous vehicles need to fuse a wide range of sensor types to provide complete situational awareness, including LIDAR, sonar, visible cameras, radar, and thermal imaging.
This is in addition to accessing data from outside the car, including vehicle-to-everything (V2X) communications and GPS. Among these sensors, thermal imaging is especially helpful in enabling a vehicle to “see” in challenging lighting and weather conditions where other sensors may fail.
Thermal sensors add reliability and improve performance of the ADAS and AV sensor suites.
Observing Heat and Avoiding Collisions
Unlike other sensors, thermal cameras detect and measure a completely different band of energy: long-wave infrared (LWIR) radiation, or heat energy.
Everything on Earth absorbs, emits, or reflects energy in this portion of the electromagnetic spectrum. Thermal cameras detect heat just as well in total darkness, in daylight, or in blinding sun glare, and they work far better than visible cameras in smoke or inclement weather such as fog.
Crucially, thermal cameras are especially proficient at detecting body heat, distinguishing pedestrians far down dark country roads or against busy urban backgrounds.
Thermal imaging is also a passive technology: it does not send and receive signals to detect objects. LIDAR, by contrast, can be affected by adverse weather and cluttered environments (including interference between multiple LIDAR systems operating in a confined area), which can decrease the number of return points on a given target.
By using thermal cameras as a stereo pair, a vehicle can build a three-dimensional (3D) awareness of its surrounding environment; the pair can also act as a redundant, complementary system to LIDAR.
Within this context, in addition to establishing the shape and distance of objects near the vehicle, a thermal system can also measure the heat radiation of every object, enabling it both to characterize a detected object and to confirm whether it is a living thing.
This is extremely helpful, since living things, from people to large mammals such as elk and deer, are precisely the objects passengers do not want to hit.
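The classification step described above can be sketched in a few lines. Everything here is illustrative: the temperature band, the `is_likely_living` helper, and the detection records are assumptions for demonstration, not FLIR's actual pipeline.

```python
# Hedged sketch: flagging stereo detections as likely living beings by their
# apparent surface temperature. The 25-42 C band is an illustrative range
# for warm-blooded animals, not a calibrated production threshold.

def is_likely_living(mean_temp_c: float,
                     low_c: float = 25.0,
                     high_c: float = 42.0) -> bool:
    """Return True if the object's mean temperature falls in the band
    typical of people and large mammals."""
    return low_c <= mean_temp_c <= high_c

# Hypothetical detections from a thermal stereo pair: each record carries
# a label and the mean temperature measured over the object's pixels.
detections = [
    {"label": "object_1", "mean_temp_c": 36.5},  # pedestrian-like signature
    {"label": "object_2", "mean_temp_c": 12.0},  # cold roadside signpost
]

living = [d["label"] for d in detections if is_likely_living(d["mean_temp_c"])]
print(living)  # ['object_1']
```

The same threshold test could run per pixel instead of per detection; a per-object mean is used here only to keep the sketch short.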
In this visible (top) and thermal (bottom) comparison, the glare of the streetlights and the nighttime fog make the pedestrian all but invisible in visible light but easily distinguished in thermal.
How Thermal Stereo Vision Works
Thermal stereo vision works much as human vision does: it is based on triangulation of rays, in this case thermal rays, from two or more viewpoints, providing depth perception by computing the distance to different objects in a given scene.
This is attained by identifying corresponding pixels between the thermal stereo pair and triangulating the distance measurements via image processing algorithms. Thermal stereo vision permits 3D perception under any weather and lighting conditions.
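For a calibrated, rectified stereo pair, the triangulation described above reduces to the standard pinhole relation Z = f · B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity between corresponding pixels. A minimal sketch, with illustrative (not FLIR-specified) calibration values:

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d.
# The focal length (800 px) and baseline (0.30 m) below are assumed
# example values, not the calibration of any particular camera.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 800.0,
                         baseline_m: float = 0.30) -> float:
    """Return the distance in metres to a point given its pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# A pedestrian whose matched pixels differ by 12 px between the two
# thermal images would be estimated at 800 * 0.30 / 12 = 20 m:
distance = depth_from_disparity(12.0)
print(f"{distance:.1f} m")  # 20.0 m
```

Note how the relation also captures the disparity behavior shown in the figures below: doubling the disparity halves the estimated depth, so nearer objects produce larger pixel offsets between the two images.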
Thermal stereo cameras are already under commercial development and can serve as a valuable tool for autonomous and unmanned boats, aircraft, and land-based vehicles.
More specifically, companies such as Foresight have already developed autonomous vehicle vision systems that rely on thermal and visible stereo cameras to enable enhanced situational awareness.
3D location of an object is obtained by finding corresponding pixels between the stereo pair and triangulation.
Disparity refers to the difference in image location of an object observed in the stereo pair; a closer object has a larger disparity.
A sample thermal image from a stereo pair for a test scene (left) and the corresponding disparity image, where the orange intensity is proportional to the disparity (middle). A top-down view of the resulting 3D point cloud shows the people at different distances (right).
Thermal stereo cameras can also make existing partially autonomous systems, such as automatic emergency braking (AEB), safer and more reliable.
Theoretically, upon detecting an object in the road in any lighting condition, the thermal stereo camera could tell the vehicle’s computer system to slow down. It is also able to supply redundant distance data, along with the vehicle’s radar and visible camera systems, so the vehicle performs the most appropriate action for the situation.
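The decision logic described above might be sketched as a time-to-collision check over the redundant range estimates. The function name, the fused ranges, and the two-second threshold below are all assumptions for illustration, not a production AEB algorithm:

```python
# Illustrative AEB decision sketch fusing redundant range estimates
# (e.g. from radar, visible stereo, and thermal stereo). The 2 s
# time-to-collision threshold is an assumed example value.

def should_brake(ranges_m: list[float],
                 speed_mps: float,
                 min_time_to_collision_s: float = 2.0) -> bool:
    """Brake if the closest credible range implies a time to collision
    below the threshold at the current speed."""
    valid = [r for r in ranges_m if r > 0]       # discard sensor dropouts
    if not valid or speed_mps <= 0:
        return False                              # nothing credible to act on
    time_to_collision = min(valid) / speed_mps    # worst case: nearest estimate
    return time_to_collision < min_time_to_collision_s

# Thermal stereo agrees with radar and the visible camera that an object
# is roughly 18 m ahead while travelling at 13.4 m/s (about 48 km/h):
print(should_brake([18.2, 18.5, 17.9], speed_mps=13.4))  # True (TTC ~ 1.3 s)
```

Taking the minimum of the redundant estimates is a deliberately conservative fusion choice for a safety function; a production system would weight sensors by confidence rather than always trusting the nearest reading.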
Many challenges remain even though autonomous carmakers, engineers, entrepreneurs, and developers have made remarkable progress in the creation of true self-driving cars.
These challenges are not insurmountable. By continuing to iterate, test and validate, including implementing emerging technologies like thermal stereo vision, it is only a matter of time before the world’s first fully autonomous vehicle is a reality.
This information has been sourced, reviewed and adapted from materials provided by FLIR.
For more information on this source, please visit FLIR.