Editorial Feature

TriEye's Infrared Sensing for Increased Driver Visibility

Image Credit: Andrey Suslov / Shutterstock.com

Studies have shown that people are more likely to crash during periods of low visibility, such as darkness and poor weather. Human error is considered the main cause of traffic crashes, and in low-visibility conditions such errors appear to become more frequent.

Until now, technology has been fairly limited in its ability to address the problem of driving in low visibility.

Low Visibility a Major Cause of Car Accidents

Each year in the US, around 1.1 million car crashes are caused by wet roads, claiming the lives of over 5,000 people. In addition, roughly 225,000 car accidents result from driving in snowy road conditions.

The statistics demonstrate the severity of the danger posed by driving in low visibility conditions such as heavy rain, fog, snow, and darkness. While many strategies and technologies have been developed to protect people from a range of human errors that contribute to traffic crashes, such as distracted or impaired driving, as well as slowing reflexes that come with age, low visibility is one limitation that has received less attention.

Driving at night is considered one of the most difficult driving scenarios. Year after year, statistics reveal that more people die in car crashes at night than during the day. Similarly, although many factors contribute to collisions between cars and cyclists, those that happen at night are thought to be caused primarily by low visibility.

Studies have shown that fog- and smoke-related crashes are most prevalent during the early morning hours of December and January. Scientists have linked this to the low light of winter, which reduces visibility.

One particular study found that fog- and smoke-related crashes are more likely to involve multiple vehicles and to result in severe injuries. The study also found that these crashes are most prevalent during the hours of darkness.

Short-Wave Infrared Sensing Chips for Improved Driver Visibility

A solution that could reduce errors and improve visibility could potentially save the lives of thousands each year. This is where the Israeli startup TriEye has stepped in. The company was founded in 2017 by Avi Bakal, Omer Kapach, and Prof. Uriel Levy following a number of years at the Hebrew University in Jerusalem where Levy led advanced nanophotonics research.

TriEye develops short-wave infrared (SWIR) sensing chips to enhance visibility in adverse driving conditions. The company believes its innovation can make low-visibility driving safer, using sensing technology to enhance driving ability and reduce fatal errors. It was announced early this year that car manufacturer Porsche is teaming up with the technology startup to both test and improve TriEye's products.


The two companies will work together to optimize the technology that TriEye has developed, which is suitable for use in cars with advanced driver assistance systems (ADAS) and autonomous vehicles (AV).

It is hoped that this collaboration will be successful in developing a product that will significantly reduce road traffic accidents caused by driving in low visibility.

TriEye’s Infrared Sensing Technology

The technology being produced by TriEye is far more advanced than alternative solutions currently available.

TriEye's technology utilizes a very small, high-definition SWIR camera that is only marginally larger than a penny. The camera is more cost-effective and produces much higher-resolution images than alternative technologies.

Tests have already established that TriEye's camera is effective and suitable for mass production, offering an accessible solution to a vast number of drivers. The effectiveness and accessibility of the product give it the potential to make a huge impact on driving in low visibility, potentially saving thousands of lives.

By integrating TriEye's camera and advanced sensing technology with ADAS and AVs, drivers can access high-resolution images of their surroundings in low-visibility conditions such as snow, fog, heavy rain, dust, or darkness.

The demand for ADAS is rapidly increasing. In 2019, the market was worth $30 billion, a figure expected to continue growing at a CAGR of 20.7%, reaching a value of $134.9 billion by 2027.
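As a quick sanity check, the quoted figures are internally consistent: compounding $30 billion at 20.7% annually over the eight years from 2019 to 2027 gives roughly $135 billion. A minimal sketch of that arithmetic:

```python
def project_value(start_billions: float, cagr: float, years: int) -> float:
    """Compound a starting market value at a fixed annual growth rate (CAGR)."""
    return start_billions * (1 + cagr) ** years

# ADAS market: $30B in 2019, 20.7% CAGR, projected to 2027
projected = project_value(30.0, 0.207, 2027 - 2019)
print(f"Projected 2027 market size: ${projected:.1f} billion")  # ≈ $135.1 billion
```

The small gap between this figure and the reported $134.9 billion is simply rounding in the published growth rate.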


With more ADAS-equipped cars added to the roads each year, car manufacturers are becoming increasingly aware of the need to integrate advanced sensing technology into their vehicles so that they can function optimally in a range of scenarios. While currently available systems may combine radar, lidar, and cameras, these systems are not fully functional in low-visibility conditions.

TriEye has solved this challenge with its complementary metal-oxide-semiconductor (CMOS)-based SWIR camera, which provides better visibility in adverse conditions. The camera offers a number of capabilities, including imaging in adverse weather and at night, as well as remote sensing. The technology also utilizes AI algorithms, integrates seamlessly with the car's systems, and can be mounted in an optimal position.

Known as the TriEye Raven, the Israeli startup's infrared sensing technology is making possible a future in which low visibility need not increase a driver's risk of a crash. The breakthrough the TriEye team achieved was using a CMOS-based sensor to enable short-wave infrared capabilities.

The technology has won two CES 2020 Innovation Awards in the categories of Embedded Technologies as well as Vehicle Intelligence and Transportation.

TriEye's SWIR Imaging Solution

Image Credit: TriEye Technologies/YouTube.com

Future Directions

TriEye is causing a shift in the automotive industry by moving away from the currently available systems that rely on lidar, radar, and standard vision cameras.

While SWIR cameras have long been used in the defense and aerospace industries, until now they have been too expensive to realistically incorporate into mass-market vehicles. TriEye has made this possible, potentially reducing car crashes and deaths significantly.

The future will likely see widespread adoption of TriEye's new technology, along with continuing development of its capabilities. In addition, the creators believe the technology has a future in applications outside of the automotive industry, such as in robotics, industry, agriculture, drones, maritime, and more.

References and Further Reading

ADAS Market by System, Component, Vehicle, Offering, EV and Region – Global Forecast to 2027. Markets and Markets. https://www.marketsandmarkets.com/Market-Reports/driver-assistance-systems-market-1201.html

Technology. TriEye. https://trieye.tech/technology/

Porsche looks to Israeli startup to bring better visibility to drivers. The Times of Israel. https://www.timesofisrael.com/porsche-looks-to-israeli-startup-to-bring-better-visibility-to-drivers/

Porsche invests in ‘low visibility’ sensor startup TriEye. TechCrunch. Kirsten Korosec. https://techcrunch.com/2019/08/21/porsche-invests-in-low-visibility-sensor-startup-trieye/

Nighttime driving: visual, lighting and visibility challenges. Wiley Online Library. Joanne M Wood. https://onlinelibrary.wiley.com/doi/full/10.1111/opo.12659

Association of reduced visibility with crash outcomes. Science Direct. Subasish Das. https://www.sciencedirect.com/science/article/pii/S0386111216300681

A study on crashes related to visibility obstruction due to fog and smoke. ResearchGate. Mohamed Abdel-Aty. https://www.researchgate.net/publication/51206008_A_study_on_crashes_related_to_visibility_obstruction_due_to_fog_and_smoke

TriEye Wins CES 2020 Innovation Awards in Two Categories. PR Newswire. https://www.prnewswire.com/il/news-releases/trieye-wins-ces-2020-innovation-awards-in-two-categories-300953957.html


Written by

Sarah Moore

After studying Psychology and then Neuroscience, Sarah quickly discovered her enjoyment of research and writing, which grew into a passion for connecting ideas with people through her work.


Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    TriEye. (2020, June 04). TriEye's Infrared Sensing for Increased Driver Visibility. AZoSensors. Retrieved on May 20, 2024 from https://www.azosensors.com/article.aspx?ArticleID=1952.

