
Robotic Navigation Cane with 3D Camera Helps Visually Impaired

Researchers at Virginia Commonwealth University have developed a robotic cane with a 3D visual positioning system and camera that offers new hope to visually impaired and blind people.


Image Credit: Shutterstock.com/ Tracy Spohn

A robotic cane fitted with an on-board computer, a 3D camera, and an inertial positioning sensor opens new avenues for indoor navigation for the blind and visually impaired.

The on-board computer is pre-programmed with a building's interior layout. It guides the user with auditory cues while steering them around stationary obstacles. The project was co-funded by the National Institutes of Health's National Eye Institute (NEI) and the National Institute of Biomedical Imaging and Bioengineering (NIBIB).

For sighted people, technologies like GPS-based applications have revolutionized navigation. We’re interested in creating a device that closes many of the gaps in functionality for white cane users. 

Cang Ye, Professor of Computer Science, Virginia Commonwealth University

Personal Navigation Devices: A Short Roadmap

Early navigators relied on simple observations of their surroundings to reach their destinations. It wasn't until the advent of astronomy and the invention of the compass in the 11th century that humankind turned to technology to navigate, and only in the digital age did the possibilities really open up.

It wasn't until 1989 that real-time navigation became possible. James Davis and Christopher Schmandt, researchers at the Massachusetts Institute of Technology's Media Laboratory, built a system on an NEC machine running Symbolics Lisp software and communicating over a cellular modem. It delivered its directions through a speech synthesizer.4

Then in 1994, TomTom began developing navigation applications for Personal Digital Assistants (PDAs). The company was the first to combine touchscreen technology with a GPS receiver, maps, and software in a portable navigation device, bringing real-time navigation to consumers.

Today, more than a billion people use Google Maps on their smartphones every month, and Apple Maps has a similar number of users.

Robotic Navigation Aids

We're all familiar with smartphone-based navigation applications. There are also specialized applications that help the visually impaired navigate particular challenges, such as crossing the road at a crosswalk. Navigating indoors, however, remains a challenge.

Prof. Cang Ye of Virginia Commonwealth University's College of Engineering tackled the issue by incorporating floor plans into his robotic navigation canes.

The user directed the cane using voice commands, and the cane guided the user with its in-built robotic rolling tip. However, as walking distances increased, positioning errors accumulated and could eventually steer the user off course.

Prof. Ye and his team built a robotic navigation aid (RNA) prototype, installing an Intel RealSense D435 RGB-D camera and a VectorNav VN-100 inertial measurement unit (IMU). The 3D camera works in a similar way to the depth sensor in a smartphone's front-facing camera: it uses infrared light to calculate the distance to objects.
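The basic principle behind this kind of depth sensing is triangulation: a feature seen from two slightly offset infrared viewpoints shifts by a disparity that is inversely proportional to its distance. The sketch below illustrates the idea only; the focal length, baseline, and disparity values are illustrative, not the D435's actual calibration.

```python
# Illustrative sketch of stereo depth triangulation, as used by active
# IR stereo cameras. Values below are hypothetical, not real calibration.

def depth_from_disparity(disparity_px, focal_px=640.0, baseline_m=0.05):
    """Triangulate distance from the pixel disparity between the
    left and right infrared images: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # no match: object too far or occluded
    return focal_px * baseline_m / disparity_px

# A feature seen 16 pixels apart in the two IR images:
print(depth_from_disparity(16.0))  # 640 * 0.05 / 16 = 2.0 m away
```

Closer objects produce larger disparities, which is why depth precision degrades with range.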

They also developed a depth-enhanced visual-inertial odometry (DVIO) method that estimates the cane's pose in six degrees of freedom. It combines a feature tracker, a floor detector, and a state estimator to determine positioning accurately, allowing the on-board computer to pinpoint the user's location relative to the pre-loaded floor plan.
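The core idea of any visual-inertial odometry scheme is that the IMU predicts motion at a high rate but drifts, while visual measurements arrive less often but correct that drift. The toy one-dimensional filter below is a simplified illustration of that fusion loop, not the authors' DVIO algorithm, which estimates a full 6-DoF pose; all numbers are made up for demonstration.

```python
# Hedged sketch: fusing drifting IMU dead-reckoning with periodic
# visual fixes, the basic pattern behind visual-inertial odometry.

def fuse(imu_pose, visual_pose, gain=0.5):
    """Pull the IMU estimate partway toward the camera measurement."""
    return imu_pose + gain * (visual_pose - imu_pose)

position = 0.0                # estimated position along a corridor (m)
for step in range(50):
    position += 0.11          # IMU integration, slightly biased (true step: 0.10)
    if step % 10 == 9:        # a visual fix arrives every 10th step
        true_pos = (step + 1) * 0.10
        position = fuse(position, true_pos)

# After 50 steps, pure IMU integration would read 5.5 m versus a true
# 5.0 m; the periodic visual corrections keep the estimate close to 5.0.
```

Real systems replace this blend with a Kalman-style state estimator, but the prediction-correction structure is the same.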

In essence, the system creates a 3D map of the user’s surroundings in real-time. This map is then superimposed onto the pre-loaded two-dimensional floor plan of static objects.
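One simple way to superimpose a live 3D map onto a 2D floor plan is to flatten the point cloud: discard height and mark grid cells that contain points within the user's vertical clearance. The sketch below is a hypothetical illustration of that step (not the authors' code); note that an overhanging object at head height, invisible to a standard white cane, still registers as an obstacle.

```python
# Illustrative sketch: flattening a 3-D point cloud (x, y, z in metres,
# z up) onto a 2-D occupancy grid aligned with a floor plan.

def project_to_floor_plan(points, cell=0.1, min_h=0.1, max_h=2.0):
    """Return the set of grid cells occupied by points within the
    user's vertical clearance band; floor and ceiling are ignored."""
    occupied = set()
    for x, y, z in points:
        if min_h <= z <= max_h:
            occupied.add((round(x / cell), round(y / cell)))
    return occupied

cloud = [(1.0, 2.0, 0.05),   # floor point: ignored
         (1.0, 2.0, 1.8),    # overhanging sign at head height
         (0.5, 0.5, 0.9)]    # chair
print(sorted(project_to_floor_plan(cloud)))  # [(5, 5), (10, 20)]
```

The occupied cells can then be compared against the static floor plan to flag obstacles that the plan does not already contain.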

While some cell phone apps can give people auditory navigation instructions, when going around a corner for example, how do you know you’ve turned just the right amount?...The rolling tip on our robotic cane can guide you to turn at just the right point and exactly the right number of degrees, whether it’s 15 degrees or 90. This version can also alert you to overhanging obstacles, which a standard white cane cannot.

Cang Ye, Professor of Computer Science, Virginia Commonwealth University

A few technical challenges remain before the robotic cane is ready for production. In the meantime, the device can switch between robotic and non-robotic modes; in the latter, it functions as a standard white cane. It heralds a turning point in the search for independence for the visually impaired and blind, who have long relied on white canes to navigate their daily lives.

References and Further Reading

Ye, C., et al. (2021). An RGB-D camera based visual positioning system for assistive navigation by a robotic navigation aid. IEEE/CAA Journal of Automatica Sinica, 8(8), 1389-1400. doi:10.1109/JAS.2021.1004084

National Eye Institute (2021). NIH-funded modern "white cane" brings navigation assistance to the 21st century. [online] Available at: https://www.nei.nih.gov/about/news-and-events/news/nih-funded-modern-white-cane-brings-navigation-assistance-21st-century

Davis, J. and Schmandt, C. (1989). Synthetic Speech for Real Time Direction-Giving. MIT Media Laboratory. [online] Available at: https://www.media.mit.edu/speech/papers/1989/schmandt_IEEE89_synthetic_speach_for_real_time_direction-giving.pdf


Written by

William Alldred

William Alldred is a freelance B2B writer with a bachelor's degree in Physics from Imperial College, London. William is a firm believer in the power of science and technology to transform society, and is committed to distilling complex ideas into compelling narratives. William's interests include Particle & Quantum Physics, Quantum Computing, Blockchain Computing, Digital Transformation and Fintech.

