A Drone that 'Smells' Successfully Finds Chemical Source

*Important notice: This article reports on a preprint that has not yet undergone peer review. arXiv publishes preliminary scientific reports that are not peer-reviewed and should therefore not be regarded as conclusive or treated as established information.

Without a map or GPS, a lightweight drone can track an ethanol plume through a 200 m² indoor course and land at the source.

Image: a small drone flying indoors. Study: Chasing Ghosts: A Simulation-to-Real Olfactory Navigation Stack with Optional Vision Augmentation. Image Credit: KeyStock/Shutterstock.com

This small drone repeatedly succeeded in tracking down the source of a chemical smell inside a large, multi-room building, using a stripped-down sensing package that relies on 'machine smell' first and brings in vision only at the very end.

The work, posted as an arXiv preprint, targets one of robotics' persistent challenges: odour plumes are messy. Airflow breaks them into intermittent filaments, ventilation and obstacles distort them, and flying robots face an extra complication: the drone's own propellers stir the very air it is trying to read.


Many unmanned aerial vehicle (UAV) approaches therefore either build gas maps by exhaustively sampling an area or attempt reactive plume tracing with added infrastructure and sensing that lightweight platforms struggle to carry.

The UAV's Sense of Smell

In this study, the team built what they call a simulation-to-real “olfactory navigation stack” around a modified DJI Tello. The design uses stereo olfaction: two chemical sensors mounted on antennae-like arms so the drone can compare left-right signals rather than relying on a single reading.
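The idea behind stereo olfaction can be illustrated with a minimal sketch (not the authors' code; the function name, readings, and deadband threshold are hypothetical): the drone compares the two readings and turns toward the stronger one, holding its heading when the difference is too small to trust.

```python
def stereo_turn_direction(left: float, right: float, deadband: float = 0.05) -> int:
    """Compare left/right chemical readings and return a steering cue.

    Returns -1 to turn left (stronger odour on the left), +1 to turn
    right, or 0 to hold heading when the difference falls inside the
    deadband and is likely just sensor noise.
    """
    diff = left - right
    if abs(diff) <= deadband:
        return 0
    return -1 if diff > 0 else 1
```

The deadband matters on real hardware: without it, noise in two nominally identical sensors would make the drone wiggle constantly.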

They tested two sensor types, metal-oxide (MOX) and electrochemical (EC), to see whether the approach holds across different sensing media.

Alongside the smell sensors, the drone uses three infrared/time-of-flight units, two pointing down for stabilization and one forward for obstacle avoidance. An additional forward-facing camera is optional, reserved for confirmation close to the target.

Chemical readings are handled by a small olfactory processing unit built around an ESP32 microcontroller, paired with a PalmSens EmStat Pico potentiostat.

Data is preprocessed and streamed to a laptop ground station, reflecting the limited onboard compute available on the platform and keeping the sensor integration lightweight. The airframe was also mechanically adjusted and ballasted to remain stable under the added payload and wiring.

Not Just 'Following the Gradient'

Rather than building a full concentration map or depending on external localisation, the navigation logic fuses olfactory and inertial signals in an olfactory-inertial odometry (OIO) framework.

The paper emphasises that the control strategy is not simply “follow the gradient”. It works through a small set of plume-tracking behaviours: surging when odour dynamics indicate a favourable direction, casting when the signal is lost, pausing to sample, and landing when termination criteria are met.
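The behaviour set reads like a small state machine. A hedged sketch of that structure follows (the mode names come from the article; the transition flags and their priority order are illustrative assumptions, not the paper's actual policy):

```python
from enum import Enum, auto

class Mode(Enum):
    SURGE = auto()   # odour dynamics indicate a favourable direction
    CAST = auto()    # signal lost: sweep crosswind to reacquire it
    PAUSE = auto()   # hover and sample to let slow sensors settle
    LAND = auto()    # termination criteria met

def next_mode(odour_present: bool, odour_rising: bool, near_source: bool) -> Mode:
    """Pick the next plume-tracking behaviour from simple odour flags.

    Priority: terminate if the source is confirmed, cast if the plume
    is lost, surge while the signal improves, otherwise pause to sample.
    """
    if near_source:
        return Mode.LAND
    if not odour_present:
        return Mode.CAST
    if odour_rising:
        return Mode.SURGE
    return Mode.PAUSE
```

The surge/cast pairing is a classic insect-inspired strategy: commit while the cue improves, and fall back to systematic search the moment it disappears.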

To handle the slow, noisy, drift-prone nature of chemical sensors, the system filters readings using Kalman filtering and a divergence signal that the authors describe as akin to temporal-difference ideas, while distinguishing it from reinforcement-learning “TD learning” in the classic sense.
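A scalar Kalman filter of the kind the system could apply to a single sensor channel can be sketched as follows (a generic textbook filter, not the authors' implementation; the variance values are illustrative):

```python
def kalman_smooth(readings, process_var=1e-3, meas_var=0.04):
    """Smooth a noisy, drift-prone sensor trace with a 1-D Kalman filter.

    process_var: how much the true concentration may change per step.
    meas_var: expected variance of the sensor noise.
    """
    x, p = readings[0], 1.0   # state estimate and its variance
    out = []
    for z in readings:
        p += process_var              # predict: uncertainty grows
        k = p / (p + meas_var)        # Kalman gain
        x += k * (z - x)              # update toward measurement z
        p *= (1.0 - k)                # uncertainty shrinks after update
        out.append(x)
    return out
```

Tuning the two variances trades responsiveness against smoothing: a small `meas_var` tracks the raw signal closely, while a large one suppresses the intermittent spikes typical of turbulent plumes.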

The authors also validated the pipeline in two simulation environments, a Simulink-based flight/control model and a Python/Gymnasium framework for plume dynamics and policy evaluation.

Testing the UAV

The real test was conducted indoors across a multi-room course measuring roughly 200 square metres. An ethanol source, delivered via a diffuser and shaped by realistic airflow including fan-driven turbulence, was hidden in one room.

Runs followed a repeatable protocol, including controlled dispersal time and ventilation between trials to keep conditions comparable. Across 20 trials – five per task condition, spanning MOX versus EC sensing and runs with or without the optional camera – the drone located the source every time.

Did the Chosen Sensor Matter?

Performance varied by sensor type.

With olfaction alone, MOX trials averaged 98.38 seconds (σ = 14.84 s), while EC trials averaged 112.22 seconds (σ = 16.70 s).

Adding vision did not change the navigation strategy much, but it tended to shave a few seconds off completion times by letting the drone stop promptly once it was close enough to confirm the source visually.

The paper argues this matters because odour cues can flatten near the target, making it easy for a robot to dither in the final metres as the plume becomes more uniform.

Moving Forward, and the Limits that Need Solving

The authors are careful about limits. The experiments focus on ethanol in a specific indoor environment, with a small number of runs per condition.

They also report that the added payload introduced inertial measurement unit bias, complicating long-term positional tracking, which is one reason they avoid map-building and rely instead on odour-driven behaviours and a statistical termination criterion.

Even with those constraints, the study’s broader aim is pragmatic: a reproducible, open-source platform that other groups can rerun, extend, and stress-test.

The team frames this as an early demonstration of combining stereochemical sensing with optional vision on a UAV across two different sensor types, pointing to a route toward odour-guided autonomy that does not depend on heavy sensing or external infrastructure.

Journal Reference

France K. K., et al. (2026). Chasing Ghosts: A Simulation-to-Real Olfactory Navigation Stack with Optional Vision Augmentation. arXiv preprint arXiv:2602.19577v1.  

Written by

Dr. Noopur Jain

Dr. Noopur Jain is an accomplished Scientific Writer based in the city of New Delhi, India. With a Ph.D. in Materials Science, she brings a depth of knowledge and experience in electron microscopy, catalysis, and soft materials. Her scientific publishing record is a testament to her dedication and expertise in the field. Additionally, she has hands-on experience in the field of chemical formulations, microscopy technique development and statistical analysis.    

