This small drone repeatedly succeeded in tracking down the source of a chemical smell inside a large, multi-room building, using a stripped-down sensing package that relies on 'machine smell' first and brings in vision only at the very end.
The work, posted as an arXiv preprint, targets one of robotics' persistent challenges: odour plumes are messy. Airflow breaks them into intermittent filaments, ventilation and obstacles distort them, and flying robots add a further complication: the drone’s own propellers can stir the very air it is trying to read.
Many unmanned aerial vehicle (UAV) approaches therefore either build gas maps by exhaustively sampling an area or attempt reactive plume tracing with added infrastructure and sensing that can be hard to carry on lightweight platforms.
The UAV's Sense of Smell
In this study, the team built what they call a simulation-to-real “olfactory navigation stack” around a modified DJI Tello. The design uses stereo olfaction: two chemical sensors mounted on antennae-like arms so the drone can compare left-right signals rather than relying on a single reading.
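In code, that bilateral comparison might look something like the sketch below. The function name, concentration units, and deadband threshold are illustrative assumptions, not details from the paper:

```python
def steer_from_stereo(left_ppm: float, right_ppm: float,
                      deadband: float = 0.05) -> str:
    """Turn toward the stronger of two chemical readings.

    Hypothetical helper: the units (ppm) and the deadband value
    are illustrative, not taken from the paper.
    """
    diff = left_ppm - right_ppm
    if abs(diff) < deadband:   # readings roughly equal: hold course
        return "forward"
    return "left" if diff > 0 else "right"
```

A deadband like this keeps the drone from oscillating when the two antennae report nearly identical concentrations.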
They tested two sensor types, metal-oxide (MOX) and electrochemical (EC), to see whether the approach holds across different sensing media.
Alongside the smell sensors, the drone uses three infrared/time-of-flight units, two pointing down for stabilization and one forward for obstacle avoidance. An additional forward-facing camera is optional, reserved for confirmation close to the target.
Chemical readings are handled by a small olfactory processing unit built around an ESP32 microcontroller, paired with a PalmSens EmStat Pico potentiostat.
Data is preprocessed and streamed to a laptop ground station, reflecting the limited onboard compute available on the platform and keeping the sensor integration lightweight. The airframe was also mechanically adjusted and ballasted to remain stable under the added payload and wiring.
Not Just 'Following the Gradient'
Rather than building a full concentration map or depending on external localisation, the navigation logic fuses olfactory and inertial signals in an olfactory-inertial odometry (OIO) framework.
The paper emphasises that the control strategy is not simply “follow the gradient”. It works through a small set of plume-tracking behaviours: surging when odour dynamics indicate a favourable direction, casting when the signal is lost, pausing to sample, and landing when termination criteria are met.
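A behaviour set like that can be sketched as a small state machine. The transition logic below is a hypothetical simplification; the paper's actual triggers and thresholds are more involved:

```python
from enum import Enum, auto

class Mode(Enum):
    SURGE = auto()   # move while the odour trend is favourable
    CAST = auto()    # sweep side to side after losing the plume
    PAUSE = auto()   # hover in place to sample
    LAND = auto()    # termination criteria met

def next_mode(odour_present: bool, odour_rising: bool,
              at_source: bool) -> Mode:
    """Pick the next plume-tracking behaviour.

    Hypothetical transition rules for illustration only.
    """
    if at_source:
        return Mode.LAND
    if not odour_present:
        return Mode.CAST
    return Mode.SURGE if odour_rising else Mode.PAUSE
```

Structuring the controller as discrete behaviours, rather than a continuous gradient ascent, is what lets it cope with a signal that drops out entirely for stretches of a run.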
To handle the slow, noisy, drift-prone nature of chemical sensors, the system filters readings using Kalman filtering and a divergence signal that the authors describe as akin to temporal-difference ideas, while distinguishing it from reinforcement-learning “TD learning” in the classic sense.
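For the filtering step, a one-dimensional Kalman filter over a scalar concentration signal gives the flavour of what such smoothing does. The noise parameters `q` and `r` here are illustrative values, not the paper's tuning:

```python
def kalman_1d(readings, q=1e-3, r=0.25):
    """Smooth a noisy scalar signal with a 1-D Kalman filter
    (constant-state model).

    q (process noise) and r (measurement noise) are example
    values, not the authors' configuration.
    """
    x, p = readings[0], 1.0    # state estimate and its variance
    out = [x]
    for z in readings[1:]:
        p += q                 # predict: variance grows by process noise
        k = p / (p + r)        # Kalman gain
        x += k * (z - x)       # update toward the measurement
        p *= (1 - k)
        out.append(x)
    return out
```

Each smoothed value is a variance-weighted blend of the previous estimate and the new reading, which damps the sensor's jitter without ignoring genuine trends.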
The authors also validated the pipeline in two simulation environments, a Simulink-based flight/control model and a Python/Gymnasium framework for plume dynamics and policy evaluation.
Testing the UAV
The real test was conducted indoors across a multi-room course measuring roughly 200 square metres. An ethanol source, delivered via a diffuser and shaped by realistic airflow including fan-driven turbulence, was hidden in one room.
Runs followed a repeatable protocol, including controlled dispersal time and ventilation between trials to keep conditions comparable. Across 20 trials – five per task condition, spanning MOX versus EC sensing and runs with or without the optional camera – the drone located the source every time.
Did the Chosen Sensor Matter?
Performance varied by sensor type.
With olfaction alone, MOX trials averaged 98.38 seconds (σ = 14.84 s), while EC trials averaged 112.22 seconds (σ = 16.70 s).
Adding vision did not dramatically change the navigation strategy; it mainly shaved a few seconds off completion times by letting the drone stop promptly once it was close enough to confirm the source visually.
The paper argues this matters because odour cues can flatten near the target, making it easy for a robot to dither in the final metres as the plume becomes more uniform.
Moving Forward, and the Limits that Need Solving
The authors are careful about limits. The experiments focus on ethanol in a specific indoor environment, with a small number of runs per condition.
They also report that the added payload introduced inertial measurement unit bias, complicating long-term positional tracking, which is one reason they avoid map-building and rely instead on odour-driven behaviours and a statistical termination criterion.
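The paper does not spell out the form of that termination criterion, but one plausible stand-in is a plateau test: declare arrival when recent readings are consistently high and nearly flat. The thresholds below are entirely hypothetical:

```python
from statistics import mean, pstdev

def should_land(window, level=0.9, cv_max=0.05):
    """Example termination test: land when the recent reading
    window is both high and stable.

    A hypothetical stand-in for the paper's statistical
    criterion; `level` and `cv_max` are made-up thresholds.
    """
    m = mean(window)
    if m < level:              # signal not strong enough yet
        return False
    # coefficient of variation: relative spread of the window
    return (pstdev(window) / m) <= cv_max
```

A check like this sidesteps drifting position estimates entirely, since it keys off the odour signal rather than the drone's location.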
Even with those constraints, the study’s broader aim is pragmatic: a reproducible, open-source platform that other groups can rerun, extend, and stress-test.
The team frames this as an early demonstration of combining stereochemical sensing with optional vision on a UAV across two different sensor types, pointing to a route toward odour-guided autonomy that does not depend on heavy sensing or external infrastructure.
Journal Reference
France K. K., et al. (2026). Chasing Ghosts: A Simulation-to-Real Olfactory Navigation Stack with Optional Vision Augmentation. arXiv preprint arXiv:2602.19577v1.