A new sensor design from King’s College London that levitates numerous glass microparticles could transform sensing accuracy and efficiency, laying the groundwork for advances in autonomous vehicles, navigation systems and even the detection of dark matter.
Levitating sensors typically work by isolating minute particles and measuring how external forces, such as acceleration, disturb them. The more particles there are to be disturbed, and the better they are isolated from their surroundings, the more precise the sensor becomes.
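To give a rough sense of that scaling, the toy calculation below (not taken from the study; all numbers are made up) averages simulated readings from independent, identically noisy particles and shows the estimate of a shared signal tightening roughly with the square root of the particle count.

```python
# Toy calculation, not from the study: averaging independent particle readings
# suppresses random noise roughly as 1/sqrt(N), one reason larger, better-isolated
# particle arrays make more precise sensors. All numbers are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
true_acceleration = 1.0        # the shared signal, in arbitrary units
noise_per_particle = 0.5       # assumed identical, independent noise per particle

for n in (1, 10, 100):
    # 10,000 simulated measurements, each averaging the readings of n particles.
    readings = true_acceleration + noise_per_particle * rng.standard_normal((10_000, n))
    scatter = readings.mean(axis=1).std()   # spread of the averaged estimate
    print(f"{n:>3} particles -> scatter of the estimate ~ {scatter:.3f}")
```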
The King’s College design overcomes a long-standing limitation of earlier devices by allowing clouds of many particles, each acting as a sensor, to be tracked and controlled precisely. Until now, researchers had to choose between monitoring one object quickly and monitoring many objects slowly.
Often unseen, sensors lie at the heart of much of modern technology and science. More accurate sensors would let autonomous vehicles find their way around far more reliably, detecting minute changes in acceleration and providing self-contained navigation that is not beholden to unreliable satellite connections.
“By levitating microparticles in a vacuum, we’ve created a tiny sensor of incredible sensitivity. Our use of cutting-edge technology inspired by how the brain interprets vision allows us to control the motion of the sensors at high speed, which, through a process of cooling, would allow us to exploit properties of quantum mechanics to make our sensor even more sensitive. This would enable us to probe the incredibly weak forces involved in detecting gravitational waves or dark matter in the lab.”
James Millen, Director, Quantum Research Centre, King’s College London
The study uses a neuromorphic, or brain-inspired, Event Vision camera to detect the movement of an array of microparticles suspended in electromagnetic fields. Because it registers only microparticle motion, rather than capturing full video frames of the entire field of view, the Event Vision camera collects only the information it needs.
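The sketch below is a simplified, hypothetical illustration of how such a sparse event stream can be turned into particle positions; it is not the authors’ pipeline, and the function name, assignment rule and coordinates are invented for the example. Each event is assigned to the nearest previously known particle, and the events belonging to each particle are averaged to update its position.

```python
# Hedged sketch, not the authors' pipeline: an event camera outputs a sparse
# stream of (x, y, timestamp, polarity) events only where brightness changes,
# e.g. around a moving microparticle, instead of full frames. Given rough trap
# locations, events in a short time window can be assigned to the nearest
# particle and averaged to update each particle's position estimate.
import numpy as np

def update_positions(events_xy: np.ndarray, prev_positions: np.ndarray) -> np.ndarray:
    """events_xy: (N, 2) pixel coordinates of events in one time window.
    prev_positions: (P, 2) last known position of each particle.
    Returns an updated (P, 2) position estimate."""
    # Assign each event to the closest previously known particle position.
    dists = np.linalg.norm(events_xy[:, None, :] - prev_positions[None, :, :], axis=2)
    owner = dists.argmin(axis=1)
    new_positions = prev_positions.copy()
    for p in range(len(prev_positions)):
        mine = events_xy[owner == p]
        if len(mine):
            new_positions[p] = mine.mean(axis=0)   # centroid of this particle's events
    return new_positions

# Toy burst of events around two particles that have drifted slightly:
rng = np.random.default_rng(1)
events = np.vstack([rng.normal([21.0, 30.5], 1.5, (200, 2)),
                    rng.normal([79.0, 75.5], 1.5, (200, 2))])
print(update_positions(events, prev_positions=np.array([[20.0, 30.0], [80.0, 75.0]])))
```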
An integrated AI algorithm then lets researchers track the particles' motion, both individually and collectively as a single cloud, and so determine all the forces acting on them with unparalleled accuracy.
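The toy example below (again, not the paper’s analysis; all signals, noise levels and units are invented) shows why both views matter: a force common to every particle, such as an acceleration of the whole apparatus, appears in the cloud’s center-of-mass motion, while a force on a single particle stands out in its motion relative to that average.

```python
# Toy example, not the paper's analysis: a force felt by every particle (here a
# sinusoidal "platform acceleration") shows up in the cloud's center-of-mass
# motion, while a force on a single particle stands out relative to that average.
# All signals, noise levels, and units are invented for illustration.
import numpy as np

t = np.linspace(0, 1, 1000)
common_signal = 1e-3 * np.sin(2 * np.pi * 5 * t)   # displacement from a shared force
local_kick = np.zeros_like(t)
local_kick[500:] = 1e-3                            # disturbance hitting particle 0 only

rng = np.random.default_rng(2)
positions = np.stack([common_signal + rng.normal(0, 1e-4, t.size) for _ in range(10)])
positions[0] += local_kick

center_of_mass = positions.mean(axis=0)   # tracks the force common to the whole cloud
relative = positions - center_of_mass     # exposes forces acting on individual particles

print("peak-to-peak center-of-mass motion:", np.ptp(center_of_mass))
print("largest single-particle deviation: ", np.abs(relative).max())
```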
Because this approach produces so little data, the authors can generate real-time feedback signals to control the motion of each particle in the array. Regulating the microparticles' motion reduces their energy, effectively cooling and stabilizing their movement.
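A minimal sketch of this kind of feedback cooling, often called cold damping, appears below. It assumes a simple harmonic trap with made-up frequencies and gains, and it is far cruder than the published control scheme: the point is only that a force opposing the measured velocity steadily drains the energy of the particle’s motion.

```python
# Minimal sketch of feedback ("cold damping") cooling, with made-up numbers;
# the published control scheme is more sophisticated. A levitated particle in
# a trap behaves like a harmonic oscillator, and applying a force that opposes
# its measured velocity steadily drains the energy of its motion.
import numpy as np

omega = 2 * np.pi * 1e3   # trap frequency in rad/s (assumed)
gamma_fb = 2e3            # feedback damping rate in 1/s (assumed)
dt = 1e-6                 # control-loop update interval in s

def energy(x, v):
    return 0.5 * v**2 + 0.5 * omega**2 * x**2   # motional energy per unit mass

x, v = 1e-6, 0.0          # initial displacement (m) and velocity (m/s)
prev_x = x
print(f"energy before feedback: {energy(x, v):.3e}")

for _ in range(20_000):                   # 20 ms of feedback
    v_est = (x - prev_x) / dt             # velocity estimated from successive positions
    prev_x = x
    a = -omega**2 * x - gamma_fb * v_est  # trap restoring force + damping feedback
    v += a * dt
    x += v * dt

print(f"energy after feedback:  {energy(x, v):.3e}")
```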
The minimal energy consumption of these devices presents significant opportunities for scaling the number of levitated particles and integrating the technology onto chips.
Dr. Yugang Ren, formerly a postdoctoral researcher at King’s and the study's first author, noted: “Because of the low power usage of both our imaging technology and the algorithms we use to track the sensors, implementation onto computer chips could be possible in the next five to ten years. This means everything from environmental monitoring to consumer electronics could benefit from more accurate sensing – whether that be of harmful gases or keeping track of where we are.”
“In the future, our approach could help cool particles to below a thousandth of a degree above absolute zero, the lowest possible temperature allowed by quantum physics, eliminating the thermal noise and vibrations which get in the way of a sensor’s accuracy. This would produce a quantum sensor with an accuracy and sensitivity unparalleled by the classical technology we use today.”
Dr. Yugang Ren, Postdoctoral Researcher, King’s College London
Journal Reference:
Ren, Y., et al. (2025) Neuromorphic detection and cooling of microparticles in arrays. Nature Communications. DOI:10.1038/s41467-025-65677-0. https://www.nature.com/articles/s41467-025-65677-0