
Review: Adaptive Microscopy Could Make Biological Imaging More Precise

Smart microscopes can make decisions based on real-time feedback: a new review argues they could change how biologists capture life at its most dynamic.

Study: Smart microscopy: adaptive microscope control to improve the way we see life. Image Credit: viviana loza/Shutterstock.com

Most microscopes are passive recording tools: they record whatever they are set up to see. In modern life-science imaging, that is beginning to change with a growing class of instruments called smart microscopes, which can respond during an experiment, adjusting how they image based on what the sample is doing.

A review in npj Imaging sets out a practical framework for what counts as 'smart microscopy': microscopes that combine real-time image analysis, feedback, and automated control to alter imaging conditions while data are still being collected.


Traditional automation can set up an experiment in advance, using preselected parameters to run imaging with minimal user input. Smart microscopy takes this a step further. Its defining feature is adaptation: a smart microscope analyzes incoming data, decides whether something needs to change, and then acts on that decision during the experiment.

That capability rests on several advances. Faster computing, more sensitive detectors, better motorized hardware, and fluorescent labeling tools such as green fluorescent protein have all made it easier to monitor living systems in detail.

Open-source software platforms, such as Micro-Manager, have also helped researchers connect hardware, automate workflows, and build more flexible imaging systems.

The review looks at sensors within this broader decision-making framework rather than as the whole story. Cameras, photodetectors, and related hardware provide the signals that feed the system, while control algorithms and actuators determine how the microscope responds.

Improvements in sensitivity, speed, and dynamic range have been especially important for live imaging, where researchers need to collect useful data without causing too much phototoxicity or photobleaching.

In adaptive optics, wavefront correction tools such as deformable mirrors can also be used to counter aberrations, improving image quality in challenging conditions, such as deep-tissue imaging.

Smart Microscopy and Experiment Design

The review argues that smart microscopy should be thought of as a way of designing experiments, rather than as a single technology.

Instead of collecting as much data as possible and deciding later what matters, these systems can make that judgment during acquisition. That can mean protecting fragile samples, prioritizing rare events, or keeping a moving target in view over long periods.

The authors group smart microscopy into five broad categories: quality-driven, event-driven, target-driven, information-driven, and outcome-driven. The categories are based on the experimental goal, not on any one imaging modality, and a single system can span more than one of them.

Quality-driven microscopy focuses on maintaining or improving image quality while data are being acquired.

In one example discussed in the review, Royer and colleagues’ AutoPilot system for light-sheet microscopy adjusts the position and angle of illumination and detection in living organisms to keep the light sheet properly aligned, compensating for specimen heterogeneity and movement.

Event-driven microscopy shifts the emphasis from constant observation to selective attention. Rather than imaging continuously at high speed, the system watches for a biologically important event and only intensifies acquisition when that event occurs.

The review highlights work by Mahecic and colleagues, who used deep neural networks to identify mitochondrial division events in COS-7 cells. Their approach began with slower imaging to limit light exposure, then switched to faster imaging once a division was detected.
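The slow-then-fast strategy described above can be illustrated with a minimal control loop. This is a sketch, not the published implementation: the frame source, the event detector (a trained neural network in the actual work), and the interval and burst values are all placeholders chosen for illustration.

```python
import numpy as np

SLOW_INTERVAL = 2.0   # seconds between frames while nothing is happening
FAST_INTERVAL = 0.1   # burst rate once an event is detected
BURST_FRAMES = 20     # how many fast frames to capture per event

def acquire_frame():
    """Stand-in for the camera: returns a 2D intensity image."""
    return np.random.rand(64, 64)

def event_score(frame):
    """Stand-in for the trained detector: probability an event is underway."""
    return float(frame.mean())  # placeholder heuristic, not a real model

def event_driven_loop(n_frames=10, threshold=0.9):
    """Image gently by default; switch to burst mode when an event fires."""
    interval = SLOW_INTERVAL
    burst_left = 0
    frames = []
    for _ in range(n_frames):
        frame = acquire_frame()
        frames.append(frame)
        if burst_left > 0:
            burst_left -= 1
            if burst_left == 0:
                interval = SLOW_INTERVAL  # event over: drop back to slow imaging
        elif event_score(frame) > threshold:
            interval = FAST_INTERVAL      # event detected: start a burst
            burst_left = BURST_FRAMES
        # a real system would pace acquisition here, e.g. time.sleep(interval)
    return frames
```

The key design point is that the light budget is spent where the biology is: the sample sees low illumination most of the time, and high-rate imaging only around detected events.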

Target-driven microscopy is designed to keep a biological feature, object, or region of interest in frame and in focus over time. As an early example, Rabut and Ellenberg tracked single cells in 3D by using image data to estimate position and maintain focus throughout the experiment.
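The tracking idea can be sketched in a few lines: estimate the object's position from the image and compute the stage move that would recenter it. The centroid estimator and the `gain` parameter are illustrative assumptions, not the method used by Rabut and Ellenberg.

```python
import numpy as np

def centroid(image):
    """Intensity-weighted centroid of a 2D image, as (row, col)."""
    total = image.sum()
    rows, cols = np.indices(image.shape)
    return (rows * image).sum() / total, (cols * image).sum() / total

def stage_correction(image, gain=1.0):
    """Offset (dy, dx) that would move the bright feature toward the
    image center; a real system would send this to a motorized stage."""
    cy, cx = centroid(image)
    ty, tx = (image.shape[0] - 1) / 2, (image.shape[1] - 1) / 2
    return gain * (ty - cy), gain * (tx - cx)

# A bright spot near the top-left corner yields a positive correction
# toward the center on both axes.
img = np.zeros((65, 65))
img[10, 10] = 1.0
dy, dx = stage_correction(img)
```

Repeating this estimate-and-correct step each frame keeps the target in view; the same logic extends to focus by estimating axial position from image sharpness.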

The final two categories move closer to decision-making at the experimental level.

Information-driven microscopy uses existing knowledge to decide which new information to collect next, helping avoid redundant imaging. Outcome-driven microscopy goes a step further by placing imaging within a closed-loop system that uses feedback and perturbations to push biology toward a defined state.

In these cases, machine learning models may be used to infer biological states from image data and to adjust stimulation or experimental conditions accordingly.

Technical Challenges of Implementation

Behind those experimental goals, the review describes a broader implementation framework. A smart microscope typically integrates image acquisition, information extraction, control logic, and actuation.

Real-time analysis methods can include segmentation, detection, prediction, tracking, and classification. Those outputs then feed into control strategies, whether open-loop, closed-loop, or adaptive, before being translated into actions by actuators. The review groups those actuators into categories including optomechanics, perturbations, storage, and communication.
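The four-layer architecture the review describes, acquisition feeding extraction feeding control feeding actuation, can be sketched as a pipeline of swappable components. The component names and the toy focus rule below are illustrative assumptions, not the review's terminology mapped onto any real instrument.

```python
from dataclasses import dataclass
from typing import Callable
import numpy as np

@dataclass
class SmartScope:
    """One pass through an acquire -> extract -> decide -> act loop."""
    acquire: Callable[[], np.ndarray]        # image acquisition
    extract: Callable[[np.ndarray], float]   # information extraction
    decide: Callable[[float], str]           # control logic
    act: Callable[[str], None]               # actuation

    def step(self) -> str:
        frame = self.acquire()
        measurement = self.extract(frame)
        command = self.decide(measurement)
        self.act(command)
        return command

log = []
scope = SmartScope(
    acquire=lambda: np.ones((8, 8)) * 0.2,                 # stand-in camera
    extract=lambda f: float(f.mean()),                     # e.g. a focus score
    decide=lambda m: "refocus" if m < 0.5 else "hold",     # closed-loop rule
    act=log.append,                                        # stand-in actuator
)
scope.step()  # mean 0.2 < 0.5, so the controller commands a refocus
```

Keeping the layers separate is what makes the system modular: the same control logic can drive different actuators, and the same camera can feed different analysis methods.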

That architecture makes clear why building a genuinely smart microscope remains difficult. The challenge is not only collecting images, but linking multiple hardware and software layers into a system that can operate fast enough, reliably enough, and transparently enough for experimental use.

Interoperability remains a major obstacle. Imaging platforms often combine components from different manufacturers, each with its own software environments, metadata standards, and control logic.

Real-time processing is another challenge, particularly as high-speed and multispectral imaging produce increasingly large volumes of data. Those demands increase the need for storage, computational power, and AI-assisted analysis.

The review also notes that biological variability can complicate feedback-based imaging. A system that performs well on one sample may not generalize neatly to another, reinforcing the need for modular tools and experiment-specific tuning.

And as artificial intelligence becomes more deeply embedded in microscope control, the authors point to a further issue: researchers will need to navigate both user bias and algorithmic bias. Explainable AI, they suggest, could help make these systems easier to interpret and trust.

A Field Still Taking Shape

Although the technical hurdles are substantial, the review presents smart microscopy as a field moving from isolated demonstrations toward a more organized research community.

Open-source development, shared software infrastructure, and initiatives such as smartmicroscopy.org are helping researchers exchange tools and methods. Collaborations linked to Euro-BioImaging are also working to improve standards and lower adoption barriers.

That broader push may prove as important as any single imaging advance. Smart microscopy depends not just on better instruments, but on systems that are interoperable, usable, and accessible beyond specialist engineering groups.

Conclusion

The review presents smart microscopy as a shift in how imaging experiments are conducted: away from passive data collection and toward adaptive systems that respond to biology as it unfolds.

In combining real-time analysis, feedback, and automated control, these microscopes can improve image quality, reduce unnecessary light exposure, and capture short-lived events more efficiently.

Just as importantly, the framework offered by the authors provides the field with clearer language for describing what different smart microscopy systems aim to achieve and how they do so. If those tools become easier to build and use, adaptive imaging could become a much more routine part of biological research rather than a specialist capability.

Journal Reference

Rates A., et al. (2026). Smart microscopy: adaptive microscope control to improve the way we see life. npj Imaging 4, 14. DOI: 10.1038/s44303-026-00145-y


Written by

Dr. Noopur Jain

Dr. Noopur Jain is an accomplished scientific writer based in New Delhi, India. With a Ph.D. in Materials Science, she brings a depth of knowledge and experience in electron microscopy, catalysis, and soft materials. Her scientific publishing record is a testament to her dedication and expertise in the field. She also has hands-on experience in chemical formulations, microscopy technique development, and statistical analysis.

Citations

Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Jain, Noopur. (2026, March 10). Review: Adaptive Microscopy Could Make Biological Imaging More Precise. AZoSensors. Retrieved on March 10, 2026 from https://www.azosensors.com/news.aspx?newsID=16790.


