Researchers have combined UAV-based LiDAR, thermal, and multispectral data with stacking ensemble machine learning to predict corn aboveground biomass (AGB) across varying growth stages and field conditions.
They found that the fusion approach significantly outperformed traditional methods and volume-based models, particularly in later growth stages, producing more accurate, scalable, and spatially reliable predictions.
Behind the Paper
AGB is a critical indicator of crop health, growth, and yield potential. Reliable estimates support essential decisions in pest control, nutrient management, and productivity forecasting.
While traditional destructive sampling methods offer precision, they are too labor-intensive for large-scale applications. Satellite remote sensing provides broader coverage but is limited by atmospheric interference and lower resolution.
Unmanned aerial vehicles (UAVs) equipped with advanced sensors may be an alternative. LiDAR captures the three-dimensional structure of the crop canopy, multispectral (MS) sensors measure reflectance related to vegetation health, and thermal infrared (TIR) sensors detect canopy temperature and water stress.
When used together, these data streams significantly improve AGB prediction accuracy.
Smart Biomass Estimation
The study, published in Plant Phenomics, evaluated two advanced approaches to predicting corn AGB.
The first was a stacking ensemble machine learning model that integrates predictions from multiple algorithms; the second was the vegetation index-weighted canopy volume model (CVMVI), which combines canopy structure and spectral information.
Using UAV-mounted LiDAR, MS, and TIR sensors alongside ground measurements of AGB and leaf area index (LAI), the team assessed how well each method performed under different irrigation and fertilization conditions and at various corn growth stages. Their goal was to understand which method adapts best across agricultural environments and when each model is most effective.
Field Experiment and Sensor Data Collection
The study was conducted during the 2023 growing season in Henan Province, China. Researchers planted ten corn varieties across 180 plots using a factorial design that included four nitrogen fertilization levels and three irrigation treatments.
UAV flights were conducted during four key growth stages, capturing high-resolution data under cloud-free conditions to ensure consistent lighting.
Ground sampling was performed in tandem with aerial surveys. Representative plants were harvested and oven-dried to determine dry biomass, which was scaled to planting density to calculate AGB. LAI was measured using a canopy analysis system.
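The paper's exact scaling protocol isn't reproduced in this summary, but the unit conversion behind this step is simple. The sketch below, using illustrative per-plant mass and planting density values, shows how per-plant dry biomass becomes area-based AGB in tons per hectare.

```python
def agb_t_per_ha(dry_mass_g_per_plant: float, plants_per_m2: float) -> float:
    """Scale per-plant dry biomass (g) to area-based AGB (t/ha).

    g/plant * plants/m^2 gives g/m^2, and 1 g/m^2 = 0.01 t/ha.
    """
    return dry_mass_g_per_plant * plants_per_m2 * 0.01

# Illustrative values only: 150 g of dry matter per plant at a density
# of 7.5 plants per square meter corresponds to 11.25 t/ha of AGB.
print(agb_t_per_ha(150.0, 7.5))  # 11.25
```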
Due to logistical constraints, destructive biomass sampling was limited to ten plots in some irrigation zones, which may have affected model calibration in those areas.
Full Sensor Fusion Delivers Best Results
Among all models tested, the stacking deep neural network (StackingDNN), which fused the full set of MS, LiDAR, TIR, and LAI features, delivered the highest accuracy. It achieved a coefficient of determination (R²) of 0.86, with a mean absolute error (MAE) of 1.54 tons per hectare and a root mean square error (RMSE) of 2.06 tons per hectare.
When using only MS features, the same model achieved an R² of 0.75, while the random forest algorithm performed best on LiDAR data alone with an R² of 0.78. In all cases, the stacking ensemble approach outperformed individual machine learning models, especially when data from multiple sensors were integrated.
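The paper's StackingDNN architecture isn't specified in this summary, so the sketch below uses scikit-learn's generic StackingRegressor as a loose analogue: tree- and linear-based base learners feed out-of-fold predictions to a small neural-network meta-learner, and the same three metrics reported above are computed on held-out data. The feature matrix is synthetic and merely stands in for the fused MS, LiDAR, TIR, and LAI features.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for per-plot features extracted from the UAV sensors.
X, y = make_regression(n_samples=180, n_features=20, noise=10.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Base learners produce out-of-fold predictions; a small neural network
# acts as the meta-learner, loosely mirroring a "stacking DNN" design.
stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=200, random_state=0)),
        ("ridge", Ridge(alpha=1.0)),
    ],
    final_estimator=MLPRegressor(hidden_layer_sizes=(32, 16),
                                 max_iter=2000, random_state=0),
    cv=5,
)
stack.fit(X_tr, y_tr)
pred = stack.predict(X_te)

print(f"R2   = {r2_score(y_te, pred):.2f}")
print(f"MAE  = {mean_absolute_error(y_te, pred):.2f}")
print(f"RMSE = {np.sqrt(mean_squared_error(y_te, pred)):.2f}")
```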
To better understand what drove the model's predictions, the researchers used SHapley Additive exPlanations (SHAP), an AI interpretability tool. The SHAP analysis showed that LAI, canopy height, and TIR were the most influential features behind prediction accuracy.
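The study's exact SHAP workflow isn't detailed here, but the sketch below shows a common way to derive a global feature-importance ranking with the shap library: a tree ensemble is fitted on placeholder data and features are ranked by their mean absolute SHAP value.

```python
import numpy as np
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Synthetic features stand in for the fused MS, LiDAR, TIR, and LAI inputs.
X, y = make_regression(n_samples=180, n_features=10, noise=10.0, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Global importance: mean absolute SHAP value per feature, highest first.
importance = np.abs(shap_values).mean(axis=0)
for idx in np.argsort(importance)[::-1][:3]:
    print(f"feature_{idx}: mean |SHAP| = {importance[idx]:.3f}")
```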
Volume Modeling Works Early, Struggles Later
The CVMVI model also showed strong early-stage performance, particularly while corn plants were still short and spectral saturation had not yet occurred. In growth stages up to tasseling, the model achieved an R² of around 0.78.
However, its accuracy declined in the mid-to-late growth stages as the canopy became denser and the model struggled to account for increasingly complex structures and reflectance conditions.
Because CVMVI requires fewer input variables and lower computational overhead, the researchers suggest it could be a practical choice for early-stage monitoring. For mid-to-late growth stages, however, machine learning models trained on fused sensor data provided substantially improved performance and adaptability.
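The paper's precise CVMVI formulation isn't reproduced in this summary, but the idea of weighting canopy volume by a vegetation index can be sketched as follows: each pixel of a LiDAR-derived canopy height model contributes a column of volume, scaled by that pixel's NDVI from the multispectral imagery. The function name and regression step are illustrative, not the authors' implementation.

```python
import numpy as np

def cvmvi_feature(chm: np.ndarray, ndvi: np.ndarray, pixel_area_m2: float) -> float:
    """Schematic vegetation-index-weighted canopy volume for one plot.

    chm  : per-pixel canopy height (m) from the LiDAR-derived height model
    ndvi : per-pixel vegetation index from the multispectral sensor
    Each pixel contributes height * pixel area, weighted by its NDVI.
    """
    return float(np.sum(chm * pixel_area_m2 * ndvi))

# AGB would then be estimated by regressing field-measured AGB on this
# feature across calibration plots, e.g.:
#   slope, intercept = np.polyfit(cvmvi_values, agb_measured, 1)
```

The appeal of this formulation is its low overhead: it needs only a height model and one spectral band ratio per plot, which is why it remains attractive for quick early-season surveys.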
Data Fusion Improves Spatial Robustness
The study also evaluated how well the models performed across spatially varied field conditions. Using Moran’s I, a statistical measure of spatial autocorrelation, the researchers found that sensor fusion significantly reduced spatial dependency in prediction errors.
This means that the fused models were better at generalizing across different plots, irrigation regimes, and crop varieties without being overly influenced by localized variation. This finding is especially important for scaling up AGB prediction across larger or more heterogeneous agricultural landscapes, where consistency and accuracy are key to decision-making.
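The weighting scheme and neighborhood definition used in the paper aren't given here, so the sketch below shows one common way to compute Moran's I on prediction residuals with the PySAL libraries (libpysal and esda), using k-nearest-neighbor weights over hypothetical plot coordinates.

```python
import numpy as np
from esda.moran import Moran
from libpysal.weights import KNN

# Synthetic plot centroids and residuals stand in for the real field layout.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(180, 2))   # plot centroid coordinates (m)
residuals = rng.normal(0.0, 1.5, size=180)    # predicted minus observed AGB

w = KNN.from_array(coords, k=8)               # neighbors by spatial proximity
w.transform = "r"                             # row-standardize the weights

mi = Moran(residuals, w, permutations=999)
print(f"Moran's I = {mi.I:.3f}, pseudo p-value = {mi.p_sim:.3f}")
# An I near zero with a non-significant p-value indicates errors that are
# not spatially clustered, i.e. the model generalizes across field zones.
```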
Toward More Intelligent Agriculture
This study combined multi-sensor UAV data with ensemble machine learning to present a reliable, scalable method for improving corn AGB prediction throughout the growing season. It also offered practical guidance for tailoring modeling approaches to specific growth stages and field conditions.
As precision agriculture continues to evolve, these findings may help lay the groundwork for smarter crop monitoring, with more efficient input use and stronger data-driven decisions in farming.
Journal Reference
Xuan, F. et al. (2025). Performance of stacking machine learning and volume model for improving corn above ground biomass prediction. Plant Phenomics, 7(3), 100068. DOI: 10.1016/j.plaphe.2025.100068. https://www.sciencedirect.com/science/article/pii/S2643651525000743