Automated Georectification and Mosaicking of UAV-Based Hyperspectral Imagery from Push-Broom Sensors
KAUST Department: Environmental Science and Engineering Program
Water Desalination and Reuse Research Center (WDRC)
Biological and Environmental Sciences and Engineering (BESE) Division
Permanent link to this record: http://hdl.handle.net/10754/660874
Abstract: Hyperspectral systems integrated on unmanned aerial vehicles (UAVs) provide unique opportunities to conduct high-resolution multitemporal spectral analysis for diverse applications. However, time-consuming rectification efforts are routinely required in postprocessing, since geometric distortions can be introduced by UAV movements during flight, even when navigation/motion sensors are used to track the position of each scan. Part of the challenge in obtaining high-quality imagery relates to the lack of a fast processing workflow that can retrieve geometrically accurate mosaics while optimizing ground data collection efforts. To address this problem, we explored a computationally robust automated georectification and mosaicking methodology. It operates effectively in a parallel computing environment and was evaluated against a number of high-spatial-resolution datasets (mm to cm resolution) collected using a push-broom sensor and an associated RGB frame-based camera. The methodology estimates the luminance of the hyperspectral swaths and coregisters these against a luminance RGB-based orthophoto. The procedure includes an improved coregistration strategy that integrates the Speeded-Up Robust Features (SURF) algorithm with the Maximum Likelihood Estimator Sample Consensus (MLESAC) approach. SURF identifies common features between each swath and the RGB orthomosaic, while MLESAC fits the best geometric transformation model to the retrieved matches. Individual scanlines are then geometrically transformed and merged into a single spatially continuous mosaic that reaches high positional accuracy with only a few ground control points (GCPs). The capacity of the workflow to achieve high spatial accuracy was demonstrated by examining statistical metrics such as RMSE, MAE, and the relative positional accuracy at the 95% confidence level.
Comparison against a user-generated georectification demonstrates that the automated approach speeds up the coregistration process by 85%.
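The coregistration pipeline described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the Rec. 709 luminance weights, the simple RANSAC-style robust loop (a stand-in for MLESAC), and all function names are assumptions for illustration. In practice, SURF feature matching (e.g., via an OpenCV build with contrib modules) would supply the point correspondences between each swath and the orthomosaic.

```python
import numpy as np

def luminance(rgb):
    """Scalar luminance from an (H, W, 3) RGB array, using Rec. 709
    weights as an illustrative stand-in for the paper's luminance step."""
    w = np.array([0.2126, 0.7152, 0.0722])
    return rgb @ w

def fit_affine(src, dst):
    """Least-squares affine transform mapping src (N, 2) -> dst (N, 2).
    Returns a (3, 2) matrix M such that [x, y, 1] @ M = [x', y']."""
    A = np.hstack([src, np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return M

def ransac_affine(src, dst, n_iter=200, tol=1.0, seed=0):
    """Robustly fit an affine model to noisy feature matches.
    A basic RANSAC loop (simpler than MLESAC, which scores models by
    likelihood rather than inlier count): sample minimal 3-point
    subsets, keep the model with the most inliers, refit on inliers."""
    rng = np.random.default_rng(seed)
    ones = np.ones((len(src), 1))
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(len(src), size=3, replace=False)
        M = fit_affine(src[idx], dst[idx])
        resid = np.linalg.norm(np.hstack([src, ones]) @ M - dst, axis=1)
        inliers = resid < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_affine(src[best_inliers], dst[best_inliers]), best_inliers

def rmse(pred, obs):
    """Root-mean-square positional error over point pairs, as used to
    assess accuracy at check points / GCPs."""
    return float(np.sqrt(np.mean(np.sum((pred - obs) ** 2, axis=1))))
```

Each swath would be warped with its fitted transform and merged into the mosaic; horizontal accuracy at the 95% confidence level is then commonly reported by scaling the RMSE (e.g., the NSSDA statistic).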
Citation: Angel, Y., Turner, D., Parkes, S., Malbeteau, Y., Lucieer, A., & McCabe, M. F. (2019). Automated Georectification and Mosaicking of UAV-Based Hyperspectral Imagery from Push-Broom Sensors. Remote Sensing, 12(1), 34. doi:10.3390/rs12010034
Sponsors: The authors thank Matteo Ziliani and Bruno Aragon for their assistance in collecting the UAV data and ancillary measurements.
Except where otherwise noted, this item's license is described as: This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license.