A Robust Vision-based Runway Detection and Tracking Algorithm for Automatic UAV Landing

Abstract
This work presents a novel real-time algorithm for runway detection and tracking, applied to the automatic takeoff and landing of Unmanned Aerial Vehicles (UAVs). The algorithm combines segmentation-based region competition with the minimization of a specific energy function to detect and identify the runway edges from streaming video data. The resulting video-based runway position estimates are updated with a Kalman filter, which can integrate other sensory information, such as position and attitude-angle estimates, to allow more robust tracking of the runway under turbulence. We illustrate the performance of the proposed runway detection and tracking scheme on various experimental UAV flights conducted by the Saudi Aerospace Research Center. Results show accurate tracking of the runway edges during the landing phase under various lighting conditions, and suggest that such positional estimates would greatly improve the positional accuracy of the UAV during the takeoff and landing phases. The robustness of the proposed algorithm is further validated using Hardware-in-the-Loop simulations with diverse takeoff and landing videos generated by a commercial flight simulator.
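The Kalman-filter update of the vision-based runway position estimates described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis implementation: it assumes a constant-velocity model for the UAV's lateral offset from the runway centerline, and all noise parameters (`q`, `r`, `dt`) are illustrative placeholders.

```python
import numpy as np

def kalman_track(measurements, dt=0.1, q=0.01, r=0.5):
    """Filter noisy vision-based lateral-offset measurements.

    State is [offset, offset_rate]; a hypothetical sketch, not the
    thesis's filter, which also fuses attitude-angle estimates.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity transition
    H = np.array([[1.0, 0.0]])                  # only the offset is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])         # process-noise covariance
    R = np.array([[r]])                         # measurement-noise covariance
    x = np.array([[measurements[0]], [0.0]])    # initial state from first frame
    P = np.eye(2)                               # initial state covariance
    estimates = []
    for z in measurements:
        # Predict step: propagate state and covariance forward one frame.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: correct with the vision-based offset measurement.
        y = np.array([[z]]) - H @ x             # innovation
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0, 0])
    return np.array(estimates)
```

As a usage illustration, filtering a linearly decaying offset corrupted by oscillatory measurement noise (a stand-in for per-frame detection jitter) yields estimates closer to the true trajectory than the raw measurements.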

Citation
Abu Jbara, K. F. (2015). A Robust Vision-based Runway Detection and Tracking Algorithm for Automatic UAV Landing. KAUST Research Repository. https://doi.org/10.25781/KAUST-CR45R

DOI
10.25781/KAUST-CR45R
