
UAV navigation system using line-based sensor pose estimation

Pages 2-11 | Received 12 Jan 2017, Accepted 06 Nov 2017, Published online: 12 Jan 2018

Abstract

This work presents an image-based mapping and tracking system that enables a small Unmanned Aerial Vehicle (UAV) to navigate accurately in indoor and GPS-denied outdoor environments. A method is proposed to estimate the UAV’s pose (i.e., the 3D position and orientation of the camera sensor) in real time using only the on-board RGB camera as the UAV travels through a known 3D environment (i.e., a 3D CAD model). Linear features are extracted and automatically matched between the images collected by the UAV’s on-board RGB camera and the 3D object model. The matched lines from the 3D model serve as ground control for estimating the camera pose in real time via line-based space resection. The results demonstrate that the proposed model-based pose estimation algorithm achieves sub-meter positioning accuracy in both indoor and outdoor environments. It is also shown that the proposed method can provide sparse updates to correct drift in complementary simultaneous localization and mapping (SLAM)-derived pose estimates.
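The space-resection step described in the abstract can be sketched as a nonlinear least-squares problem: given 2D image lines matched to 3D model lines, find the camera pose that minimizes the distance from the projected 3D line endpoints to the observed image lines. The sketch below is an illustrative reconstruction, not the authors’ implementation; the camera intrinsics, the synthetic line data, and the choice of SciPy’s `least_squares` solver are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Hypothetical camera intrinsics; the paper's actual calibration is not given here.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(points_w, rvec, t):
    """Project 3D world points into the image for pose (rvec, t)."""
    R = Rotation.from_rotvec(rvec).as_matrix()
    p_c = points_w @ R.T + t              # world -> camera frame
    p_i = p_c @ K.T                       # camera -> pixel coordinates
    return p_i[:, :2] / p_i[:, 2:3]       # perspective division

def line_coeffs(p, q):
    """Normalized 2D line (a, b, c) through p and q, with a^2 + b^2 = 1."""
    d = q - p
    n = np.array([-d[1], d[0]])
    n /= np.linalg.norm(n)
    return n[0], n[1], -n @ p

def residuals(pose, endpoints_w, image_lines):
    """Signed distances of projected model-line endpoints to observed 2D lines."""
    rvec, t = pose[:3], pose[3:]
    res = []
    for (p0, p1), (a, b, c) in zip(endpoints_w, image_lines):
        uv = project(np.array([p0, p1]), rvec, t)
        res.extend(a * uv[:, 0] + b * uv[:, 1] + c)
    return np.array(res)

# Synthetic "3D model": five line segments (a square plus one out-of-plane edge).
endpoints_w = [
    (np.array([-1.0, -1.0, 0.0]), np.array([ 1.0, -1.0, 0.0])),
    (np.array([-1.0,  1.0, 0.0]), np.array([ 1.0,  1.0, 0.0])),
    (np.array([-1.0, -1.0, 0.0]), np.array([-1.0,  1.0, 0.0])),
    (np.array([ 1.0, -1.0, 0.0]), np.array([ 1.0,  1.0, 0.0])),
    (np.array([ 0.5,  0.0, 0.0]), np.array([ 0.5,  0.0, 1.0])),
]

# Simulate observations from a ground-truth pose, then recover that pose.
true_pose = np.array([0.05, -0.03, 0.02, 0.1, -0.2, 5.0])
image_lines = [line_coeffs(*project(np.array(seg), true_pose[:3], true_pose[3:]))
               for seg in endpoints_w]

init = np.zeros(6)
init[5] = 4.5  # rough initial guess, e.g. from a SLAM odometry prior
sol = least_squares(residuals, init, args=(endpoints_w, image_lines))
```

Because the residual is a point-to-line distance rather than a point-to-point distance, the match only needs to identify which image line corresponds to which model line, not where along the line the endpoints fall, which is what makes line features attractive as ground control.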

Acknowledgments

We thank the Planning & Renovations, Campus Services & Business Operations at York University for providing the 3D model of the Bergeron Centre of Excellence in Engineering. Special thanks are given to the members of York University’s GeoICT Lab, Damir Gumerov, Yoonseok Jwa, Brian Kim, and Yu Gao, who contributed to the generation of the York University 3D Virtual Campus Model. We also thank Professor James Elder’s Human & Computer Vision Lab in York University’s Centre for Vision Research for providing the UAV video data.