Vision-based SLAM system for unmanned aerial vehicles
Document type: Article
Access conditions: Open access
European Commission project: AErial RObotic system integrating multiple ARMS and advanced manipulation capabilities for inspection and maintenance (EC-H2020-644271)
This paper describes a vision-based simultaneous localization and mapping (SLAM) system for Unmanned Aerial Vehicles (UAVs). The main contribution of this work is a novel estimator based on an Extended Kalman Filter. The estimator is designed to fuse the measurements obtained from: (i) an orientation sensor (AHRS); (ii) a position sensor (GPS); and (iii) a monocular camera. The estimated state consists of the full state of the vehicle — position and orientation and their first derivatives — as well as the locations of the landmarks observed by the camera. The position sensor is used only during the initialization period, in order to recover the metric scale of the world. Afterwards, the estimated map of landmarks is used to perform fully vision-based navigation when the position sensor is unavailable. Experimental results obtained with simulations and real data show the benefits of including camera measurements in the system: the estimate of the vehicle's trajectory is considerably improved compared with estimates obtained using only the position sensor, whose measurements are typically low-rate and noisy.
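To illustrate the predict/update cycle at the core of an Extended Kalman Filter estimator like the one described above, here is a minimal, generic sketch in Python with NumPy. This is not the paper's actual state vector or sensor models — the constant-velocity motion model, position-only measurement, and all matrix values below are illustrative assumptions.

```python
import numpy as np

def ekf_predict(x, P, F, Q):
    """Prediction step: propagate state and covariance with the (linearized) model F."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def ekf_update(x, P, z, H, R):
    """Update step: fuse a measurement z with linearized measurement model H."""
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy 1-D constant-velocity example: state = [position, velocity]
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])  # motion model
Q = 1e-4 * np.eye(2)                   # process noise
H = np.array([[1.0, 0.0]])             # position-only sensor (GPS-like)
R = np.array([[0.25]])                 # measurement noise

x = np.zeros(2)
P = np.eye(2)
x, P = ekf_predict(x, P, F, Q)
x, P = ekf_update(x, P, np.array([0.5]), H, R)
```

In a visual SLAM estimator the state is additionally augmented with landmark positions, and the measurement models for the camera and AHRS are nonlinear, so F and H become Jacobians evaluated at the current estimate.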
Citation: Munguia, R.F., Urzua, S., Bolea, Y., Grau, A. Vision-based SLAM system for unmanned aerial vehicles. "Sensors", March 2016, vol. 16, no. 3.
Publisher's version: http://www.mdpi.com/1424-8220/16/3/372