This paper presents a cooperative unmanned aerial vehicle navigation algorithm that allows a chief vehicle (equipped with inertial and magnetic sensors, a Global Positioning System receiver, and a vision system) to improve its navigation performance, in real time or in a postprocessing phase, by exploiting line-of-sight measurements from formation-flying deputies equipped with Global Positioning System receivers. The key concept is to integrate differential Global Positioning System and visual tracking information within a sensor fusion algorithm based on the extended Kalman filter. The developed concept and processing architecture are described, with a focus on the filtering algorithm. The flight-testing strategy and experimental results are then presented. In particular, the cooperative navigation output is compared with the estimates provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, which derives mainly from the possibility of exploiting accurate magnetic- and inertial-independent information.
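To make the fusion concept concrete, the sketch below illustrates the kind of extended-Kalman-filter measurement update the abstract describes: correcting the chief's position estimate with a line-of-sight unit-vector measurement toward a deputy whose position is known from differential GPS. This is not the paper's implementation; it assumes a simplified position-only state, and the function name, state model, and noise values are illustrative.

```python
import numpy as np

def los_update(x, P, p_deputy, u_meas, R):
    """One hypothetical EKF measurement update of the chief position using a
    line-of-sight (LOS) unit-vector measurement toward a DGPS-located deputy.

    x        : (3,) chief position estimate [m]
    P        : (3, 3) estimate covariance
    p_deputy : (3,) deputy position from differential GPS [m]
    u_meas   : (3,) measured LOS unit vector (e.g., from visual tracking)
    R        : (3, 3) measurement noise covariance
    """
    d = p_deputy - x
    r = np.linalg.norm(d)
    u_pred = d / r                          # predicted LOS unit vector
    # Jacobian of u_pred w.r.t. chief position: -(I - u u^T) / r
    H = -(np.eye(3) - np.outer(u_pred, u_pred)) / r
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x + K @ (u_meas - u_pred)       # corrected position
    P_new = (np.eye(3) - K @ H) @ P         # updated covariance
    return x_new, P_new

# Illustrative use: fuse one simulated LOS observation.
x = np.array([0.0, 0.0, 10.0])              # initial chief position guess
P = np.eye(3) * 4.0                         # initial uncertainty
p_dep = np.array([50.0, 20.0, 12.0])        # deputy position from DGPS
u_obs = p_dep - np.array([1.0, -0.5, 10.2]) # true chief position (simulated)
u_obs /= np.linalg.norm(u_obs)              # normalized LOS measurement
x, P = los_update(x, P, p_dep, u_obs, R=np.eye(3) * 1e-4)
```

Because the LOS direction is independent of the chief's magnetometers and inertial sensors, repeated updates of this form can bound drift that those sensors alone cannot correct, which is the benefit the abstract highlights.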
Satellite and Vision-Aided Sensor Fusion for Cooperative Navigation of Unmanned Aircraft Swarms
Journal of Aerospace Information Systems; Vol. 14, No. 6; pp. 327-344
2017-06-01
Conference paper, Article (Journal)
Electronic Resource
English
Similar items:
Vision-aided Cooperative Navigation for UAV Swarms. AIAA, 2016.
Vision-aided Cooperative Navigation for UAV Swarms (AIAA 2016-1491). British Library Conference Proceedings, 2016.
RELATIVE NAVIGATION OF SATELLITE SWARMS. TIBKAT, 2020.
METHOD FOR VISION-AIDED NAVIGATION FOR UNMANNED VEHICLES. European Patent Office, 2017.