An integrated sensing approach that fuses vision and range information to land an autonomous class 1 unmanned aerial system (UAS) controlled by e-modification model reference adaptive control is presented. The navigation system uses a feature detection algorithm to locate features on a coarsely instrumented landing platform and to compute the corresponding range vectors. The relative translation and rotation state is estimated and sent to the flight computer for control feedback. A robust adaptive control law that guarantees uniform ultimate boundedness of the adaptive gains in the presence of bounded external disturbances is used to control the flight vehicle. Experimental flight tests are conducted to validate the integration of these systems and to assess the quality of the navigation solution. Robustness of the control law amid flight disturbances and hardware failures is demonstrated. The results demonstrate the utility of low-cost, low-weight sensing for navigating small, autonomous UAS that carry out littoral proximity operations about unprepared ship decks.
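The paper's estimator and control law are not reproduced in this record; the sketch below only illustrates the two steps the abstract describes, assuming the relative pose is recovered from matched platform features by a standard Kabsch/Horn point registration and that the adaptive law follows the textbook e-modification update. All names (estimate_relative_pose, e_mod_update, Gamma, sigma, etc.) are hypothetical and are not taken from the paper.

import numpy as np

def estimate_relative_pose(platform_pts, measured_pts):
    """Rigid transform (R, t) aligning known platform feature positions to
    camera/range measurements of the same features (Kabsch/Horn, via SVD).
    platform_pts, measured_pts: (N, 3) arrays of matched 3-D points."""
    p_c = platform_pts.mean(axis=0)
    q_c = measured_pts.mean(axis=0)
    H = (platform_pts - p_c).T @ (measured_pts - q_c)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = q_c - R @ p_c
    return R, t  # measured ~= R @ platform + t

def e_mod_update(theta_hat, phi, e, P, B, Gamma, sigma, dt):
    """One Euler step of an e-modification MRAC parameter update:
    theta_dot = -Gamma * (phi (e' P B) + sigma * ||e' P B|| * theta_hat).
    The sigma-weighted damping term is what yields uniform ultimate
    boundedness of the adaptive gains under bounded disturbances.
    theta_hat: (N, m) gains, phi: (N,) regressor, e: (n,) tracking error,
    P: (n, n) Lyapunov matrix, B: (n, m) input matrix, Gamma: (N, N) rates."""
    ePB = e @ P @ B                                      # (m,)
    theta_dot = -Gamma @ (np.outer(phi, ePB)
                          + sigma * np.linalg.norm(ePB) * theta_hat)
    return theta_hat + dt * theta_dot

In this sketch the fused camera and range measurements are assumed to already be expressed as matched 3-D feature positions; in practice the feature detector would supply the image locations and the range sensor the corresponding depths.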
A CAMERA AND RANGE SENSOR FUSION APPROACH FOR AUTONOMOUS NAVIGATION SYSTEMS DRIVEN BY ROBUST ADAPTIVE CONTROL
Proceedings of the 44th Annual American Astronautical Society Guidance, Navigation, and Control Conference, 2022; Chapter 62; pp. 1123-1144
2024-01-01
22 pages
Article/Chapter (Book)
Electronic Resource
English