During the past decade, aerial robots have seen an unprecedented expansion of their utility as they take on tasks that had typically been reserved for humans. With an ever-widening domain of aerial robotic applications, including mission-critical tasks such as disaster response, search and rescue, and infrastructure inspection taking place in GPS-denied environments, reliable autonomous operation of aerial robots has become crucial. To accomplish their tasks, aerial robots operating in GPS-denied areas rely on a multitude of sensors to localize and navigate. Visible-spectrum camera systems are the most commonly used sensing modality because their low cost and weight make them suitable for small aerial robots in indoor or broadly GPS-denied settings. However, in visually degraded environments, such as conditions of poor illumination, low texture, or the presence of obscurants including fog, smoke, and dust, the reliability of visible-light cameras deteriorates significantly. Maintaining reliable robot navigation in such conditions is nevertheless essential if the robot is to perform many of the critical applications listed above. In contrast to visible-light cameras, thermal cameras offer visibility in the infrared spectrum and can be used in a complementary manner with visible-spectrum cameras for robot localization and navigation, without the significant weight and power penalty typically associated with carrying other sensors such as 3D LiDAR or radar. Exploiting this fact, in this work we present a multi-sensor fusion algorithm for reliable odometry estimation in GPS-denied and degraded visual environments. The proposed method utilizes information from both the visible and thermal spectra for landmark selection and prioritizes feature extraction from informative image regions based on a metric over spatial entropy. Furthermore, inertial sensing cues are integrated to improve the robustness of the odometry estimation process. The proposed method runs in real time, fully on board an aerial robot. To verify our solution, a set of challenging experiments was conducted in a) an obscurant-filled, machine-shop-like industrial environment and b) a dark subterranean mine in the presence of heavy airborne dust.
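To make the entropy-based region prioritization concrete, the following is a minimal Python sketch, not the authors' implementation: it assumes a fixed grid partition of the image, Shannon entropy computed over per-patch intensity histograms, and an ORB detector, none of which are specified in the abstract. The function names patch_entropy and prioritized_keypoints, and all parameter values, are hypothetical choices for illustration.

# Minimal sketch (assumptions noted above): rank image cells by the
# Shannon entropy of their intensity histograms, then detect features
# only in the most informative cells.
import cv2
import numpy as np

def patch_entropy(patch, bins=32):
    # Shannon entropy of the grayscale histogram of one patch.
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def prioritized_keypoints(gray, grid=(4, 4), top_k_cells=8, per_cell=50):
    # Split the image into a grid, rank cells by entropy (highest first),
    # and run the detector inside a mask covering each selected cell.
    h, w = gray.shape
    gh, gw = h // grid[0], w // grid[1]
    cells = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            roi = gray[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw]
            cells.append((patch_entropy(roi), r, c))
    cells.sort(reverse=True)

    orb = cv2.ORB_create(nfeatures=per_cell)
    keypoints = []
    for _, r, c in cells[:top_k_cells]:
        mask = np.zeros_like(gray)
        mask[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw] = 255
        keypoints.extend(orb.detect(gray, mask))
    return keypoints

# Usage: the same selector can serve both spectra, e.g. on a
# visible-spectrum frame or a thermal image rescaled to 8 bits.
# gray = cv2.cvtColor(cv2.imread("frame.png"), cv2.COLOR_BGR2GRAY)
# kps = prioritized_keypoints(gray)

Masking the detector per cell, rather than detecting globally and filtering afterward, caps the feature budget spent on low-information regions, which is one plausible reading of "prioritizing feature extraction from informative image regions".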
Visual-Thermal Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments
01.03.2019
Conference paper
Electronic resource
English
Autonomous aerial navigation using monocular visual‐inertial fusion
British Library Online Contents | 2018