An autonomous vehicle must accurately observe its location within the environment to interact with objects and accomplish its mission. When the environment is unknown, the vehicle must construct a map of its surroundings while simultaneously using that map to maintain an accurate estimate of its own location. Such a vehicle faces the circularly defined Simultaneous Localization and Mapping (SLAM) problem. Though difficult, SLAM is a critical component of autonomous vehicle exploration, with applications to search and rescue. To the best of current knowledge, this research presents the first SLAM solution to integrate stereo cameras, inertial measurements, and vehicle odometry into a Multiple Integrated Navigation Sensor (MINS) path. The implementation combines the MINS path with LIDAR to observe and map the environment using the FastSLAM algorithm. In real-world tests, a mobile ground vehicle equipped with these sensors completed a 140-meter loop through indoor hallways. This SLAM solution produces a path that closes the loop and remains within 1 meter of ground truth, reducing error by 92% relative to an image-inertial navigation system and by 79% relative to odometry-based FastSLAM.
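To make the approach described above concrete, the following is a minimal, illustrative sketch of one occupancy-grid FastSLAM update step: each particle carries its own pose and log-odds grid map, is propagated with a fused odometry increment (standing in for the MINS path), weighted against the LIDAR scan, and resampled. This is not the thesis implementation; the grid size, cell resolution, particle count, noise values, and the simple endpoint-correlation weighting are assumptions chosen for brevity.

```python
# Illustrative occupancy-grid FastSLAM step (sketch only, not the thesis code).
# Assumes a 2-D pose (x, y, heading), a fused odometry increment standing in for
# the MINS path, and a planar LIDAR scan given as (bearing, range) pairs.
import numpy as np

GRID = 200     # hypothetical 20 m x 20 m map at 0.1 m per cell
RES = 0.1      # metres per cell
N_PART = 50    # particle count (assumed)

def world_to_cell(xy):
    """Convert world coordinates (m) to grid indices, origin at map centre."""
    return np.clip((xy / RES + GRID / 2).astype(int), 0, GRID - 1)

class Particle:
    def __init__(self):
        self.pose = np.zeros(3)                  # x, y, heading
        self.logodds = np.zeros((GRID, GRID))    # per-particle occupancy map
        self.weight = 1.0 / N_PART

def predict(p, d_pose, noise=(0.02, 0.02, 0.01)):
    """Propagate the particle pose with the fused (MINS-style) odometry increment."""
    c, s = np.cos(p.pose[2]), np.sin(p.pose[2])
    dx, dy, dth = d_pose + np.random.randn(3) * noise
    p.pose += np.array([c * dx - s * dy, s * dx + c * dy, dth])

def scan_endpoints(pose, scan):
    """LIDAR beam endpoints in world coordinates for (bearing, range) rows."""
    ang = pose[2] + scan[:, 0]
    return pose[:2] + scan[:, 1:2] * np.c_[np.cos(ang), np.sin(ang)]

def weight(p, scan):
    """Score the scan against the particle's map (simple endpoint correlation)."""
    ij = world_to_cell(scan_endpoints(p.pose, scan))
    p.weight *= np.exp(0.05 * p.logodds[ij[:, 0], ij[:, 1]].sum())

def update_map(p, scan, hit=0.9):
    """Raise log-odds at beam endpoints; a full version would also clear free cells."""
    ij = world_to_cell(scan_endpoints(p.pose, scan))
    p.logodds[ij[:, 0], ij[:, 1]] += hit

def resample(particles):
    """Multinomial resampling on normalized weights."""
    w = np.array([p.weight for p in particles])
    w /= w.sum()
    out = []
    for i in np.random.choice(len(particles), size=len(particles), p=w):
        q = Particle()
        q.pose = particles[i].pose.copy()
        q.logodds = particles[i].logodds.copy()
        out.append(q)
    return out

# One filter step: predict from fused odometry, weight by LIDAR, map, resample.
particles = [Particle() for _ in range(N_PART)]
d_pose = np.array([0.10, 0.0, 0.01])                    # fused odometry increment
scan = np.c_[np.linspace(-np.pi / 2, np.pi / 2, 90),    # 90 beams at ~3 m range
             np.full(90, 3.0)]
for p in particles:
    predict(p, d_pose)
    weight(p, scan)
    update_map(p, scan)
particles = resample(particles)
```

In this sketch, maintaining a separate grid per particle is what lets the filter factor the SLAM posterior: each particle conditions its map on its own trajectory hypothesis, and resampling concentrates particles whose maps best explain the latest LIDAR scan.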
Multiple Integrated Navigation Sensors for Improved Occupancy Grid FastSLAM
2011
118 pages
Report
English
Infrared & Ultraviolet Detection, Navigation, Guidance, & Control, Optical detectors, Algorithms, Position (Location), Theses, Navigation, Ground vehicles, Test and evaluation, Grids, Bayesian filtering, Mobile robots, Machine vision, Vehicle navigation, SLAM (Simultaneous Localization and Mapping), MINS (Multiple Integrated Navigation Sensor)