Estimating the traversability of terrain in an unstructured outdoor environment is a core capability for autonomous robot navigation. While general-purpose sensing can identify terrain features such as vegetation and sloping ground, the traversability of these regions is a complex function of both the terrain characteristics and the vehicle's capabilities, which makes it extremely difficult to characterize a priori. Moreover, it is difficult to find general rules that work across a wide variety of terrain types such as trees, rocks, tall grass, logs, and bushes. As a result, methods that estimate traversability from predefined terrain properties such as height or shape are unlikely to work reliably in unknown outdoor environments. Our approach is based on the observation that traversability, in the most general sense, is an affordance jointly determined by the vehicle and its environment. We describe a novel on-line learning method that makes accurate predictions of the traversability properties of complex terrain. The method collects training data autonomously, exploiting the robot's experience in navigating its environment to train classifiers without human intervention; this is in contrast to other learning methods in which training data are collected manually. We have implemented and tested our traversability learning method on an unmanned ground vehicle (UGV) and evaluated its performance in several realistic outdoor environments. The experiments quantify the benefit of our on-line traversability learning approach.
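The abstract describes the mechanism only at a high level, so the following is a minimal illustrative sketch of self-supervised on-line traversability learning under stated assumptions: the geometric patch features, the logistic-regression classifier, and the labeling rule (patches traversed without incident become positives; patches where the robot stalls or slips become negatives) are assumptions chosen for illustration, not the paper's actual pipeline.

```python
import numpy as np

class OnlineTraversabilityClassifier:
    """Logistic regression updated online via SGD.

    A stand-in for the paper's classifier, whose exact form the
    abstract does not specify.
    """
    def __init__(self, n_features, lr=0.05):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict_proba(self, x):
        """Probability that the terrain patch described by x is traversable."""
        return 1.0 / (1.0 + np.exp(-(self.w @ x + self.b)))

    def update(self, x, traversable):
        """One SGD step on a single self-labeled example."""
        err = self.predict_proba(x) - float(traversable)
        self.w -= self.lr * err * x
        self.b -= self.lr * err

def patch_features(heights):
    """Toy geometric features of a terrain patch (assumed, not the paper's)."""
    h = np.asarray(heights, dtype=float)
    return np.array([h.mean(), h.std(), h.max() - h.min()])

# Self-supervised labeling loop: terrain the robot drives over without
# incident yields positive examples; terrain where it stalls, slips, or
# makes contact yields negatives -- no human labeling is involved.
clf = OnlineTraversabilityClassifier(n_features=3)
experience = [
    (patch_features([0.01, 0.02, 0.00, 0.01]), True),   # smooth ground, crossed
    (patch_features([0.40, 0.55, 0.10, 0.60]), False),  # rocky pile, robot stalled
    (patch_features([0.05, 0.03, 0.04, 0.02]), True),   # short grass, crossed
]
for _ in range(200):                 # replay experience to converge the toy model
    for x, label in experience:
        clf.update(x, label)

print(clf.predict_proba(patch_features([0.02, 0.01, 0.03, 0.02])))  # near 1
print(clf.predict_proba(patch_features([0.50, 0.45, 0.20, 0.65])))  # near 0
```

The design point the abstract emphasizes is the label source, not the classifier: because labels come from the vehicle's own navigation outcomes, they automatically reflect the joint vehicle-terrain affordance rather than predefined terrain properties.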
Traversability classification using unsupervised on-line visual learning for outdoor robot navigation
2006
8 pages, 18 references
Conference paper
English
Autonomous Robot Navigation Using Traversability Indices
British Library Conference Proceedings | 2004
Towards learned traversability for robot navigation: From underfoot to the far field
British Library Online Contents | 2006
Improving Traversability Estimation Through Autonomous Robot Experimentation
BASE | 2019