During a pre-programmed course to a particular destination, an autonomous vehicle may encounter environments that were unknown at the time of mission planning. Some regions may contain objects or vehicles that were not anticipated during the mission-planning phase, and user intervention is often not possible or desirable under these circumstances. The onboard navigation system is therefore required to make short-term adjustments to the flight plan automatically and to apply the necessary course corrections. A suitable path through the environment is navigated visually so that obstacles are reliably avoided without significant deviation from the original course. This paper describes a general low-cost stereo-vision sensor framework for passively estimating the range map between a forward-looking autonomous vehicle and its environment. Typical vehicles may be either unmanned ground or airborne vehicles. The range-map image describes the relative distance from the vehicle to the observed environment and contains information that could be used to compute a navigable flight plan, as well as visual and geometric detail about the environment for other onboard processes or future missions. Aspects relating to information flow through the framework are discussed, along with issues such as robustness, implementation, and other advantages and disadvantages of the framework. An outline of the physical structure of the system is presented and an overview of the algorithms and applications of the framework is given.
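As an illustration of the kind of passive range-map estimation the abstract refers to, the following sketch computes a per-pixel depth image from a rectified stereo pair using standard OpenCV block matching. It is not the authors' framework; the image files, focal length, and baseline are hypothetical placeholders, and the block-matching parameters are ordinary defaults.

```python
# Illustrative sketch only: range-map estimation from a rectified stereo pair
# via OpenCV block matching. Not the paper's implementation.
import numpy as np
import cv2

def range_map(left_gray, right_gray, focal_px, baseline_m,
              num_disparities=64, block_size=15):
    """Estimate a per-pixel range map (in metres) from a rectified stereo pair."""
    stereo = cv2.StereoBM_create(numDisparities=num_disparities,
                                 blockSize=block_size)
    # StereoBM returns disparity in fixed point, scaled by 16.
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # mask invalid / unmatched pixels
    return focal_px * baseline_m / disparity    # depth = f * B / d

if __name__ == "__main__":
    # Hypothetical grayscale images from a forward-looking stereo rig.
    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
    depth = range_map(left, right, focal_px=700.0, baseline_m=0.12)
    print("median range (m):", np.nanmedian(depth))
```

Nearby obstacles appear as small range values in the resulting image, which is the information a path-planning stage would use to steer the vehicle around them.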
Stereo-vision framework for autonomous vehicle guidance and collision avoidance
Location Services and Navigation Technologies ; 2003 ; Orlando, Florida, United States
Proc. SPIE ; 5084
2003-08-06
Conference paper
Electronic Resource
English
Stereo-vision framework for autonomous vehicle guidance and collision avoidance [5084-15]
British Library Conference Proceedings | 2003
Computer Vision based Animal Collision Avoidance Framework for Autonomous Vehicles
ArXiv | 2020