To properly align objects in the real and virtual worlds in an augmented reality (AR) space, it is essential to continuously track the camera's exact 3D position and orientation, a task well known as the registration problem. Traditional vision-based or inertial-sensor-based solutions are mostly designed for well-structured environments, which are not available in uncontrolled outdoor road navigation applications. This paper proposes a hybrid camera pose tracking system that combines vision, GPS, and 3D inertial gyroscope technologies. The fusion approach is based on our PMM (parameterized model matching) algorithm, in which a road shape model is derived from the digital map at the absolute road position given by GPS and matched against road features extracted from the real image. Inertial data provide an initial estimate of the likely motion and also serve as a relative tolerance that stabilizes the output. The algorithms proposed in this paper are validated with experimental results from real road tests under different road types and conditions.
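As an illustration of the fusion idea summarized above, the sketch below shows one PMM-style update step: inertial data (gyro yaw rate and vehicle speed) give an initial motion prediction, the map-derived road shape model at the GPS road position is matched against road features extracted from the image, and the pose search is confined to a tolerance band around the inertial prediction. All function and parameter names, the ground-plane simplification, and the yaw-only search are assumptions for illustration only, not the authors' implementation.

```python
# Hypothetical sketch of a PMM-style fusion step; names and the simplified
# ground-plane/yaw-only model are illustrative assumptions, not the paper's code.
from dataclasses import dataclass
import numpy as np


@dataclass
class Pose:
    position: np.ndarray  # camera position (x, y, z) in the world frame
    yaw: float            # heading in radians (orientation reduced to yaw here)


def predict_pose(prev: Pose, gyro_yaw_rate: float, speed: float, dt: float) -> Pose:
    """Inertial prediction: estimate the initial possible motion from gyro and speed."""
    yaw = prev.yaw + gyro_yaw_rate * dt
    step = speed * dt * np.array([np.cos(yaw), np.sin(yaw), 0.0])
    return Pose(prev.position + step, yaw)


def pmm_refine(pred: Pose, road_model: np.ndarray, road_features: np.ndarray,
               yaw_tolerance: float) -> Pose:
    """Match the map-derived road shape model (world x,y points at the GPS road
    position) against road features from the image (assumed back-projected onto
    the ground plane), searching only within the inertial tolerance band around
    the predicted heading."""
    candidates = np.linspace(pred.yaw - yaw_tolerance, pred.yaw + yaw_tolerance, 21)
    best_yaw, best_cost = pred.yaw, np.inf
    for yaw in candidates:
        rot = np.array([[np.cos(yaw), -np.sin(yaw)],
                        [np.sin(yaw),  np.cos(yaw)]])
        # Transform world-frame model points into the candidate camera frame.
        local = (road_model - pred.position[:2]) @ rot
        # Cost: mean distance from each model point to its nearest image feature.
        dists = np.linalg.norm(local[:, None, :] - road_features[None, :, :], axis=2)
        cost = dists.min(axis=1).mean()
        if cost < best_cost:
            best_yaw, best_cost = yaw, cost
    return Pose(pred.position, best_yaw)


if __name__ == "__main__":
    # Toy example: a straight road segment and slightly rotated "image" features.
    prev = Pose(position=np.zeros(3), yaw=0.0)
    pred = predict_pose(prev, gyro_yaw_rate=0.02, speed=10.0, dt=0.1)
    road_model = np.column_stack([np.linspace(5, 30, 20), np.full(20, 1.5)])
    road_features = road_model @ np.array([[np.cos(0.03), np.sin(0.03)],
                                           [-np.sin(0.03), np.cos(0.03)]])
    refined = pmm_refine(pred, road_model, road_features, yaw_tolerance=0.05)
    print("predicted yaw:", pred.yaw, "refined yaw:", refined.yaw)
```

The yaw-only search stands in for the full pose parameterization; in the paper's setting the same matching-within-inertial-tolerance idea would apply to the complete camera pose.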
Real-time data fusion on tracking camera pose for direct visual guidance
2004-01-01
866494 bytes
Conference paper
Electronic Resource
English
TPP1.19 Real-Time Data Fusion on Tracking Camera Pose for Direct Visual Guidance | British Library Conference Proceedings | 2004
A performance study for camera pose estimation using visual marker based tracking | British Library Online Contents | 2010
Real time tracking of borescope tip pose | IEEE | 1996