The problem of estimating and predicting the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and vision. The sensor fusion approach described in this contribution is based on non-linear filtering of these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a multi-rate extended Kalman filter is described, using a dynamic model with 22 states, where 12.5 Hz correspondences from vision and 100 Hz inertial measurements are processed. An example where an industrial robot is used to move the sensor unit is presented. The advantage of this configuration is that it provides ground truth for the pose, allowing for objective performance evaluation. The results show that an absolute accuracy of 2 cm in position and 1° in orientation is obtained.
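To make the multi-rate filter structure concrete, a minimal sketch in Python follows. It is not the authors' implementation: the process model f, measurement model h, their Jacobians F_jac and H_jac, and the noise covariances Q and R are hypothetical placeholders standing in for the paper's 22-state dynamic model, and the fixed 8:1 sample ratio simply reflects the 100 Hz and 12.5 Hz rates quoted above.

    import numpy as np

    def ekf_predict(x, P, u, f, F_jac, Q, dt):
        # Time update at the inertial rate (100 Hz): propagate the state
        # with the dynamic model f driven by the IMU sample u, and the
        # covariance with the Jacobian of f.
        x = f(x, u, dt)
        F = F_jac(x, u, dt)
        P = F @ P @ F.T + Q
        return x, P

    def ekf_update(x, P, z, h, H_jac, R):
        # Measurement update at the vision rate (12.5 Hz): z stacks the
        # image correspondences, h predicts them from the current state.
        H = H_jac(x)
        y = z - h(x)                    # innovation
        S = H @ P @ H.T + R             # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
        x = x + K @ y
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    def fuse(x, P, imu_samples, vision_frames, f, F_jac, Q, h, H_jac, R, dt=0.01):
        # Multi-rate loop: eight 100 Hz predictions per 12.5 Hz correction.
        frames = iter(vision_frames)
        for k, u in enumerate(imu_samples):
            x, P = ekf_predict(x, P, u, f, F_jac, Q, dt)
            if (k + 1) % 8 == 0:
                x, P = ekf_update(x, P, next(frames), h, H_jac, R)
        return x, P

A real-time implementation would additionally timestamp both streams and compensate for the latency of the vision pipeline rather than rely on a fixed sample ratio.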



    Title: Robust Real-Time Tracking by Fusing Measurements from Inertial and Vision Sensors

    Contributors:

    Publication date: 2007-01-01

    Type of media: Paper

    Type of material: Electronic Resource

    Language: English

    Classification: DDC 629



    Similar titles:

    Fusing Vision and Inertial Sensors for Robust Runway Detection and Tracking

    Abu-Jbara, Khaled / Sundaramorthi, Ganesh / Claudel, Christian | AIAA | 2018




    Robust Real-time Vision-based Aircraft Tracking From Unmanned Aerial Vehicles

    Fu, Changhong / Carrio Fernández, Adrián / Olivares Méndez, Miguel Ángel et al. | BASE | 2014


    Fusing visual contour tracking with inertial sensing to recover robot egomotion

    Alenyà, Guillem / Martínez Marroquín, Elisa / Torras, Carme | BASE | 2003
