Feature point matching is a critical step in visual odometry (VO) computation and many other vision applications. Frame-to-frame ego-motion drift caused by feature mismatching is the main challenge for VO. This paper presents a VO algorithm that uses a newly developed feature descriptor, the synthetic basis (SYBA) descriptor, to obtain accurate feature matches and reduce drift. An initial estimate of the camera motion is calculated from the matched feature pairs. Feature points in the current frame are then transformed into the next frame using this initial motion estimate. The sample means of the differences between the matched points and the transformed points in the next frame are used to obtain the final estimate of camera motion, reducing the drift, or re-projection error. The algorithm uses a sliding-window approach to extend the feature transformation into subsequent frames and overcome the short-baseline limitation of VO. The accuracy of the proposed system is evaluated and compared with competing VO methods and with ground truth (GPS + inertial measurement unit data).
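
    The abstract only outlines the refinement step, and this listing contains no implementation. Below is a minimal NumPy sketch of the general idea in a simplified 2-D setting: an initial transform is fitted to matched feature points, the current-frame points are transformed into the next frame, and the sample mean of the remaining offsets is folded back into the estimate. The 2-D similarity model and the function names (estimate_similarity, refine_motion) are assumptions chosen for illustration, not the authors' SYBA-based implementation.

        import numpy as np

        def estimate_similarity(src, dst):
            """Least-squares 2-D similarity (scale, rotation, translation) mapping src -> dst.
            src, dst: (N, 2) arrays of matched feature point coordinates.
            Illustrative stand-in for the paper's initial camera-motion estimate."""
            src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
            src_c, dst_c = src - src_mean, dst - dst_mean
            # Closed-form (Umeyama-style) fit via SVD of the cross-covariance matrix.
            U, S, Vt = np.linalg.svd(dst_c.T @ src_c)
            d = np.sign(np.linalg.det(U @ Vt))          # guard against a reflection
            D = np.diag([1.0, d])
            R = U @ D @ Vt
            s = (S * np.array([1.0, d])).sum() / (src_c ** 2).sum()
            t = dst_mean - s * R @ src_mean
            return s, R, t

        def refine_motion(curr_pts, next_pts):
            """Two-stage estimate: initial fit, then a correction from the sample mean
            of the offsets between transformed points and their actual matches."""
            s, R, t = estimate_similarity(curr_pts, next_pts)    # initial motion estimate
            transformed = (s * (R @ curr_pts.T)).T + t           # current points mapped into next frame
            mean_offset = (next_pts - transformed).mean(axis=0)  # sample mean of the residuals
            return s, R, t + mean_offset                         # refined translation reduces re-projection error

    With perfectly matched points the mean offset is near zero; with systematic mismatch it shifts the estimate toward the observed matches, which is the role the sample means play in the description above.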
    Title: Visual Odometry Drift Reduction Using SYBA Descriptor and Feature Transformation

    Contributors: Desai, Alok (author) / Lee, Dah-Jye (author)

    Publication date: 2016-07-01

    Size: 3854704 bytes

    Type of media: Article (Journal)

    Type of material: Electronic Resource

    Language: English





    Similar titles:

    Aircraft-Based Visual-Inertial Odometry with Range Measurement for Drift Reduction
    Hewitt, Robert A. / Izraelevitz, Jacob / Ruffatto, Donald F. et al. | European Patent Office | 2022

    Feature Tracking for Visual Odometry
    Hunt, Shawn | European Patent Office | 2019

    Drift Reduction for Monocular Visual Odometry of Intelligent Vehicles Using Feedforward Neural Networks
    Wagih, Hassan / Osman, Mostafa / Awad, Mohammed I. et al. | IEEE | 2022