GPS by itself is not dependable in urban environments, due to signal reception issues such as multi-path effects or occlusion. Other sensor data is required to keep track of the vehicle in the absence of a reliable GPS signal. We propose a new method to use a single on-board consumer-grade camera for vehicle motion estimation. The method is based on the tracking of ground plane features, taking into account the uncertainty on their backprojection as well as the uncertainty on the vehicle motion. A Hough-like parameter space vote is employed to extract motion parameters from the uncertainty models. The method is easy to calibrate and designed to be robust to outliers and poor feature quality. Experimental results show good accuracy and high reliability, with the positional estimate remaining within 2 metres over a 400 metre travelled distance.
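The abstract outlines the core algorithmic idea: each tracked ground-plane feature correspondence votes, Hough-style, for the motion parameters it is consistent with, and the votes are weighted by the features' uncertainty. The sketch below is not the authors' implementation; it is a minimal illustration of that voting principle, assuming a simplified planar motion model (yaw angle theta and forward displacement d), isotropic Gaussian backprojection uncertainty per feature, and a hypothetical function name vote_motion. The paper additionally models uncertainty on the vehicle motion itself, which this sketch omits.

```python
import numpy as np

def rot(theta):
    """2-D rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def vote_motion(prev_pts, curr_pts, sigmas,
                theta_range=(-0.1, 0.1), d_range=(0.0, 2.0),
                n_theta=81, n_d=81):
    """Hough-like vote over planar motion parameters (yaw theta, forward step d).

    prev_pts, curr_pts : (N, 2) ground-plane feature positions in the vehicle
                         frame at times t and t+1 (x forward, y left), in metres.
    sigmas             : (N,) per-feature backprojection uncertainty (std. dev., metres).
    Returns the (theta, d) accumulator cell with the highest vote.
    """
    thetas = np.linspace(*theta_range, n_theta)
    ds = np.linspace(*d_range, n_d)
    acc = np.zeros((n_theta, n_d))

    for p, q, sigma in zip(prev_pts, curr_pts, sigmas):
        for i, th in enumerate(thetas):
            # A static point seen from the moved vehicle: q_pred = R(-theta) (p - [d, 0]).
            R = rot(-th)
            pred = (R @ (p[:, None] - np.stack([ds, np.zeros(n_d)]))).T  # (n_d, 2)
            err2 = np.sum((pred - q) ** 2, axis=1)
            # Each correspondence spreads a soft vote over all compatible cells,
            # weighted by its own positional uncertainty; gross outliers add
            # little mass near the true peak.
            acc[i] += np.exp(-0.5 * err2 / sigma ** 2)

    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[i], ds[j]

# Synthetic check: 30 ground points, true motion theta = 0.02 rad, d = 0.8 m,
# with a few deliberately corrupted matches standing in for outliers.
rng = np.random.default_rng(0)
true_theta, true_d = 0.02, 0.8
pts = rng.uniform([2.0, -4.0], [15.0, 4.0], size=(30, 2))
moved = (rot(-true_theta) @ (pts - [true_d, 0.0]).T).T + rng.normal(0, 0.05, (30, 2))
moved[:5] += rng.uniform(-3, 3, (5, 2))
theta_hat, d_hat = vote_motion(pts, moved, np.full(30, 0.1))
print(theta_hat, d_hat)  # should be close to (0.02, 0.8) despite the outliers
```

Because every correspondence contributes a soft, uncertainty-weighted vote rather than a hard constraint, the accumulator maximum is largely unaffected by a minority of bad matches, which is the same mechanism the abstract credits for robustness to outliers and poor feature quality.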


    Title:
    Robust monocular visual odometry by uncertainty voting

    Contributors:
    Van Hamme, D. (author) / Veelaert, P. (author) / Philips, W. (author)

    Publication date:
    2011-06-01

    Size:
    1743990 bytes

    Type of media:
    Conference paper

    Type of material:
    Electronic Resource

    Language:
    English



    Similar documents

    Robust Monocular Visual Odometry by Uncertainty Voting

    Van Hamme, D. / Veelaert, P. / Philips, W. et al. | British Library Conference Proceedings | 2011


    Robust stereo visual odometry from monocular techniques

    Persson, Mikael / Piccini, Tommaso / Felsberg, Michael et al. | IEEE | 2015


    Ground Vehicle Monocular Visual Odometry

    Sabry, Mohamed / Al-Kaff, Abdulla / Hussein, Ahmed et al. | IEEE | 2019


    Uncertainty-Aware Attention Guided Sensor Fusion For Monocular Visual Inertial Odometry

    Shinde, Kashmira | German Aerospace Center (DLR) | 2020


    VIDO: A Robust and Consistent Monocular Visual-Inertial-Depth Odometry

    Gao, Yuanxi / Yuan, Jing / Jiang, Jingqi et al. | IEEE | 2023