This paper proposes a robust and safe perception system for an odometry framework based on the fusion of LiDAR data with RGB images. The multi-modal sensor measurements are fused by combining their depth proposals and confidence measures in a Bayesian inference module. The resulting fused depth map enhances unsupervised odometry estimates. Experimental results show that the LiDAR-camera fused depth map is an accurate 3D representation of the environment's structure. The method can be used for online adaptation of learning-based odometry algorithms to increase their generalizability to different scenes. We perform experiments on benchmark odometry datasets and obtain promising results compared to previous approaches: relative to state-of-the-art methods, the average translation error shows a 44% reduction, and the average rotation error is comparable or better.
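
As a concrete illustration of the fusion step, the short Python sketch below shows one standard way such a Bayesian depth fusion can be realized: each sensor's depth proposal is treated as an independent Gaussian measurement whose confidence is its precision (inverse variance), so the fused depth is the precision-weighted average. This is a minimal sketch under that Gaussian assumption, not the paper's actual implementation; the function name and the toy inputs are hypothetical.

    import numpy as np

    def fuse_depth_bayesian(depth_lidar, var_lidar, depth_cam, var_cam):
        """Fuse two per-pixel depth proposals under a Gaussian model.

        Each sensor provides a depth map and a variance map (inverse
        confidence). Treating the proposals as independent Gaussian
        measurements of the true depth, the Bayesian posterior mean is
        the precision-weighted average. Pixels where a sensor has no
        reading can be given infinite variance (zero precision).
        """
        prec_lidar = 1.0 / var_lidar          # precision = confidence
        prec_cam = 1.0 / var_cam
        prec_fused = prec_lidar + prec_cam
        depth_fused = (prec_lidar * depth_lidar
                       + prec_cam * depth_cam) / prec_fused
        var_fused = 1.0 / prec_fused          # fused uncertainty shrinks
        return depth_fused, var_fused

    # Toy example: sparse-but-precise LiDAR vs. dense-but-noisy camera depth.
    h, w = 4, 4
    depth_cam = np.full((h, w), 10.0)
    var_cam = np.full((h, w), 1.0)
    depth_lidar = np.full((h, w), 9.5)
    var_lidar = np.full((h, w), np.inf)       # no LiDAR return by default
    var_lidar[::2, ::2] = 0.01                # precise returns on a sparse grid

    fused, var = fuse_depth_bayesian(depth_lidar, var_lidar, depth_cam, var_cam)
    print(fused)  # ~9.5 where LiDAR hits, 10.0 elsewhere

With this weighting, sparse but precise LiDAR returns dominate wherever they exist, while the dense camera proposal fills the gaps, matching the stated goal of a dense, accurate fused depth map.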



    Title: LiDAR-Camera Fusion for Depth Enhanced Unsupervised Odometry

    Contributors:

    Publication date: 2022-06-01

    Size: 2834306 bytes

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English



Similar items:

    IGICP: Intensity and Geometry Enhanced LiDAR Odometry
    He, Li / Li, Wen / Guan, Yisheng et al. | IEEE | 2024

    HPPLO-Net: Unsupervised LiDAR Odometry Using a Hierarchical Point-to-Plane Solver
    Zhou, Beibei / Tu, Yiming / Jin, Zhong et al. | IEEE | 2024

    DeLiO: Decoupled LiDAR Odometry
    Thomas, Queens Maria / Wasenmuller, Oliver / Stricker, Didier | IEEE | 2019

    LiDAR-Stereo Camera Fusion for Accurate Depth Estimation
    Cholakkal, Hafeez Husain / Mentasti, Simone / Bersani, Mattia et al. | IEEE | 2020

    Self-Supervised Depth Completion From Direct Visual-LiDAR Odometry in Autonomous Driving
    Song, Zhenbo / Lu, Jianfeng / Yao, Yazhou et al. | IEEE | 2022