Autonomous navigation in unstructured environments such as forest or country roads with dynamic objects remains a challenging task, particularly with respect to perceiving the environment with multiple different sensors. The problem has been addressed both by the computer vision community and by researchers working with laser range-finding technology such as the Velodyne HDL-64. Since cameras and LIDAR sensors complement one another in terms of color and depth perception, fusing both sensors is a natural way to obtain color images with depth and reflectance information as well as 3D LIDAR point clouds with color information. In this paper we propose a method for sensor synchronization designed especially for dynamic scenes, a low-level fusion of the data of both sensors, and a solution to the occlusion problem that arises from the different viewpoints of the fused sensors.
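The abstract only names the low-level fusion step; as a rough illustration of what such camera/LIDAR fusion typically involves, the following sketch (not taken from the paper; the pinhole model, the function name, and the calibration inputs K, R, t are all assumptions) projects 3D LIDAR points into a camera image to attach color to points and depth to pixels:

    # Minimal sketch of camera/LIDAR low-level fusion via pinhole projection.
    # Assumes calibrated intrinsics K and LIDAR-to-camera extrinsics (R, t);
    # synchronization and occlusion handling are deliberately left out.
    import numpy as np

    def project_lidar_to_image(points_lidar, K, R, t, image):
        """Return (pixel coords, depths, colors) for LIDAR points visible in the image.

        points_lidar : (N, 3) array of 3D points in the LIDAR frame
        K            : (3, 3) camera intrinsic matrix
        R, t         : rotation (3, 3) and translation (3,) from LIDAR to camera frame
        image        : (H, W, 3) color image
        """
        # Transform points into the camera frame.
        points_cam = points_lidar @ R.T + t

        # Keep only points in front of the camera (positive depth).
        points_cam = points_cam[points_cam[:, 2] > 0.0]

        # Perspective projection with the pinhole model.
        uvw = points_cam @ K.T
        uv = uvw[:, :2] / uvw[:, 2:3]          # divide by depth
        px = np.round(uv).astype(int)

        # Discard projections that fall outside the image bounds.
        h, w = image.shape[:2]
        inside = (px[:, 0] >= 0) & (px[:, 0] < w) & (px[:, 1] >= 0) & (px[:, 1] < h)
        px, depths = px[inside], points_cam[inside, 2]

        # Sample the image color at each projected pixel (point-cloud coloring).
        colors = image[px[:, 1], px[:, 0]]
        return px, depths, colors

Note that this naive per-point projection can assign the color of a foreground surface to a LIDAR point the camera cannot actually see, since the two sensors observe the scene from different viewpoints; resolving such cases, for instance by keeping only the nearest depth per pixel, is the kind of occlusion reasoning the abstract refers to.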


    Title: Fusing vision and LIDAR - Synchronization, correction and occlusion reasoning

    Publication date: 2010-06-01

    Size: 1859641 bytes

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English



    Similar titles:

    Fusing Vision and LIDAR - Synchronization, Correction and Occlusion Reasoning, pp. 388-393
    Schneider, S. / Himmelsbach, M. / Luettel, T. et al. | British Library Conference Proceedings | 2010

    Fusing LIDAR and vision for autonomous dirt road following. Incorporating a visual feature into the tentacles approach
    Manz, Michael / Himmelsbach, Michael / Luettel, Thorsten et al. | Tema Archive | 2009

    3D occlusion reasoning for accident avoidance
    Kobashi, Atsuhide / Witwicki, Stefan / Ostafew, Christopher et al. | European Patent Office | 2025

    3D Occlusion Reasoning for Accident Avoidance
    Kobashi, Atsuhide / Witwicki, Stefan / Ostafew, Christopher et al. | European Patent Office | 2021