Depth completion is the task of reconstructing dense depth images from sparse LiDAR data. LiDAR depth completion, for which LiDAR data is the only input, is an ill-posed and challenging problem owing to the underlying properties of LiDAR data: extremely few points, presence of discontinuities, and absence of texture information. Accordingly, most approaches depend heavily on guiding color images, which leads to unsatisfactory results when those images are degraded. To alleviate the dependency on color images while still leveraging this information during training, we present a deep convolutional neural network (CNN) consisting of depth and edge CNNs connected via knowledge transfer. To compensate for the limitations of LiDAR data, we design the edge CNN to learn a gradient depth image from a powerful teacher network through knowledge distillation. Since the teacher network is trained with color images, color-embedded information can be obtained at test time even if color images are not used as an input. We further propose a self-distillation method for transferring the color-embedded features from the edge CNN to the depth CNN. Enforcing the depth features to contain edge information that is hardly observed in LiDAR data enables the depth CNN to generate more edge-attentive and structure-preserving results. Our method shows remarkable results in outdoor and indoor environments on the KITTI and NYU-Depth-V2 datasets. Experiments performed with low-channel LiDAR data on KITTI and few depth points on NYU-Depth-V2 show that our method is robust to data sparsity and applicable in various scenarios.
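A minimal sketch of the training scheme described in the abstract, assuming a PyTorch setup: a frozen teacher trained with color images supervises an edge CNN via a knowledge-distillation loss, and the depth CNN's features are aligned to the edge CNN's color-embedded features via a self-distillation loss. The network definitions, feature shapes, and loss weights (SmallCNN, lambda_kd, lambda_sd) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallCNN(nn.Module):
    """Placeholder backbone: returns an intermediate feature map and a 1-channel output."""
    def __init__(self, in_ch):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(32, 32, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(32, 1, 3, padding=1)
    def forward(self, x):
        feat = self.enc(x)
        return feat, self.head(feat)

# Hypothetical components: a frozen teacher trained with color images, a student
# edge CNN fed only sparse LiDAR depth, and a depth CNN that predicts dense depth.
teacher = SmallCNN(in_ch=4).eval()     # pretrained on RGB + sparse depth (frozen)
edge_cnn = SmallCNN(in_ch=1)           # LiDAR-only student for gradient depth images
depth_cnn = SmallCNN(in_ch=1)          # LiDAR-only dense-depth predictor

lambda_kd, lambda_sd = 1.0, 0.1        # assumed loss weights

def training_step(sparse_depth, rgb, gt_depth):
    # The teacher sees color images only at training time.
    with torch.no_grad():
        _, teacher_grad = teacher(torch.cat([rgb, sparse_depth], dim=1))

    # Knowledge distillation: the edge CNN mimics the teacher's gradient depth image.
    edge_feat, edge_grad = edge_cnn(sparse_depth)
    loss_kd = F.l1_loss(edge_grad, teacher_grad)

    # Self-distillation: depth features are pushed toward the color-embedded edge features.
    depth_feat, pred_depth = depth_cnn(sparse_depth)
    loss_sd = F.mse_loss(depth_feat, edge_feat.detach())

    # Supervised depth loss on valid ground-truth pixels.
    mask = gt_depth > 0
    loss_depth = F.l1_loss(pred_depth[mask], gt_depth[mask])

    return loss_depth + lambda_kd * loss_kd + lambda_sd * loss_sd
```

At test time only the sparse depth input is required; the color-dependent teacher and the RGB branch are used solely to shape the students' features during training.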


    Title:

    LiDAR Depth Completion Using Color-Embedded Information via Knowledge Distillation


    Contributors:
    Hwang, Sangwon (author) / Lee, Junhyeop (author) / Kim, Woo Jin (author) / Woo, Sungmin (author) / Lee, Kyungjae (author) / Lee, Sangyoun (author)

    Published in:

    Publication date:

    2022-09-01


    Format / extent:

    3841073 bytes


    Media type:

    Journal article


    Format:

    Electronic resource


    Language:

    English



    Deterministic Guided LiDAR Depth Map Completion

    Krauss, Bryan / Schroeder, Gregory / Gustke, Marko et al. | IEEE | 2021


    Self-Supervised Depth Completion From Direct Visual-LiDAR Odometry in Autonomous Driving

    Song, Zhenbo / Lu, Jianfeng / Yao, Yazhou et al. | IEEE | 2022


    LIDAR and Monocular Camera Fusion: On-road Depth Completion for Autonomous Driving

    Fu, Chen / Mertz, Christoph / Dolan, John M. | IEEE | 2019


    Real-time Depth Completion using Radar and Camera

    Abdulaaty, Omar / Schroeder, Gregory / Hussein, Ahmed et al. | IEEE | 2022


    Sparse Auxiliary Network for Depth Completion

    GUIZILINI VITOR / AMBRUS RARES A / GAIDON ADRIEN DAVID | European Patent Office | 2022

    Free access