Most existing LiDAR-inertial odometry (LIO) systems rely heavily on the assumption of a static environment, which prevents them from exploiting dynamic objects to enhance localization. Moreover, accurate tracking of surrounding objects is crucial for emerging applications such as autonomous driving and multi-robot collaboration. To this end, we present LIO-LOT, a tightly coupled LiDAR-inertial odometry and multi-object tracking (MOT) system. In our system, each movable object is represented by a 3-D bounding box derived from an object detector. These objects are then associated with historical trajectories through a hybrid matching strategy that integrates detection and geometric feature information. Considering tracking continuity and motion consistency, the associated objects are further categorized as high-confidence or low-confidence. Building on this foundation, the two categories of objects are flexibly coupled with inertial measurement unit (IMU) pre-integration measurements and static scene structures to optimize the poses of both the tracked objects and the ego-vehicle within a factor graph optimization framework. In the implementation, the aggregated information of high- and low-confidence objects is fully leveraged to form the object-related factors, which are hierarchically optimized together with LIO, enhancing system stability when object detections are poor. The system's performance is extensively evaluated on the KITTI and nuScenes datasets. Experimental results demonstrate that the proposed system outperforms other state-of-the-art methods in terms of ego-localization and multi-object tracking, particularly in highly dynamic scenarios involving occlusion and truncation.
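A minimal sketch of the factor-graph coupling described in the abstract is given below. This is not the authors' implementation: it assumes the GTSAM Python bindings, uses hypothetical key families X(k) for ego poses and L(k) for object poses, and substitutes simple BetweenFactorPose3 constraints for the paper's IMU pre-integration, static-scene, and object-observation factors, with a smoothness constraint standing in for the object motion model.

    # Hedged sketch (not the authors' code): joint optimization of ego and object
    # poses in one factor graph, assuming the GTSAM Python bindings are available.
    import numpy as np
    import gtsam
    from gtsam.symbol_shorthand import X, L  # X(k): ego pose, L(k): object pose (assumed key layout)

    graph = gtsam.NonlinearFactorGraph()
    initial = gtsam.Values()

    prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1] * 3 + [0.05] * 3))
    odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2] * 3 + [0.10] * 3))
    obs_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.3] * 3 + [0.20] * 3))

    # Anchor the first ego pose (stand-in for the LIO prior / IMU pre-integration factors).
    graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_noise))
    initial.insert(X(0), gtsam.Pose3())

    for k in range(1, 3):
        # Relative ego motion between consecutive scans (e.g., from scan matching and IMU).
        ego_delta = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0))
        graph.add(gtsam.BetweenFactorPose3(X(k - 1), X(k), ego_delta, odom_noise))
        initial.insert(X(k), initial.atPose3(X(k - 1)).compose(ego_delta))

        # Object-observation factor: detected 3-D box pose expressed in the ego frame.
        obj_in_ego = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(5.0, 2.0, 0.0))
        graph.add(gtsam.BetweenFactorPose3(X(k), L(k), obj_in_ego, obs_noise))
        initial.insert(L(k), initial.atPose3(X(k)).compose(obj_in_ego))

        if k > 1:
            # Smoothness between consecutive object poses, a simplified stand-in
            # for the paper's object motion-consistency constraint.
            obj_delta = gtsam.Pose3(gtsam.Rot3(), gtsam.Point3(1.0, 0.0, 0.0))
            graph.add(gtsam.BetweenFactorPose3(L(k - 1), L(k), obj_delta, odom_noise))

    result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
    print(result.atPose3(X(2)).translation(), result.atPose3(L(2)).translation())

In the actual system, object-related factors for high- and low-confidence objects would be weighted differently and optimized hierarchically with the LIO factors; here both the weighting and the hierarchy are collapsed into a single batch optimization for brevity.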


    Title:

    LIO-LOT: Tightly-Coupled Multi-Object Tracking and LiDAR-Inertial Odometry


    Contributors:
    Li, Xingxing (author) / Yan, Zhuohao (author) / Feng, Shaoquan (author) / Xia, Chunxi (author) / Li, Shengyu (author) / Zhou, Yuxuan (author)


    Publication date:

    01.01.2025


    Format / Extent:

    6270034 bytes




    Media type:

    Journal article


    Format:

    Electronic resource


    Language:

    English



    Unified multi-modal landmark tracking for tightly coupled lidar-visual-inertial odometry

    Wisth, D / Camurri, M / Das, S et al. | BASE | 2022

    Free access

    Hierarchical Distribution-Based Tightly-Coupled LiDAR Inertial Odometry

    Wang, Chengpeng / Cao, Zhiqiang / Li, Jianjie et al. | IEEE | 2024


    InLIOM: Tightly-Coupled Intensity LiDAR Inertial Odometry and Mapping

    Wang, Hanqi / Liang, Huawei / Li, Zhiyuan et al. | IEEE | 2024