Online multi-object tracking (MOT) is essential for high-level spatial reasoning and path planning in autonomous and highly automated vehicles. In this paper, we present a modular framework for tracking multiple objects (vehicles), capable of accepting object proposals from different sensor modalities (vision and range) and from a variable number of sensors, to produce continuous object tracks. This work generalizes the MDP framework for MOT proposed by Xiang et al., with two key extensions. First, we track objects across multiple cameras and across different sensor modalities. This is done by fusing object proposals across sensors accurately and efficiently. Second, the objects of interest (targets) are tracked directly in real-world coordinates. This is a departure from traditional techniques, where objects are simply tracked in the image plane. Doing so allows the tracks to be readily used by an autonomous agent for navigation and related tasks. To verify the effectiveness of our approach, we test it on real-world highway data collected with a heavily sensorized testbed capable of capturing full-surround information. We demonstrate that our framework is well-suited to tracking objects through entire maneuvers around the ego-vehicle, some of which take several minutes to complete. We also leverage the modularity of our approach to study how including or excluding different sensors, varying the total number of sensors, and the quality of object proposals affect the final tracking result.
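
As a rough illustration of the multi-sensor proposal fusion described in the abstract, the sketch below shows one simple way to merge camera and LiDAR object proposals once they are expressed in a common ego-vehicle frame. This is a minimal, assumption-laden Python example, not the paper's implementation; all function names, extrinsics, and the gating threshold are illustrative.

    # Minimal sketch (assumed, not the paper's method): fuse object proposals
    # from multiple sensors by expressing them in a common ego-vehicle frame
    # and greedily merging proposals that fall within a gating distance.
    import numpy as np

    def to_ego_frame(position_sensor, R_sensor_to_ego, t_sensor_to_ego):
        """Transform a 3D position from a sensor frame into the ego-vehicle frame."""
        return R_sensor_to_ego @ np.asarray(position_sensor, dtype=float) + t_sensor_to_ego

    def fuse_proposals(proposals, gate=1.5):
        """Greedily merge proposals (dicts with 'pos' in the ego frame and 'score')
        whose centers lie within `gate` meters, averaging centers and keeping the
        higher confidence score. The 1.5 m gate is an arbitrary placeholder."""
        fused = []
        for p in sorted(proposals, key=lambda q: -q["score"]):
            for f in fused:
                if np.linalg.norm(p["pos"] - f["pos"]) < gate:
                    f["pos"] = (f["pos"] + p["pos"]) / 2.0   # simple center averaging
                    f["score"] = max(f["score"], p["score"])
                    break
            else:
                fused.append({"pos": p["pos"].copy(), "score": p["score"]})
        return fused

    # Example: a LiDAR proposal and a camera proposal of the same vehicle.
    R, t = np.eye(3), np.array([1.2, 0.0, 1.6])              # assumed extrinsics
    lidar_prop = {"pos": np.array([12.1, 3.4, 0.0]), "score": 0.9}
    cam_prop = {"pos": to_ego_frame([10.8, 3.5, -1.6], R, t), "score": 0.7}
    print(fuse_proposals([lidar_prop, cam_prop]))   # one fused proposal in the ego frame

In practice the association would likely use 3D box overlap and per-sensor noise models rather than a fixed Euclidean gate, but the key idea of projecting every proposal into one real-world frame before fusion carries over.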


    Title:

    No Blind Spots: Full-Surround Multi-Object Tracking for Autonomous Vehicles Using Cameras and LiDARs


    Contributors:

    Published in:

    Publication date:

    2019-12-01


    Format / Extent:

    3057907 bytes


    Media type:

    Journal article


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    Online Intelligent Calibration of Cameras and LiDARs for Autonomous Driving Systems*

    Xu, Hanbo / Lan, Gongjin / Wu, Shaoguan et al. | IEEE | 2019


    Self-Calibration of Multiple LiDARs for Autonomous Vehicles

    Zhang, Zherui / Fu, Chen / Dong, Chiyu et al. | IEEE | 2021


    Multi-task near-field perception for autonomous driving using surround-view fisheye cameras

    Ravi Kumar, Varun / Technische Universität Ilmenau | TIBKAT | 2021

    Open access

    Multi-Object Tracking For Autonomous Vehicles

    RADHA HAYDER / PANG SU | Europäisches Patentamt | 2022

    Open access

    Multi-object tracking for autonomous vehicles

    RADHA HAYDER / PANG SU | Europäisches Patentamt | 2024

    Open access