In this paper we propose a fast fully convolutional neural network for panoptic segmentation that provides an accurate semantic and instance-level representation of the environment in 2D space. We tackle panoptic segmentation as a dense classification problem and generate masks for stuff classes as well as for each instance of the things classes. Our network employs a shared backbone and Feature Pyramid Network for multi-scale feature extraction, which we extend with dual decoders that learn background- and foreground-specific masks. Guided by object proposals, the panoptic head assembles location-sensitive prototype masks using a learned weighting scheme. Our solution runs in real time, at 82 ms on high-resolution images, making it suitable for robotic applications and automated driving. Extensive experiments on the Cityscapes dataset demonstrate that our panoptic segmentation network is robust and accurate, achieving 57.3% PQ and 76.9% mIoU.
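The panoptic head described in the abstract assembles each instance mask as a learned weighted combination of shared prototype masks. A minimal sketch of that idea is given below, assuming NumPy arrays as stand-ins for the network outputs; the function name assemble_instance_masks, the array shapes, and the sigmoid-plus-threshold step are illustrative assumptions, not the paper's actual implementation.

import numpy as np

def sigmoid(x):
    # Logistic function; adequate for this toy example.
    return 1.0 / (1.0 + np.exp(-x))

def assemble_instance_masks(prototypes, coefficients, threshold=0.5):
    # prototypes:   (K, H, W) shared prototype masks from the decoder (assumed shapes).
    # coefficients: (N, K) learned per-proposal weights, one row per object proposal.
    # Returns a boolean (N, H, W) array of binary instance masks.
    K, H, W = prototypes.shape
    logits = coefficients @ prototypes.reshape(K, H * W)   # (N, H*W) linear combination
    masks = sigmoid(logits).reshape(-1, H, W)
    return masks > threshold

# Toy usage with random arrays standing in for network outputs.
rng = np.random.default_rng(0)
protos = rng.standard_normal((8, 64, 128))   # K = 8 prototypes
coeffs = rng.standard_normal((3, 8))         # 3 object proposals
instance_masks = assemble_instance_masks(protos, coeffs)
print(instance_masks.shape)                  # (3, 64, 128)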



    Title:
    Real-Time Panoptic Segmentation with Prototype Masks for Automated Driving

    Contributors:

    Publication date:
    2020-10-19

    Size:
    1954233 bytes

    Type of media:
    Conference paper

    Type of material:
    Electronic Resource

    Language:
    English



    Multi-task Network for Panoptic Segmentation in Automated Driving

    Petrovai, Andra / Nedevschi, Sergiu | IEEE | 2019


    The Effect of Camera Data Degradation Factors on Panoptic Segmentation for Automated Driving

    Wang, Yiting / Zhao, Haonan / Debattista, Kurt et al. | IEEE | 2023


    RT-K-Net: Revisiting K-Net for Real-Time Panoptic Segmentation

    Schön, Markus / Buchholz, Michael / Dietmayer, Klaus | IEEE | 2023


    Location-Guided LiDAR-Based Panoptic Segmentation for Autonomous Driving

    Xian, Guozeng / Ji, Changyun / Zhou, Lin et al. | IEEE | 2023


    Infrastructure Analysis Using Panoptic Segmentation

    Schulter, Samuel / Garg, Sparsh | European Patent Office | 2023
