Precise scene understanding based on perception sensors' data is important for assisted and automated driving (AAD) functions, to enable accurate decision-making processes and safe navigation. Among the various perception tasks using camera images (e.g. object detection, semantic segmentation), panoptic segmentation shows promising scene understanding capability in terms of recognizing and classifying different types of objects, imminent obstacles, and drivable space at the pixel level. While current panoptic segmentation methods exhibit good potential for AAD perception under ‘ideal’ conditions, there are no systematic studies investigating the effects that various degradation factors can have on the quality of the data generated by automotive cameras. Therefore, in this paper, we consider 5 categories of camera data degradation models, namely light level, adverse weather, internal sensor noise, motion blur, and compression artefacts. These 5 categories comprise 11 potential degradation models with different degradation levels. Based on these 11 models and multiple degradation levels, we synthesize an augmented version of Cityscapes, named Degraded-Cityscapes (D-Cityscapes). Moreover, for the environmental light level, we propose a new synthesis method combining generative adversarial learning and zero-reference deep curve estimation to simulate 3 degraded light levels: low light, night light, and extreme light. To compare the effects of the implemented camera degradation factors, we run extensive tests using a panoptic segmentation network (i.e. EfficientPS), quantifying how its performance metrics vary when the data are degraded. Based on the evaluation results, we demonstrate that extreme snow, blur, and light are the most threatening conditions for panoptic segmentation in AAD, while EfficientPS copes well with light fog, compression, and blur, which provides insights for future research directions.
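
To give a concrete feel for the degradation categories listed in the abstract, the short Python sketch below approximates three of them (internal sensor noise, motion blur, and compression artefacts) with generic OpenCV/NumPy operations. It is an illustrative assumption only, not the authors' D-Cityscapes generation pipeline; the function names, parameter values, and file names are hypothetical.

    # Illustrative sketch only: generic stand-ins for three degradation categories
    # (sensor noise, motion blur, compression artefacts); NOT the paper's pipeline.
    import cv2
    import numpy as np

    def add_gaussian_noise(img, sigma=10.0):
        # Additive Gaussian noise as a simple proxy for internal sensor noise.
        noise = np.random.normal(0.0, sigma, img.shape).astype(np.float32)
        return np.clip(img.astype(np.float32) + noise, 0, 255).astype(np.uint8)

    def add_motion_blur(img, kernel_size=9):
        # Horizontal motion blur via convolution with a normalised line kernel.
        kernel = np.zeros((kernel_size, kernel_size), dtype=np.float32)
        kernel[kernel_size // 2, :] = 1.0 / kernel_size
        return cv2.filter2D(img, -1, kernel)

    def add_jpeg_artefacts(img, quality=15):
        # Compression artefacts from a low-quality JPEG encode/decode round trip.
        ok, buf = cv2.imencode(".jpg", img, [cv2.IMWRITE_JPEG_QUALITY, quality])
        return cv2.imdecode(buf, cv2.IMREAD_COLOR)

    if __name__ == "__main__":
        img = cv2.imread("cityscapes_frame.png")  # hypothetical input frame
        if img is not None:
            degraded = add_jpeg_artefacts(add_motion_blur(add_gaussian_noise(img)))
            cv2.imwrite("degraded_frame.png", degraded)

In the paper itself, degradations of this kind are applied at several severity levels before re-evaluating EfficientPS on the resulting images.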


    Title:

    The Effect of Camera Data Degradation Factors on Panoptic Segmentation for Automated Driving


    Contributors:

    Wang, Yiting (author) / Zhao, Haonan (author) / Debattista, Kurt (author) / Donzella, Valentina (author)


    Publication date:

    2023-09-24


    Format / extent:

    657485 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    Multi-task Network for Panoptic Segmentation in Automated Driving

    Petrovai, Andra / Nedevschi, Sergiu | IEEE | 2019



    Location-Guided LiDAR-Based Panoptic Segmentation for Autonomous Driving

    Xian, Guozeng / Ji, Changyun / Zhou, Lin et al. | IEEE | 2023


    Infrastructure Analysis Using Panoptic Segmentation

    Schulter, Samuel / Garg, Sparsh | European Patent Office | 2023


    Panoptic Based Camera and Lidar Fusion for Distance Estimation in Autonomous Driving Vehicles

    Jose, Edwin / P, Aparna M / Patil, Mrinalini et al. | British Library Conference Proceedings | 2022