This work addresses the problem of semantic scene understanding under foggy road conditions. Although marked progress has been made in semantic scene understanding in recent years, it has mainly concentrated on clear-weather outdoor scenes. Extending semantic segmentation methods to adverse weather conditions like fog is crucial for outdoor applications such as self-driving cars. In this paper, we propose a novel method that uses purely synthetic data to improve performance on unseen real-world foggy scenes captured in the streets of Zurich and its surroundings. Our results highlight the potential and power of photo-realistic synthetic images for training and, especially, fine-tuning deep neural nets. Our contributions are threefold: 1) we create Foggy Synscapes, a purely synthetic, high-quality foggy dataset of 25,000 unique outdoor scenes, which we plan to release publicly; 2) we show that with this data we outperform previous approaches on real-world foggy test data; 3) we show that combining our data with previously used data further improves performance on real-world foggy data.
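
This record does not detail the paper's fog-rendering pipeline, but as a rough, hypothetical illustration of how foggy training images are commonly synthesized from clear images with known per-pixel depth, the sketch below applies the standard optical fog model used in prior foggy-scene work (the actual Foggy Synscapes rendering may be considerably more sophisticated). The function name and the beta/airlight values are illustrative assumptions, not values from the paper.

    import numpy as np

    def add_homogeneous_fog(clear_rgb, depth_m, beta=0.06, airlight=0.92):
        """Overlay homogeneous fog on a clear image via the standard
        optical model I = J * t + A * (1 - t), with transmittance
        t = exp(-beta * depth).

        clear_rgb -- float array in [0, 1], shape (H, W, 3): clear image J
        depth_m   -- float array, shape (H, W): per-pixel distance in meters
        beta      -- attenuation coefficient in 1/m (larger = denser fog);
                     illustrative value, not from the paper
        airlight  -- atmospheric light A, assumed constant and gray;
                     illustrative value, not from the paper
        """
        t = np.exp(-beta * depth_m)[..., None]       # (H, W, 1) transmittance map
        return clear_rgb * t + airlight * (1.0 - t)  # fogged image I

Under this model, meteorological optical range (visibility) is roughly 2.996 / beta, so beta = 0.06 corresponds to about 50 m of visibility.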



    Title:

    Semantic Understanding of Foggy Scenes with Purely Synthetic Data


    Contributors:


    Publication date:

    2019-10-01


    Size:

    3012850 bytes


    Type of media:

    Conference paper


    Type of material:

    Electronic Resource


    Language:

    English



    Similar titles:

    Visibility Enhancement for Roads with Foggy or Hazy Scenes

    Tan, Robby T. / Pettersson, Niklas / Petersson, Lars | IEEE | 2007



    Foggy forecast

    Cotey, Angela | IuD Bahn | 2007


    Object Detection in Foggy Scenes by Embedding Depth and Reconstruction into Domain Adaptation

    Yang, Xin / Mi, Michael Bi / Yuan, Yuan et al. | British Library Conference Proceedings | 2023