Road marking recognition has advanced considerably in recent years along with the rapid development of deep learning. Despite this progress, existing methods often depend on unrepresentative datasets and constrained conditions. In this paper, to overcome these drawbacks, we propose an alternative method that achieves higher accuracy and generates high-quality samples for data augmentation. Our framework makes two major contributions: 1) the proposed deblurring network, built on generative adversarial networks (GANs), successfully recovers a clean road marking from a blurred one; 2) the proposed data augmentation method, based on mutual information, preserves and learns the semantic context of the given dataset. We construct and train a class-conditional GAN to enlarge the training set, which improves recognition of the target classes. Experimental results show that our framework generates deblurred clean samples from blurry ones and outperforms other methods even on unconstrained road marking datasets.
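
To make the augmentation idea concrete, below is a minimal PyTorch sketch of a class-conditional GAN of the kind the abstract describes for enlarging the training set. Everything here is an assumption for illustration: the class count, crop size, network widths, and the simple one-hot conditioning are not taken from the paper, and the paper's mutual-information objective and deblurring network are not shown.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical settings; the paper does not specify these values.
NUM_CLASSES = 10   # number of road marking classes (assumed)
LATENT_DIM = 100   # noise dimension (assumed)
IMG_SIZE = 32      # side length of a grayscale marking crop (assumed)

class Generator(nn.Module):
    """Maps (noise, class label) to a synthetic road marking crop."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_CLASSES, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, 512),
            nn.ReLU(inplace=True),
            nn.Linear(512, IMG_SIZE * IMG_SIZE),
            nn.Tanh(),  # pixel values in [-1, 1]
        )

    def forward(self, z, labels):
        # Class-conditioning: concatenate a one-hot label to the noise vector.
        onehot = F.one_hot(labels, NUM_CLASSES).float()
        out = self.net(torch.cat([z, onehot], dim=1))
        return out.view(-1, 1, IMG_SIZE, IMG_SIZE)

class Discriminator(nn.Module):
    """Scores (image, class label) pairs as real or generated."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(IMG_SIZE * IMG_SIZE + NUM_CLASSES, 512),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Linear(512, 1),
            nn.Sigmoid(),
        )

    def forward(self, img, labels):
        onehot = F.one_hot(labels, NUM_CLASSES).float()
        flat = img.view(img.size(0), -1)
        return self.net(torch.cat([flat, onehot], dim=1))

# After adversarial training, augmentation is just class-targeted sampling:
G = Generator()
z = torch.randn(16, LATENT_DIM)
labels = torch.full((16,), 3, dtype=torch.long)  # e.g. a hypothetical class index
fake_crops = G(z, labels)   # 16 synthetic crops for that class
print(fake_crops.shape)     # torch.Size([16, 1, 32, 32])

The design point the abstract emphasizes is that conditioning on the class label lets the generator produce extra samples for exactly the marking categories that are underrepresented, which is what makes the enlarged training set useful for recognition.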





    Title:
    Unconstrained Road Marking Recognition with Generative Adversarial Networks

    Contributors:
    Lee, Younkwan (author) / Lee, Juhyun (author) / Hong, Yoojin (author) / Ko, YeongMin (author) / Jeon, Moongu (author)

    Publication date:
    2019-06-01

    Size:
    404641 bytes

    Type of media:
    Conference paper

    Type of material:
    Electronic Resource

    Language:
    English



    ROAD MARKING RECOGNITION DEVICE AND ROAD MARKING RECOGNITION PROGRAM

    UEDA YUSUKE / KAWASAKI NAOTERU / KUMANO TOSHIYA et al. | European Patent Office | 2015

    Free access


    Road marking recognition device

    KINOSHITA TOSHIKI / ITO TAKUMA / NAKAMURA SATOSHI et al. | European Patent Office | 2020

    Free access

    ROAD MARKING RECOGNITION DEVICE

    KINOSHITA TOSHIKI / ITO TAKUMA / NAKAMURA SATOSHI et al. | European Patent Office | 2018

    Free access