Vehicular detection is an important research area in the field of target detection. Accurate vehicle recognition models for both day and night conditions will make future automated driving technologies and driver assistance systems more reliable. Our research proposes a novel approach for nighttime vehicle detection. The system consists of two modules: one for image translation and one for vehicle detection. Given a nighttime image, the image translation module uses a Generative Adversarial Network as its base network to produce a daytime image as output. Faster R-CNN and YOLOv5 algorithms are then used in the vehicle detection phase to extract daytime features and identify vehicles. The YOLOv5 model achieves a detection accuracy of 96.75%.
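The following minimal sketch illustrates the two-stage pipeline described in the abstract: a GAN generator translates a nighttime frame into a daytime-style frame, and a pretrained YOLOv5 model then detects vehicles in the translated image. The generator architecture and the weight file "night2day_generator.pth" are hypothetical placeholders, not the paper's actual network; only the overall structure of the approach is shown.

import torch
import torch.nn as nn
from PIL import Image
from torchvision import transforms

class Generator(nn.Module):
    """Placeholder encoder-decoder generator (stand-in for the paper's GAN)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

def detect_vehicles_at_night(image_path: str):
    # Stage 1: night-to-day image translation with the (hypothetical) generator.
    generator = Generator()
    generator.load_state_dict(torch.load("night2day_generator.pth"))  # assumed weights
    generator.eval()

    to_tensor = transforms.Compose([transforms.Resize((256, 256)), transforms.ToTensor()])
    night = to_tensor(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        day = generator(night)  # Tanh output in [-1, 1]
    day_img = transforms.ToPILImage()((day.squeeze(0) + 1) / 2)

    # Stage 2: vehicle detection on the translated frame with YOLOv5 (torch.hub).
    yolo = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
    results = yolo(day_img)
    detections = results.pandas().xyxy[0]
    # Keep only the COCO vehicle classes.
    return detections[detections["name"].isin(["car", "motorcycle", "bus", "truck"])]

if __name__ == "__main__":
    print(detect_vehicles_at_night("night_frame.jpg"))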





    Title:

    Deep Learning Based Night Time Vehicle Detection for Autonomous Cars using Generative Adversarial Network


    Contributors:
    Nisha, U.N (author) / Ranjani, G (author)


    Publication date:

    2022-11-24


    Format / Extent:

    1726650 bytes




    Media type:

    Article (Conference)


    Format:

    Electronic resource


    Language:

    English



    Crack Detection Based on Generative Adversarial Networks and Deep Learning

    Chen, Gongfa / Teng, Shuai / Lin, Mansheng et al. | Springer Verlag | 2022


    Learn Travel Time Distribution with Graph Deep Learning and Generative Adversarial Network

    Song, Xiaozhuang / Zhang, Chenhan / Yu, James J.Q. | IEEE | 2021



    Infrared unmanned aerial vehicle detection based on generative adversarial network data augmentation

    Gao, Yuan / Luo, Zijuan / Yu, Xuelian et al. | British Library Conference Proceedings | 2021


    Anomaly Detection Using Convolutional Neural Network and Generative Adversarial Network

    Mohanan, Amritha / Padathil Veerendrakumar, Praveen / Padmanabha Rajeswari, Priyanka Pillai et al. | SAE Technical Papers | 2023