The driver's activities and the resulting distraction are relevant for all levels of vehicle automation and are especially important for take-over scenarios in partially automated vehicles. To this end, we investigate graph neural networks for pose-based driver activity recognition. We focus on additional input modalities, such as interior elements and objects, and investigate how this data can be integrated into an activity recognition model. We test our approach on the Drive&Act dataset [1]. For this purpose, we densely annotate and publish bounding boxes for the dynamic objects contained in the dataset. Our results show that adding these input modalities boosts the recognition results for classes related to interior elements and objects by a large margin, closing the gap to popular image-based methods.
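
A minimal sketch of the idea described in the abstract, assuming PyTorch and not reflecting the authors' actual implementation: a per-frame graph is built from 2D pose keypoints and object bounding-box centers, joints are connected to nearby objects, and the pooled node features are classified with two plain graph-convolution layers. The node features, the proximity rule, the layer sizes, and the class count below are illustrative assumptions.

import torch
import torch.nn as nn

def build_graph(keypoints, boxes, dist_thresh=0.2):
    """keypoints: (J, 2) normalized joint positions; boxes: (K, 4) as (x1, y1, x2, y2).
    Returns node features of shape (J + K, 2) and a dense adjacency matrix."""
    centers = torch.stack([(boxes[:, 0] + boxes[:, 2]) / 2,
                           (boxes[:, 1] + boxes[:, 3]) / 2], dim=1)
    nodes = torch.cat([keypoints, centers], dim=0)   # joints first, then object centers
    n = nodes.shape[0]
    adj = torch.eye(n)                               # self-loops
    # Connect each joint to every object whose center lies within dist_thresh.
    # Skeleton (bone) edges between joints would be added here as well; omitted for brevity.
    dists = torch.cdist(keypoints, centers)          # (J, K) pairwise distances
    j_idx, k_idx = torch.nonzero(dists < dist_thresh, as_tuple=True)
    adj[j_idx, keypoints.shape[0] + k_idx] = 1.0
    adj[keypoints.shape[0] + k_idx, j_idx] = 1.0
    return nodes, adj

class GraphConv(nn.Module):
    """One symmetrically normalized graph convolution: H' = ReLU(D^-1/2 A D^-1/2 H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=1)
        norm = deg.rsqrt()
        adj_norm = norm[:, None] * adj * norm[None, :]
        return torch.relu(adj_norm @ self.lin(x))

class ActivityClassifier(nn.Module):
    """Two graph-conv layers, mean pooling over all nodes, linear classification head."""
    def __init__(self, num_classes, hidden=64):
        super().__init__()
        self.gc1 = GraphConv(2, hidden)
        self.gc2 = GraphConv(hidden, hidden)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, nodes, adj):
        h = self.gc2(self.gc1(nodes, adj), adj)
        return self.head(h.mean(dim=0))

# Usage with random stand-in data: 13 body joints and 2 detected objects.
keypoints = torch.rand(13, 2)
boxes = torch.rand(2, 4)
nodes, adj = build_graph(keypoints, boxes)
logits = ActivityClassifier(num_classes=34)(nodes, adj)   # 34 assumed: Drive&Act fine-grained classes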





    Title:

    Dynamic Interaction Graphs for Driver Activity Recognition


    Contributors:


    Publication date:

    20 September 2020


    Format / Extent:

    833934 bytes




    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    Action and Object Interaction Recognition for Driver Activity Classification

    Weyers, Patrick / Schiebener, David / Kummert, Anton | IEEE | 2019


    OPEN SET DRIVER ACTIVITY RECOGNITION

    Roitberg, Alina / Ma, Chaoxiang / Haurilet, Monica et al. | British Library Conference Proceedings | 2020


    Open Set Driver Activity Recognition

    Roitberg, Alina / Ma, Chaoxiang / Haurilet, Monica et al. | IEEE | 2020


    DRIVER STATE RECOGNITION APPARATUS, DRIVER STATE RECOGNITION SYSTEM, AND DRIVER STATE RECOGNITION METHOD

    YABUUCHI TOMOHIRO / AIZAWA TOMOYOSHI / HYUGA TADASHI et al. | Europäisches Patentamt | 2019

    Free access

    DRIVER STATE RECOGNITION APPARATUS, DRIVER STATE RECOGNITION SYSTEM, AND DRIVER STATE RECOGNITION METHOD

    AOI HATSUMI / AIZAWA TOMOYOSHI / HYUGA TADASHI et al. | Europäisches Patentamt | 2019

    Free access