The driver's activities and the resulting distraction are relevant for all levels of vehicle automation, and especially important for take-over scenarios in partially automated vehicles. We investigate graph neural networks for pose-based driver activity recognition, focusing on additional input modalities such as interior elements and objects and on how this data can be integrated into an activity recognition model. We evaluate our approach on the Drive&Act dataset [1], for which we densely annotate and publish bounding boxes of the dynamic objects contained in the dataset. Our results show that adding these input modalities boosts the recognition results for classes related to interior elements and objects by a large margin, closing the gap to popular image-based methods.
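The abstract does not spell out how pose and object data are combined into a graph. Purely as a rough illustration of the general idea, the sketch below builds a per-frame interaction graph from 2D pose keypoints and object bounding-box centers, then runs one hand-rolled graph-convolution step. Everything here is an assumption for demonstration, not the paper's method: the function names (build_interaction_graph, gcn_layer), the toy 5-joint skeleton, and the proximity threshold used to create dynamic joint-object edges are all hypothetical.

```python
import numpy as np

def build_interaction_graph(keypoints, object_boxes, dist_thresh=0.2):
    """Build node features and an adjacency matrix for one frame.

    keypoints:    (K, 2) normalized 2D body-joint coordinates.
    object_boxes: (M, 4) normalized boxes (x1, y1, x2, y2) for dynamic
                  objects / interior elements.
    Hypothetical construction: every joint and every box center becomes a
    node; joints connect along a fixed skeleton, and an object connects to
    any joint closer than `dist_thresh` (a stand-in for hand-object
    interaction).
    """
    centers = np.stack([(object_boxes[:, 0] + object_boxes[:, 2]) / 2,
                        (object_boxes[:, 1] + object_boxes[:, 3]) / 2], axis=1)
    nodes = np.concatenate([keypoints, centers], axis=0)   # (K+M, 2)
    n = len(nodes)
    adj = np.eye(n)                                        # self-loops

    # Fixed skeleton edges over the first K nodes (toy 5-joint layout).
    for i, j in [(0, 1), (1, 2), (1, 3), (1, 4)]:
        adj[i, j] = adj[j, i] = 1.0

    # Dynamic joint-object edges based on spatial proximity.
    K = len(keypoints)
    for m, c in enumerate(centers):
        d = np.linalg.norm(keypoints - c, axis=1)
        for k in np.where(d < dist_thresh)[0]:
            adj[k, K + m] = adj[K + m, k] = 1.0
    return nodes, adj

def gcn_layer(x, adj, weight):
    """One graph-convolution step: D^-1/2 A D^-1/2 propagation + ReLU."""
    d_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
    norm = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(norm @ x @ weight, 0.0)

# Toy usage: 5 joints, 2 objects (e.g. a phone near a hand, a far bottle).
pose = np.array([[.5, .2], [.5, .4], [.5, .6], [.3, .45], [.7, .45]])
boxes = np.array([[.28, .40, .36, .50],    # near the left hand: gets an edge
                  [.05, .05, .15, .15]])   # far away: stays disconnected
nodes, adj = build_interaction_graph(pose, boxes)
w = np.random.default_rng(0).normal(size=(2, 16))
h = gcn_layer(nodes, adj, w)
print(h.shape)   # (7, 16) node embeddings for a downstream classifier
```

In a full pipeline one would stack several such layers, pool the node embeddings per frame, and feed the sequence into a temporal model for activity classification.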


    Title: Dynamic Interaction Graphs for Driver Activity Recognition

    Contributors:

    Publication date: 2020-09-20

    Size: 833934 bytes

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English



    Similar items:

    Action and Object Interaction Recognition for Driver Activity Classification

    Weyers, Patrick / Schiebener, David / Kummert, Anton | IEEE | 2019


    Open Set Driver Activity Recognition

    Roitberg, Alina / Ma, Chaoxiang / Haurilet, Monica et al. | IEEE | 2020


    DRIVER STATE RECOGNITION APPARATUS, DRIVER STATE RECOGNITION SYSTEM, AND DRIVER STATE RECOGNITION METHOD

    YABUUCHI TOMOHIRO / AIZAWA TOMOYOSHI / HYUGA TADASHI et al. | European Patent Office | 2019


    DRIVER STATE RECOGNITION APPARATUS, DRIVER STATE RECOGNITION SYSTEM, AND DRIVER STATE RECOGNITION METHOD

    AOI HATSUMI / AIZAWA TOMOYOSHI / HYUGA TADASHI et al. | European Patent Office | 2019
