Recognizing human actions is a core challenge for autonomous systems, since such systems increasingly share the same space with humans and must be able to recognize and assess human actions in real time. Training the corresponding data-driven algorithms requires a significant amount of annotated data. We demonstrate a pipeline that detects humans, estimates their pose, tracks them over time, and recognizes their actions in real time using standard monocular camera sensors. For action recognition, we transform noisy human pose estimates into an image-like format we call Encoded Human Pose Image (EHPI). This encoded representation can then be classified with standard methods from the computer vision community. With this simple procedure we achieve performance competitive with the state of the art in pose-based action recognition while ensuring real-time operation. In addition, we present a use case in the context of autonomous driving that demonstrates how such a system can be trained to recognize human actions using simulation data.
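
The abstract only sketches the EHPI idea at a high level. The snippet below is a minimal, illustrative reading of it: a window of tracked 2D joint positions is written into an image-like array (x and y coordinates as color channels, joints along one axis, frames along the other) and fed to a small CNN. All concrete choices here, including the joint count, window length, normalization scheme, network layout, and the names encode_ehpi and EhpiClassifier, are assumptions made for illustration and are not taken from the paper itself.

    # Illustrative sketch of the EHPI encoding and classification step described
    # in the abstract. Shapes, normalization and the network are assumptions.
    import numpy as np
    import torch
    import torch.nn as nn

    NUM_JOINTS = 15   # assumed skeleton size
    NUM_FRAMES = 32   # assumed temporal window
    NUM_ACTIONS = 5   # assumed number of action classes

    def encode_ehpi(pose_sequence: np.ndarray) -> np.ndarray:
        """Encode (T, J, 2) joint coordinates into a (3, J, T) image-like array.

        x and y coordinates are min-max normalized over the window and written
        into the first two channels; the third channel is left empty.
        """
        t, j, _ = pose_sequence.shape
        ehpi = np.zeros((3, j, t), dtype=np.float32)
        for c in range(2):  # channel 0 = x, channel 1 = y
            coords = pose_sequence[..., c]
            lo, hi = coords.min(), coords.max()
            ehpi[c] = ((coords - lo) / max(hi - lo, 1e-6)).T  # (J, T)
        return ehpi

    class EhpiClassifier(nn.Module):
        """Small CNN that treats the EHPI like a tiny RGB image."""
        def __init__(self, num_actions: int = NUM_ACTIONS):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, num_actions)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x).flatten(1))

    if __name__ == "__main__":
        poses = np.random.rand(NUM_FRAMES, NUM_JOINTS, 2)          # stand-in for tracked 2D poses
        ehpi = torch.from_numpy(encode_ehpi(poses)).unsqueeze(0)   # (1, 3, J, T)
        logits = EhpiClassifier()(ehpi)
        print(logits.shape)  # torch.Size([1, NUM_ACTIONS])

Because the encoded window is just a small image, any off-the-shelf image classifier could be substituted for the toy CNN above; that interchangeability is the point the abstract makes about reusing standard computer vision methods.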





    Title: Simple yet efficient real-time pose-based action recognition

    Contributors:

    Publication date: 2019-10-01

    Size: 2764142 bytes

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English



    Similar titles:

    Centroid based pose ratio for pedestrian action recognition
    Hariyono, Joko / Kang-Hyun Jo | IEEE | 2016

    Real-Time Body Pose Recognition Using 2D or 3D Haarlets
    Bergh, M. / Koller-Meier, E. / Gool, L. | British Library Online Contents | 2009

    Coupled Action Recognition and Pose Estimation from Multiple Views
    Yao, A. / Gall, J. / Gool, L. | British Library Online Contents | 2012

    Real-Time Pose Graph SLAM based on Radar
    Holder, Martin / Hellwig, Sven / Winner, Hermann | IEEE | 2019