Tracking articulated objects in image sequences remains a challenging problem, particularly in terms of the ability to localize the individual parts of an object given self-occlusions and changes in viewpoint. In this paper we propose a two-dimensional spatio-temporal modeling approach that handles both self-occlusions and changes in viewpoint. We use a Bayesian framework to combine pictorial structure spatial models with hidden Markov temporal models. Inference for these combined models can be performed using dynamic programming and sampling methods. We demonstrate the approach for the problem of tracking a walking person, using silhouette data taken from a single camera viewpoint. Walking provides both strong spatial (kinematic) and temporal (dynamic) constraints, enabling the method to track limb positions in spite of simultaneous self-occlusion and viewpoint change.
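
The abstract describes inference over a combination of a pictorial-structure spatial model and a hidden Markov temporal model, carried out with dynamic programming and sampling. As a purely illustrative sketch, not the authors' implementation, the following Python snippet shows Viterbi-style dynamic programming for a single part over a discretized set of candidate states: a per-frame spatial cost stands in for the pictorial-structure term and a transition cost stands in for the hidden Markov dynamics. All names and cost matrices here are hypothetical placeholders.

    # Illustrative sketch only: Viterbi-style dynamic programming over candidate
    # part states per frame. The spatial cost stands in for a pictorial-structure
    # likelihood and the transition cost for hidden Markov dynamics; both are
    # hypothetical placeholders, not the paper's actual model terms.
    import numpy as np

    def track_part(spatial_cost, transition_cost):
        """spatial_cost: (T, K) cost of each of K candidate states in each of T frames.
        transition_cost: (K, K) cost of moving between states across consecutive frames.
        Returns the minimum-cost state sequence of length T."""
        T, K = spatial_cost.shape
        dp = np.zeros((T, K))            # dp[t, k]: best cost of a path ending in state k at frame t
        back = np.zeros((T, K), int)     # backpointers for path recovery
        dp[0] = spatial_cost[0]
        for t in range(1, T):
            # arrival[j, k]: cost of reaching state k at frame t from state j at frame t - 1
            arrival = dp[t - 1][:, None] + transition_cost
            back[t] = np.argmin(arrival, axis=0)
            dp[t] = arrival[back[t], np.arange(K)] + spatial_cost[t]
        # Backtrack from the cheapest final state.
        path = [int(np.argmin(dp[-1]))]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return path[::-1]

    # Toy usage: 5 frames, 3 candidate states per frame, random costs.
    rng = np.random.default_rng(0)
    print(track_part(rng.random((5, 3)), rng.random((3, 3))))

In the paper's setting the spatial term couples multiple parts through a pictorial structure and sampling methods complement the dynamic programming; this sketch only illustrates the temporal chain for a single part.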


    Title:

    A unified spatio-temporal articulated model for tracking


    Contributors:

    Lan, X. / Huttenlocher, D.


    Publication date:

    2004-01-01


    Size:

    575,484 bytes


    Type of media:

    Conference paper


    Type of material:

    Electronic Resource


    Language:

    English



    Similar documents:

    A Unified Spatio-Temporal Articulated Model for Tracking

    Lan, X. / Huttenlocher, D. / IEEE Computer Society | British Library Conference Proceedings | 2004


    A Unified Spatio-Temporal Description Model of Environment for Intelligent Vehicles

    Wang, Sijia / Jiang, Kun / Xie, Shichao et al. | Springer Verlag | 2020


    A Unified Spatio-Temporal Description Model of Environment for Intelligent Vehicles

    Wang, Sijia / Jiang, Kun / Xie, Shichao et al. | TIBKAT | 2021


    A Unified Spatio-Temporal Model for Short-Term Traffic Flow Prediction

    Duan, Peibo / Mao, Guoqiang / Liang, Weifa et al. | IEEE | 2019


    A Unified Spatio-Temporal Description Model of Environment for Intelligent Vehicles

    Wang, Sijia / Jiang, Kun / Xie, Shichao et al. | British Library Conference Proceedings | 2021