We present a novel approach for visual tracking of structured behaviour as observed in human-computer interaction. An automatically acquired variable-length Markov model is used to represent the high-level structure and temporal ordering of gestures. Continuous estimation of hand posture is handled by combining the model with annealed particle filtering. The stochastic simulation updates, and automatically switches between, different model representations of hand posture that correspond to distinct gestures. The implementation executes in real time and demonstrates significant improvement in robustness over comparable methods. We provide a measurement of user performance when our method is applied to a Fitts' law drag-and-drop task, and an analysis of the effects of latency that it introduces.
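The abstract describes the core mechanism: a variable-length Markov model (VLMM) over gesture symbols supplies the high-level prior, and the tracker switches between posture models according to the gesture the VLMM predicts next. The Python sketch below illustrates that idea only; the class and function names, the gesture labels, and the simple count-based back-off scheme are assumptions, not the authors' code, and the annealed particle filter that the paper combines with the VLMM is omitted.

# Illustrative sketch only (not the authors' implementation): the paper publishes
# no code, so every name here (VLMM, switch_posture_model, the gesture labels)
# is hypothetical. It shows a variable-length Markov model over gesture symbols
# with back-off prediction, used to select which posture model to run next.
from collections import defaultdict

class VLMM:
    """Variable-length Markov model over discrete gesture symbols."""

    def __init__(self, max_order=3):
        self.max_order = max_order
        # counts[context][symbol] = number of times `symbol` followed `context`
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, sequence):
        # Accumulate counts for every context of length 0..max_order.
        for i, symbol in enumerate(sequence):
            for order in range(min(self.max_order, i) + 1):
                context = tuple(sequence[i - order:i])
                self.counts[context][symbol] += 1

    def next_distribution(self, history):
        # Back off from the longest matching context to the empty context.
        for order in range(min(self.max_order, len(history)), -1, -1):
            context = tuple(history[len(history) - order:])
            if context in self.counts:
                totals = self.counts[context]
                norm = sum(totals.values())
                return {s: c / norm for s, c in totals.items()}
        return {}

def switch_posture_model(vlmm, history, posture_models, default="idle"):
    """Pick the posture model for the gesture the VLMM rates most likely next."""
    dist = vlmm.next_distribution(history)
    best = max(dist, key=dist.get) if dist else default
    return posture_models.get(best, posture_models[default])

if __name__ == "__main__":
    # Toy gesture vocabulary; the posture "models" are stand-in labels here.
    training = ["idle", "point", "drag", "drop", "idle",
                "point", "drag", "drop", "idle"]
    model = VLMM(max_order=2)
    model.train(training)
    posture_models = {g: g + "-posture-model" for g in set(training)}
    print(switch_posture_model(model, ["point", "drag"], posture_models))
    # -> "drop-posture-model": the longest context ("point", "drag") predicts "drop"

In the full system described by the abstract, the selected posture model would parameterise the annealed particle filter's state space for the current frame, with the VLMM prediction acting as the prior over which gesture, and hence which representation, is active.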
Real-time Hand Tracking With Variable-Length Markov Models of Behaviour
2005-01-01
355423 bytes
Conference paper
Electronic Resource
English
A real-time hand tracker using variable-length Markov models of behaviour | British Library Online Contents | 2007
Learning Structured Behaviour Models using Variable Length Markov Models | British Library Conference Proceedings | 1999
Hand gesture recognition using a real-time tracking method and hidden Markov models | British Library Online Contents | 2003
Learning Variable-Length Markov Models of Behavior | British Library Online Contents | 2001