A 3D segment-tracking approach to recognizing human pose and gestures is presented. The author has previously developed and refined a stereo-based method, called the proximity space method, for acquiring and maintaining the track of object surfaces in 3-space. This method uses LoG-filtered images and relies solely on stereo measurement to spatially distinguish between objects in 3D. The objective of the work is to obtain useful state information about the shape, size, and pose of natural (unadorned) objects in their naturally cluttered environments; thus, the system neither requires nor benefits from special markers, colors, or other tailored artifacts. Recently, the author has extended the method to track multiple regions and segments of complex objects. The paper describes techniques for applying the proximity space method to a particularly interesting system: the human. Specifically, the author discusses the use of simple models for constraining proximity space behavior in order to track gestures as a person moves through a cluttered environment. It is demonstrated that, by observing the behavior of the model used to track the human's pose through time, different gestures can be easily recognized. The approach is illustrated through a discussion of gestures used to provide logical and spatial commands to a mobile robot.
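To make the abstract's pipeline concrete, the following is a minimal illustrative sketch, not the author's implementation: it LoG-filters a stereo pair, computes disparity, and updates a "proximity space" modeled here simply as a sphere in (x, y, disparity) whose center follows nearby stereo-matched surface pixels. All function names, parameters, and the choice of block-matching stereo are assumptions for illustration only.

    import numpy as np
    import cv2
    from scipy.ndimage import gaussian_laplace

    def log_filter(img, sigma=2.0):
        """Laplacian-of-Gaussian filtering, as used to pre-process the stereo images."""
        return gaussian_laplace(img.astype(np.float32), sigma=sigma)

    def update_proximity_space(center, radius, left, right):
        """One hypothetical tracking step: move the proximity-space center toward
        the centroid of stereo-matched pixels inside the current sphere."""
        # Disparity from block matching on the LoG-filtered pair (illustrative choice,
        # not necessarily the matcher used in the paper).
        stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        left_u8 = cv2.normalize(log_filter(left), None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        right_u8 = cv2.normalize(log_filter(right), None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        disp = stereo.compute(left_u8, right_u8).astype(np.float32) / 16.0

        ys, xs = np.mgrid[0:disp.shape[0], 0:disp.shape[1]]
        cx, cy, cd = center
        inside = ((xs - cx) ** 2 + (ys - cy) ** 2 + (disp - cd) ** 2) < radius ** 2
        inside &= disp > 0  # keep only valid stereo matches
        if not inside.any():
            return center  # no surface found: keep the previous estimate
        return (xs[inside].mean(), ys[inside].mean(), disp[inside].mean())

In the paper's terms, several such regions, constrained by a simple body model, would be tracked jointly, and gestures recognized from the model's behavior over time.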
3-D real-time gesture recognition using proximity spaces
1996-01-01
624254 bytes
Conference paper
Electronic Resource
English
3-D Real-Time Gesture Recognition Using Proximity Spaces
British Library Conference Proceedings | 1996
Real-time gesture recognition system and application
British Library Online Contents | 2002
Real-Time Hand Gesture Recognition for Robot Hand Interface
British Library Conference Proceedings | 2014