In this paper, we present a probabilistic tracking framework that combines sound and vision to achieve more robust and accurate tracking of multiple objects. In a cluttered or noisy scene, the measurements follow a non-Gaussian, multi-modal distribution. We apply a particle filter to track multiple people using combined audio and video observations. We apply our algorithm to tracking people, using stereo-based visual foreground detection and beamforming-based audio localization. Our model also accurately reflects the number of people present. We test the efficacy of our system on a sequence of multiple people moving and speaking in an indoor environment.
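The abstract describes a particle filter that fuses an audio likelihood (from beamforming) with a video likelihood (from foreground detection). As an illustration only, the sketch below shows one predict/update/resample step of a generic bootstrap particle filter with placeholder Gaussian observation models; the function name, noise parameters, and observation formats are assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def particle_filter_step(particles, weights, audio_bearing, video_pos,
                         motion_std=0.05, audio_std=0.3, video_std=0.1):
    """One predict/update/resample cycle fusing an audio bearing (radians,
    e.g. from a beamformer) with a 2-D position (e.g. from visual foreground
    detection). Observation models are illustrative Gaussian placeholders."""
    n = len(particles)

    # Predict: random-walk motion model on 2-D particle positions.
    particles = particles + np.random.normal(0.0, motion_std, particles.shape)

    # Update: combined likelihood is the product of audio and video likelihoods.
    bearings = np.arctan2(particles[:, 1], particles[:, 0])
    audio_lik = np.exp(-0.5 * ((bearings - audio_bearing) / audio_std) ** 2)
    video_lik = np.exp(-0.5 * np.sum(((particles - video_pos) / video_std) ** 2, axis=1))
    weights = weights * audio_lik * video_lik
    weights /= weights.sum() + 1e-12

    # Resample (multinomial) to counter weight degeneracy.
    idx = np.random.choice(n, size=n, p=weights)
    return particles[idx], np.full(n, 1.0 / n)
```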
A Probabilistic Framework for Multi-modal Multi-Person Tracking
01.06.2003
681058 bytes
Conference paper
Electronic resource
English
A multi-modal person perception framework for socially interactive mobile service robots
BASE | 2020
SIC_DB: Multi-Modal Database for Person Authentication
British Library Conference Proceedings | 1999
SIC DB: multi-modal database for person authentication
IEEE | 1999
Tracking Humans using Multi-modal Fusion
IEEE | 2005