We have developed a fast, robust 3D head tracking system based on rendering a texture-mapped cylinder. To handle variable frame-to-frame motion, the system uses motion templates that adapt to the current size of the motion increment. The relationship between measurable pixel energy and tracking error is used to design the parameters of the adaptive algorithm. To speed up processing and decouple rotational and translational motion, 2D positional information from the neckline is utilized. The confidence at each point is computed from the amount of information used in creating the texture map and re-rendering the face. The system can also handle large out-of-plane rotations via additional templates. If the tracker fails, it can recover using an independent face detection routine. We compare the results of our approach with the extensive results of a closely related technique.
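The abstract describes tracking the head by registering a texture-mapped cylinder against each new frame. The following is a minimal NumPy sketch of that general idea, assuming a pinhole camera, a small-angle pose increment, and a simple finite-difference optimiser; the function names, parameters, and synthetic test data are illustrative assumptions, not the paper's adaptive motion-template method.

import numpy as np

def cylinder_points(radius=1.0, height=2.0, n_theta=40, n_h=30):
    """Sample 3D points on the front half of a cylinder (the visible face region)."""
    theta = np.linspace(-np.pi / 2, np.pi / 2, n_theta)
    h = np.linspace(-height / 2, height / 2, n_h)
    T, H = np.meshgrid(theta, h)
    X = radius * np.sin(T)
    Y = H
    Z = radius * np.cos(T)
    return np.stack([X.ravel(), Y.ravel(), Z.ravel()], axis=1)

def project(points, pose, focal=500.0, center=(160.0, 120.0)):
    """Transform cylinder points by a small pose (rx, ry, rz, tx, ty, tz) and
    project them with a pinhole camera; small-angle rotation is adequate for
    frame-to-frame increments."""
    rx, ry, rz, tx, ty, tz = pose
    R = np.array([[1, -rz, ry],
                  [rz, 1, -rx],
                  [-ry, rx, 1]])
    p = points @ R.T + np.array([tx, ty, tz + 5.0])  # cylinder placed 5 units from camera
    u = focal * p[:, 0] / p[:, 2] + center[0]
    v = focal * p[:, 1] / p[:, 2] + center[1]
    return np.stack([u, v], axis=1)

def sample(image, uv):
    """Nearest-neighbour sampling of image intensities at projected points."""
    h, w = image.shape
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
    return image[v, u]

def track_step(template, frame, points, pose, n_iters=10, step=1e-6):
    """One frame-to-frame pose update: minimise the intensity residual between
    the texture template and the new frame by finite-difference gradient descent."""
    eps = np.array([1e-3, 1e-3, 1e-3, 1e-2, 1e-2, 1e-2])
    for _ in range(n_iters):
        r0 = sample(frame, project(points, pose)) - template
        grad = np.zeros(6)
        for k in range(6):
            dp = pose.copy()
            dp[k] += eps[k]
            rk = sample(frame, project(points, dp)) - template
            grad[k] = (np.sum(rk ** 2) - np.sum(r0 ** 2)) / eps[k]
        pose = pose - step * grad
    return pose

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame0 = rng.random((240, 320))                    # stand-in for the first video frame
    points = cylinder_points()
    pose = np.zeros(6)
    template = sample(frame0, project(points, pose))   # texture map built from the first frame
    frame1 = np.roll(frame0, 2, axis=1)                # fake small in-plane motion
    pose = track_step(template, frame1, points, pose)
    print("estimated pose increment:", pose)

In the paper, the residual is evaluated against motion templates whose scale adapts to the size of the motion increment, and per-point confidence weights the contribution of each texture sample; the fixed finite-difference descent above is only a stand-in for that machinery.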


Title: 3D head tracking using motion adaptive texture-mapping

Contributors: Brown, L.M. (author)

Publication date: 2001-01-01

Size: 627121 bytes

Type of media: Conference paper

Type of material: Electronic Resource

Language: English



    3D Head Tracking Using Motion Adaptive Texture-Mapping

    Brown, L. M. / IEEE | British Library Conference Proceedings | 2001


    Fast texture-based tracking and delineation using texture entropy

    Shahrokni, A. / Drummond, T. / Fua, P. | IEEE | 2005


    Fast Texture-Based Tracking and Delineation Using Texture Entropy

    Shahrokni, A. / Drummond, T. / Fua, P. et al. | British Library Conference Proceedings | 2005


    Colour, texture, and motion in level set based segmentation and tracking

    Brox, T. / Rousson, M. / Deriche, R. et al. | British Library Online Contents | 2010


    Multi-modal tracking using texture changes

    Kemp, C. / Drummond, T. | British Library Online Contents | 2008