In this thesis we present a system that extracts eye motion from a video stream containing a human face and applies it to a virtual character. By eye motion estimation we mean the information describing the location of the eyes in each frame of the video stream. By applying this estimate to a virtual character, we make the virtual face move its eyes in the same way as the human face, synthesizing eye motion on the virtual character. In this study, a system capable of face tracking, eye detection and extraction, and finally iris position extraction from a video stream containing a human face has been developed. Once an image containing a human face has been extracted from the current frame of the video stream, eye detection and extraction, which is based on edge detection, is applied. The iris center is then determined by applying image preprocessing and region segmentation using edge features to the extracted eye image. Once the eye motion has been extracted, it is translated into MPEG-4 Facial Animation Parameters (FAPs). This improves the quality and range of the facial animation expressions that can be synthesized on a virtual character.
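As a rough illustration of this pipeline, the sketch below estimates per-frame iris centers with Python and OpenCV. It is a minimal sketch, not the thesis's implementation: the stock Haar cascades stand in for the model-based face and eye detectors, the Hough circle transform (which applies Canny edge detection internally) stands in for the edge-based iris segmentation, and every parameter value is an assumption.

    import cv2

    # Illustrative sketch only: Haar cascades and a Hough circle transform
    # stand in for the thesis's model-based face tracking and edge-based eye
    # detection; parameter values below are assumptions, not thesis settings.

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def iris_centers(frame):
        """Estimate (x, y) iris centers for the eyes found in one frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        centers = []
        # Step 1: locate the face region in the current frame.
        for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
            face = gray[fy:fy + fh, fx:fx + fw]
            # Step 2: detect and extract the eye regions inside the face.
            for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face):
                eye = face[ey:ey + eh, ex:ex + ew]
                # Step 3: preprocess, then segment the iris as a circular
                # edge region (HoughCircles runs Canny internally).
                eye = cv2.GaussianBlur(eye, (5, 5), 0)
                circles = cv2.HoughCircles(
                    eye, cv2.HOUGH_GRADIENT, dp=1, minDist=ew,
                    param1=100, param2=15,
                    minRadius=ew // 8, maxRadius=ew // 3)
                if circles is not None:
                    cx, cy, _ = circles[0][0]
                    # Map back to frame coordinates.
                    centers.append((fx + ex + cx, fy + ey + cy))
        return centers

Per-frame iris positions such as these, tracked across the video stream, are the kind of measurements the abstract describes translating into the MPEG-4 Facial Animation Parameters that drive the virtual character's eyes.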


    Title: Model-Based Eye Detection and Animation
    Contributors:
    Publication date: 2006-01-01
    Type of media: Theses
    Type of material: Electronic Resource
    Language: English
    Classification: DDC 629




    Starling Animation

    Howard N Cannon / Scott T Miller / Julianna L Fishman | NTRS | 2022


    Animation arrangement

    IURASCU DANUT-PETRU | European Patent Office | 2020


    Animation arrangement

    IURASCU DANUT PETRU | European Patent Office | 2018


    Parking animation display method and parking animation display system

    CHEN SENLIN / ZHOU XIANG / YAN DAXING et al. | European Patent Office | 2023
