In this thesis we present a system that extracts eye motion from a video stream containing a human face and applies that motion to a virtual character. By eye motion estimation we mean the information that describes the location of the eyes in each frame of the video stream. Applying this estimate to a virtual character makes the virtual face move its eyes in the same way as the human face, synthesizing eye motion on the virtual character. The system developed in this study performs face tracking, eye detection and extraction, and finally iris position extraction from a video stream containing a human face. Once an image of the face is extracted from the current frame, eye detection and extraction, based on edge detection, is applied. The iris center is then determined by applying image preprocessing and region segmentation using edge features on the extracted eye image. Once the eye motion has been extracted, it is translated into MPEG-4 Facial Animation Parameters (FAPs). This improves both the quality and the range of facial expressions that can be synthesized on a virtual character.
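As a rough illustration of the pipeline the abstract describes, the following is a minimal sketch of iris-center localization in an already-extracted eye region, assuming OpenCV. The Hough circle transform here merely stands in for the thesis's edge-based region segmentation (it runs Canny edge detection internally), and the mapping to the MPEG-4 eyeball-yaw FAP assumes the standard angle unit (AU = 1e-5 rad) and a guessed eyeball radius; all function names and parameter values are illustrative, not taken from the thesis.

```python
import math

import cv2
import numpy as np


def iris_center(eye_gray: np.ndarray):
    """Locate the iris center in a grayscale eye-region image.

    Stand-in for the thesis's edge-based segmentation: smooth the
    image, then find the strongest circular edge with a Hough
    circle transform (Canny edge detection runs internally).
    """
    blurred = cv2.GaussianBlur(eye_gray, (5, 5), 0)
    circles = cv2.HoughCircles(
        blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
        param1=100,   # upper Canny threshold used internally
        param2=15,    # accumulator threshold; lower = more candidates
        minRadius=5, maxRadius=30)
    if circles is None:
        return None
    x, y, _radius = circles[0][0]
    return float(x), float(y)


def yaw_fap(dx_pixels: float, eyeball_radius_px: float = 12.0) -> int:
    """Map horizontal iris displacement from the eye-region center to
    an MPEG-4 eyeball-yaw FAP value, assuming AU = 1e-5 rad and a
    hypothetical eyeball radius in pixels (both are assumptions).
    """
    angle_rad = math.atan2(dx_pixels, eyeball_radius_px)
    return int(round(angle_rad / 1e-5))
```

In a pipeline of this kind, such a step would run on each frame after face tracking and eye extraction, and the resulting FAP stream would drive the virtual character's eyes.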


    Title:

    Model-Based Eye Detection and Animation


    Contributors:

    Publication date:

    2006-01-01


    Media type:

    Thesis


    Format:

    Electronic resource


    Language:

    English



    Classification:

    DDC: 629




    Similar titles:

    Starling Animation

    Howard N Cannon / Scott T Miller / Julianna L Fishman | NTRS | 2022


    ANIMATION ARRANGEMENT

    IURASCU DANUT-PETRU | European Patent Office | 2020

    Free access

    Animation arrangement

    IURASCU DANUT PETRU | European Patent Office | 2018

    Free access

    Parking animation display method and parking animation display system

    CHEN SENLIN / ZHOU XIANG / YAN DAXING et al. | European Patent Office | 2023

    Free access