Model-Based Eye Detection and Animation
Independent thesis, Advanced level (degree of Magister), 20 points / 30 hp. Student thesis.
In this thesis we present a system that extracts the eye motion from a video stream containing a human face and applies this motion to a virtual character. By eye motion estimation we mean the information describing the location of the eyes in each frame of the video stream. By applying this estimate to a virtual character, the virtual face moves its eyes in the same way as the human face, synthesizing eye motion on the virtual character. In this study, a system has been developed that performs face tracking, eye detection and extraction, and finally iris position extraction from a video stream containing a human face. Once an image containing a human face is extracted from the current frame of the video stream, detection and extraction of the eyes is applied; this step is based on edge detection. The iris center is then determined by applying image preprocessing and region segmentation using edge features on the extracted eye image.
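The iris-localization step described above can be sketched as follows. This is a minimal illustration, not the thesis implementation: the Sobel kernels stand in for the edge-detection stage, the dark-pixel centroid stands in for the region segmentation that locates the iris center, and the threshold and synthetic image are assumptions.

```python
import numpy as np

def sobel_edges(img):
    """Gradient magnitude via 3x3 Sobel kernels (the edge-feature stage)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    gx = np.zeros(img.shape)
    gy = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = pad[i:i + 3, j:j + 3]
            gx[i, j] = (win * kx).sum()
            gy[i, j] = (win * ky).sum()
    return np.hypot(gx, gy)

def iris_center(eye, dark_thresh=0.3):
    """Centroid of the dark (iris) region inside an extracted eye image."""
    ys, xs = np.nonzero(eye < dark_thresh)
    return float(xs.mean()), float(ys.mean())

# Synthetic grey-level eye image: bright sclera with a dark iris disk at (35, 20).
h, w = 40, 60
yy, xx = np.mgrid[0:h, 0:w]
eye = np.ones((h, w))
eye[(xx - 35) ** 2 + (yy - 20) ** 2 <= 6 ** 2] = 0.1

cx, cy = iris_center(eye)   # centroid lands on the iris center
edges = sobel_edges(eye)    # strong response along the iris boundary
```

A real pipeline would run this per frame on the eye region produced by the face tracker; here the frame is simulated by the synthetic image.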
Once the eye motion has been extracted, it is translated, using MPEG-4 Facial Animation, into Facial Animation Parameters (FAPs). This improves the quality and range of facial expressions that can be synthesized on a virtual character.
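Such a translation step can be sketched as below. The sketch assumes the MPEG-4 eyeball FAPs (e.g. FAP 23, yaw_l_eyeball, and FAP 25, pitch_l_eyeball) expressed in angle units of 1e-5 rad; the pixel scale and eyeball radius are illustrative assumptions, not values from the thesis.

```python
import math

ANGLE_UNIT = 1.0e-5  # MPEG-4 FAP angle unit in radians (assumed here)

def eye_fap_values(dx_px, dy_px, mm_per_px=0.25, eyeball_radius_mm=12.0):
    """Map an iris-center offset in pixels (relative to the resting gaze
    position) to MPEG-4 FAP values for the left eyeball.

    mm_per_px and eyeball_radius_mm are hypothetical calibration values.
    """
    # Convert the on-image offset to millimetres of travel on the eyeball.
    dx_mm = dx_px * mm_per_px
    dy_mm = dy_px * mm_per_px
    # Displacement over radius gives the sine of the rotation angle.
    yaw = math.asin(max(-1.0, min(1.0, dx_mm / eyeball_radius_mm)))
    pitch = math.asin(max(-1.0, min(1.0, dy_mm / eyeball_radius_mm)))
    # Express both angles in FAP angle units, keyed by FAP number.
    return {23: round(yaw / ANGLE_UNIT), 25: round(pitch / ANGLE_UNIT)}
```

For example, a resting iris yields zero FAP values, while a rightward offset of a few pixels yields a positive yaw; feeding these values to an MPEG-4 compliant face player would rotate the virtual eyeball accordingly.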
Place, publisher, year, edition, pages
Institutionen för systemteknik, 2006. 85 p.
Keywords
Eye motion estimation, MPEG-4 Facial Animation, Feature Points, Facial Animation Parameters, Feature extraction, face tracking, eye tracking
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:liu:diva-7059
ISRN: LiTH-ISY-EX--06/3909--SE
OAI: oai:DiVA.org:liu-7059
DiVA: diva2:22124
2006-06-21, Algorithm, B, 10:00