Facial skin motion properties from video: Modeling and applications

2009


Abstract (summary)

Deformable modeling of facial soft tissues has found use in application domains such as human-machine interaction for facial expression recognition. More recently, such modeling techniques have been used for tasks like age estimation and person identification.

This dissertation focuses on the development of novel image analysis algorithms to follow facial strain patterns observed in video recordings of faces undergoing expressions. Specifically, we use the strain pattern extracted from non-rigid facial motion as a simple yet adequate characterization of the underlying material properties of facial soft tissues. This approach has several distinctive features. The strain pattern, rather than image intensity, is used as the classification feature. Strain is tied to the biomechanical properties of facial tissues, which are distinct for each individual. The strain pattern is less sensitive to illumination differences (between enrolled and query sequences) and to face camouflage, because it remains stable as long as reliable facial deformations are captured. A finite element modeling based method enforces regularization, which mitigates issues related to automatic motion estimation, such as temporal matching and noise sensitivity; the resulting computational strategy is therefore accurate and robust. Images or videos of facial deformations are acquired with an ordinary video camera, without special imaging equipment.
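To make the idea concrete, here is a minimal sketch of how a scalar strain map can be derived from a dense displacement field (e.g. from optical flow between two video frames). This uses a small-strain approximation with finite differences; it is an illustrative simplification, not the regularized FEM pipeline described in the dissertation, and the function name and magnitude measure are our own choices.

```python
import numpy as np

def strain_magnitude(u, v):
    """Per-pixel small-strain magnitude from a dense 2-D displacement field.

    u, v -- arrays holding the x- and y-displacement of each pixel,
    e.g. estimated by optical flow between two frames of an expression.
    """
    # np.gradient returns derivatives along axis 0 (rows, y) then axis 1 (cols, x)
    du_dy, du_dx = np.gradient(u)
    dv_dy, dv_dx = np.gradient(v)
    # Components of the (symmetric) small-strain tensor
    exx = du_dx
    eyy = dv_dy
    exy = 0.5 * (du_dy + dv_dx)
    # Frobenius-style magnitude collapses the tensor to one scalar per pixel
    return np.sqrt(exx**2 + eyy**2 + 2.0 * exy**2)
```

A rigid translation (constant u, v) yields zero strain everywhere, which is the property that makes strain, unlike raw motion, a descriptor of deformation rather than of head movement.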

Experiments using range images on a dataset of 50 subjects provide the necessary proof of concept that strain maps indeed have discriminative value. On a video dataset of 60 subjects undergoing a particular facial expression, experimental results using the computational strategy presented in this work demonstrate the discriminability and stability of strain maps under adverse data conditions (shadow lighting and face camouflage). These properties make the strain map a promising feature for image analysis tasks that can benefit from such auxiliary information about the human face. Strain maps add a new dimension to our ability to characterize a human face. They also foster new ways to capture facial dynamics from video which, if exploited efficiently, can lead to improved performance in tasks involving the human face.

In a subsequent effort, we model the material constants (Young's modulus) of the skin in sub-regions of the face from the motion observed across multiple facial expressions. On a public database of 40 subjects performing a set of facial motions, we present an expression-invariant strategy for matching faces using the Young's modulus of the skin. Such an efficient way of describing underlying material properties from the displacements observed in video has an important application in deformable modeling of physical objects, which is usually judged by its simplicity and adequacy.
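The intuition behind region-wise modulus estimation can be sketched with a toy proxy: under a common (unknown) loading from an expression, softer tissue strains more, so the inverse of mean strain in a sub-region, averaged over several expressions, orders regions by relative stiffness. The function below is a hypothetical illustration of that ordering only; the dissertation's actual method solves an inverse finite element problem, which this does not attempt.

```python
import numpy as np

def relative_stiffness(strain_maps, region_masks):
    """Toy relative-stiffness proxy per facial sub-region.

    strain_maps  -- list of 2-D strain maps, one per observed expression
    region_masks -- dict mapping region name to a boolean pixel mask

    Regions that strain less under the same expressions are stiffer,
    so inverse mean strain serves as a relative Young's-modulus proxy.
    """
    stiffness = {}
    for name, mask in region_masks.items():
        # Average the region's strain over all observed expressions
        mean_strain = np.mean([m[mask].mean() for m in strain_maps])
        stiffness[name] = 1.0 / (mean_strain + 1e-8)  # guard against zero
    return stiffness
```

Because only ratios between regions are meaningful here, the output is unitless; absolute Young's moduli would require knowing the applied stress, which video alone does not provide.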

The contributions of this work will have an impact on the broader vision community because of their novel approaches to the long-standing problem of motion analysis of elastic objects. Additional value lies in the cross-disciplinary nature of the work and its focus on applying image analysis algorithms to the difficult and important problem of characterizing the material properties of facial soft tissues and their applications. We believe this research provides a special opportunity to use video processing to enhance our ability to make unique discoveries through the facial dynamics inherent in video.

Indexing (details)

Subject: Computer science
Classification: 0984: Computer science
Identifier / keyword: Applied sciences; Deformable modeling; Facial skin; Person identification; Strain patterns
Title: Facial skin motion properties from video: Modeling and applications
Author: Manohar, Vasant
Number of pages:
Publication year: 2009
Degree date:
School code:
Source: DAI-B 71/09, Dissertation Abstracts International
Place of publication: Ann Arbor
Country of publication: United States
Advisor: Goldgof, Dmitry B.; Sarkar, Sudeep
University/institution: University of South Florida
University location: United States -- Florida
Source type: Dissertations & Theses
Document type:
Dissertation/thesis number:
ProQuest document ID: