Disney Human Face Project – capture and transfer of facial dynamics

Monday, October 5, 2009 - 11:19am


Friday October 9, 2009
Computer Science Conference Room, Harold Frank Hall Rm. 1132

HOST: Matthew Turk

SPEAKER: Lance J. Williams, Nokia Research Center Hollywood

Title: Disney Human Face Project – capture and transfer of facial dynamics


"The Human Face Project" is a short film documenting an effort at Walt
Disney Feature Animation to track and animate human facial performance,
which was shown in the SIGGRAPH 2002 Electronic Theater. This
presentation will outline the techniques developed in this project, and
demonstrated in that film. The face tracking system we developed is an
example of model-based computer vision, and exploits the detailed
degrees of freedom of a geometric face model to confine the space of
solutions. Optical flow and/or successive rerendering of the model are
employed in an optimization loop to converge on model parameter
estimates. The structure of the model permits a principled mapping of
estimated expressions to different targets. Of critical importance for
media applications is the handling of details beyond the resolution or
degrees of freedom of the tracking model. We describe behavioral
modeling expedients for realizing these details in a plausible way.
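The analysis-by-synthesis loop sketched above (rerender the model, compare against the observed image, update the model parameters) can be illustrated with a toy example. The sketch below is purely hypothetical and greatly simplified: it stands in a blendshape-like linear "face model" for the detailed geometric model, and uses Gauss-Newton steps where the actual system combined optical flow and successive rerendering. All names, dimensions, and values are illustrative assumptions, not the project's implementation.

```python
import numpy as np

# Hypothetical stand-in for a parametric face model: a neutral image plus
# a linear combination of a few "expression" basis vectors (blendshape-like).
rng = np.random.default_rng(0)
basis = rng.standard_normal((3, 64))   # 3 expression degrees of freedom
neutral = rng.standard_normal(64)      # flattened "neutral face" image

def render(params):
    """Rerender the model image for a given parameter vector."""
    return neutral + params @ basis

def fit(observed, iters=20):
    """Converge on model parameters by successive rerendering:
    compute the residual against the observation and take a
    Gauss-Newton step using the renderer's Jacobian (here the
    model is linear, so the Jacobian is simply `basis`)."""
    params = np.zeros(3)
    J = basis.T                        # d(render)/d(params), shape (64, 3)
    for _ in range(iters):
        residual = observed - render(params)
        step, *_ = np.linalg.lstsq(J, residual, rcond=None)
        params = params + step
    return params

true_params = np.array([0.5, -1.2, 0.3])
estimated = fit(render(true_params))
print(np.allclose(estimated, true_params, atol=1e-6))  # prints True
```

Because the toy model is linear, the loop converges in a single step; a real renderer is nonlinear, which is why the actual system iterates, relinearizing (via optical flow or rerendering) around the current parameter estimate.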


Lance J. Williams is an Academy Award and Steven A. Coons Award winning
graphics researcher who made major contributions to texture map
prefiltering, shadowing algorithms, facial animation, and image-based
rendering. Prior to Nokia, Lance was a software engineer for Google
Earth, senior scientist at Applied Minds, Chief Scientist at Walt Disney
Feature Animation, senior software engineer at DreamWorks SKG, and
member of technical staff in Apple’s Advanced Technology Group, where he
contributed to QuickTime VR. He graduated from the University of Kansas
in 1972, and attended graduate school at the University of Utah. He was
awarded a Ph.D. in computer science from the University of Utah in 2000.