A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; you can also visit the original URL.
The file type is application/pdf.
Tracking facial motion
Proceedings of 1994 IEEE Workshop on Motion of Non-rigid and Articulated Objects
We describe a computer system that allows real-time tracking of facial expressions. Sparse, fast visual measurements using 2-D templates are used to observe the face of a subject. Rather than track features on the face, the distributed response of a set of templates is used to characterize a given facial region. These measurements are coupled via a linear interpolation method to states in a physically-based model of facial animation, which includes both skin and muscle dynamics. By integrating …
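The abstract's "distributed response of a set of templates" refers to correlating small 2-D image patches against a facial region and using the whole response map, not just a single feature point. A minimal sketch of that kind of 2-D template correlation is below; the function name, sizes, and data are illustrative assumptions, not the paper's actual system or parameters.

```python
import numpy as np

def normalized_correlation(image, template):
    """Slide a 2-D template over an image and return the full
    normalized-correlation response map (values in [-1, 1]).

    Hypothetical sketch of the template-based visual measurement
    the abstract describes; a real-time system would use a faster,
    vectorized or hardware-assisted implementation.
    """
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    out = np.zeros((ih - th + 1, iw - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            out[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return out

# Embed the template in a synthetic image and recover its location
# from the peak of the response map.
rng = np.random.default_rng(0)
template = rng.random((8, 8))
image = rng.random((32, 32))
image[10:18, 14:22] = template
response = normalized_correlation(image, template)
y, x = np.unravel_index(response.argmax(), response.shape)
print(y, x)  # → 10 14
```

In the paper's framing, the entire `response` map (not only its peak) characterizes the facial region, and such measurements drive the states of the physically-based skin-and-muscle model via linear interpolation.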
doi:10.1109/mnrao.1994.346257
fatcat:e4d6mootefctjnvfxmdaledrya