We describe a computer system that allows real-time tracking of facial expressions. Sparse, fast visual measurements using 2-D templates are used to observe the face of a subject. Rather than tracking individual features on the face, the system uses the distributed response of a set of templates to characterize a given facial region. These measurements are coupled via a linear interpolation method to states in a physically-based model of facial animation, which includes both skin and muscle dynamics. By integrating real-time 2-D image processing with 3-D models, we obtain a system that is able to quickly track and interpret complex facial motions.
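The sketch below illustrates, under stated assumptions, the coupling described above: a vector of normalized-correlation responses from several 2-D templates over a facial region is mapped by a linear matrix to the states of a face model (e.g., muscle activations). The template sizes, the number of states, the least-squares fit, and all names are hypothetical placeholders, not the system's actual implementation.

```python
"""Illustrative sketch: distributed 2-D template responses mapped linearly to
physical face-model states. All data here is synthetic and for illustration."""

import numpy as np


def template_responses(region: np.ndarray, templates: list[np.ndarray]) -> np.ndarray:
    """Normalized-correlation response of each 2-D template on the region.

    A real tracker would search a neighborhood of placements; this sketch
    scores a single aligned crop to stay short.
    """
    scores = []
    for t in templates:
        h, w = t.shape
        patch = region[:h, :w]                       # assumed aligned crop
        p = patch - patch.mean()
        q = t - t.mean()
        denom = np.linalg.norm(p) * np.linalg.norm(q) + 1e-8
        scores.append(float((p * q).sum() / denom))  # correlation in [-1, 1]
    return np.array(scores)


def fit_linear_map(R: np.ndarray, S: np.ndarray) -> np.ndarray:
    """Least-squares estimate of a matrix B with S ~= R @ B, standing in for
    the linear interpolation that couples responses to model states."""
    B, *_ = np.linalg.lstsq(R, S, rcond=None)
    return B


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Synthetic stand-ins: four 16x16 templates and one 32x32 facial region.
    templates = [rng.standard_normal((16, 16)) for _ in range(4)]
    region = rng.standard_normal((32, 32))

    # Hypothetical calibration data: response vectors paired with known model
    # states (e.g., muscle activations) from training frames.
    R_train = rng.standard_normal((50, 4))
    S_train = R_train @ rng.standard_normal((4, 6))   # 6 assumed model states

    B = fit_linear_map(R_train, S_train)

    # Runtime step: measure responses, predict the model state for this frame.
    r = template_responses(region, templates)
    state = r @ B
    print("template responses:", np.round(r, 3))
    print("predicted model state:", np.round(state, 3))
```

In this reading, the distributed responses (rather than tracked feature points) act as the measurement vector, and the linear map provides the interpolation into the dynamic model's state space each frame.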