Face modeling and editing with statistical local feature control models

Yu Zhang, Norman I. Badler
2007, International Journal of Imaging Systems and Technology
This article presents a novel method based on statistical facial feature control models for generating realistic, controllable face models. The local feature control models are constructed from exemplar 3D face scans. We use a three-step model-fitting approach for the 3D registration problem. Once we have a common surface representation for the examples, we form feature shape spaces by applying principal component analysis (PCA) to the data sets of facial feature shapes. We compute a set of anthropometric measurements to parameterize the exemplar shapes of each facial feature in a measurement space. Using PCA coefficients as a compact shape representation, we approach the shape synthesis problem by forming scattered data interpolation functions that generate the desired shape by taking the anthropometric parameters as input. The correspondence among all exemplar face textures is obtained by parameterizing a 3D generic mesh over a 2D image domain. The new feature texture with desired attributes is synthesized by interpolating the exemplar textures.
With the exception of an initial tuning of feature point positions and assignment of texture attribute values, our method is fully automated. In the resulting system, users are assisted in automatically generating or editing a face model by controlling high-level parameters.

The creative approach gives the designer control over the result, including the ability to produce cartoon effects, and offers high efficiency of geometric manipulation. However, it requires a great deal of expertise to avoid unrealistic results. The reconstructive approach extracts face geometry from measurements of a living subject. The most accurate method is to make a plaster model of a face and scan that model with a precise laser range system. However, not everybody can afford the considerable time and expense this process requires, and the molding compound may cause facial features to sag. With a significant increase in the quality and availability of 3D capture methods, a common approach to creating face models of real humans uses laser range scanners (e.g., CyberWare) to acquire both the face geometry and texture simultaneously [41, 42, 53]. Although the acquired face data is highly accurate, the scanned model corresponds to a single individual and tells us little about the spaces of face shapes and textures; each new face must be found on a subject. The user does not have any control over the model to edit it in a way that produces a plausible, novel face. In this paper, a new method is presented that bridges the creative and reconstructive approaches. Our method estimates explicit high-level control models of human faces by examining geometry and color variations among captured face data. Since internal facial features (e.g., eyes, nose, mouth, and chin) are good for discriminating faces [87], we construct a face synthesis system that perceives the face as a set of feature regions.
The feature-based synthesis allows us to generate more diverse faces through various combinations of synthesized features. The learned control models define a mapping from control parameters back to facial feature shape or texture, so we can use them to automatically synthesize varied geometric models of human faces. As a result, our system provides an intuitive interface for users via controllable attributes of facial features. Because the control models are estimated from statistics extracted from the example models, the output of the system maintains the quality present in the real faces of individuals.

Our method takes as examples 3D face scans from a large database. We use a three-step model-fitting approach for the 3D registration problem, in which the fitting of a generic mesh to each example is carried out in a global-to-local fashion. By bringing the scanned models into full correspondence with each other, we are able to apply a principal component analysis (PCA) to the exemplar shapes of each facial feature to build a low-dimensional shape space. We characterize and explore the space of probable feature shapes using high-level control parameters. We parameterize the example models using anthropometric measurements, and predefine interpolation functions for the parameterized example models based on radial basis functions. At runtime, the interpolation functions are evaluated to efficiently generate the appropriate feature shapes by taking the anthropometric parameters as input. We automatically determine correspondence among exemplar face textures by constructing a parameterization of the 3D generic mesh over a 2D image domain. Having the texture interpolation
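The PCA shape-space step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the data layout (corresponded vertices flattened per example) and the variance-retention threshold are assumptions.

```python
import numpy as np

def build_feature_shape_space(shapes, var_kept=0.95):
    """Build a PCA shape space from exemplar feature shapes.

    shapes: (n_examples, 3*n_vertices) array of corresponded vertex
    coordinates for one facial feature (assumed layout).
    Returns the mean shape, the principal components, and the
    per-example PCA coefficients (the compact shape representation).
    """
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # SVD of the centered data matrix yields the principal components.
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    # Keep enough components to explain `var_kept` of the total variance.
    var = S**2 / np.sum(S**2)
    k = int(np.searchsorted(np.cumsum(var), var_kept)) + 1
    components = Vt[:k]                  # (k, 3*n_vertices)
    coeffs = centered @ components.T     # (n_examples, k)
    return mean, components, coeffs

def reconstruct(mean, components, coeff):
    """Map a PCA coefficient vector back to a feature shape."""
    return mean + coeff @ components
```

Projecting an exemplar into the space and reconstructing it recovers the shape up to the discarded variance, which is what makes the coefficients usable as a compact representation for synthesis.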
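The scattered data interpolation step, which maps anthropometric parameters to PCA coefficients, can likewise be sketched with radial basis functions. The Gaussian basis, the bandwidth, and the absence of regularization here are assumptions for illustration; the paper's exact basis functions are not specified in this excerpt.

```python
import numpy as np

def fit_rbf(params, coeffs, sigma=1.0):
    """Fit a radial basis interpolant from anthropometric parameters
    to PCA shape coefficients.

    params: (n, d) anthropometric measurements of the examples
    coeffs: (n, k) PCA coefficients of the corresponding shapes
    Returns weights for evaluating the interpolant at new parameters.
    """
    # Pairwise squared distances between example parameter vectors.
    d2 = np.sum((params[:, None, :] - params[None, :, :])**2, axis=-1)
    phi = np.exp(-d2 / (2 * sigma**2))      # Gaussian basis matrix
    weights = np.linalg.solve(phi, coeffs)  # exact interpolation
    return weights

def eval_rbf(new_params, weights, centers, sigma=1.0):
    """Evaluate the interpolant at new anthropometric parameters."""
    d2 = np.sum((new_params[:, None, :] - centers[None, :, :])**2, axis=-1)
    phi = np.exp(-d2 / (2 * sigma**2))
    return phi @ weights
```

By construction the interpolant passes through every example exactly, so evaluating it at an exemplar's measurements returns that exemplar's coefficients; evaluating it at novel measurements blends the examples smoothly, which is the runtime synthesis step described above.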
doi:10.1002/ima.20127