Facial image processing methods and systems

Data processing: structural design, modeling, simulation, and emulation – Modeling by mathematical expression

Reexamination Certificate


Details

U.S. Classification: C382S118000

Type: Reexamination Certificate

Status: active

Patent Number: 07124066

ABSTRACT:
Methods and systems for processing facial image data for use in animation are described. In one embodiment, a system illuminates a face with illumination sufficient to enable the simultaneous capture of both structure data (e.g., a range or depth map) and reflectance properties (e.g., the diffuse reflectance of the subject's face). The captured information can then be used for various facial animation operations, including expression recognition and expression transformation.
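For context only, the following Python sketch is not drawn from the patent itself; it illustrates the kind of downstream processing the abstract alludes to, namely estimating surface normals from a captured range (depth) map and shading a diffuse-reflectance (albedo) image with a simple Lambertian model. All function names, parameters, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def normals_from_depth(depth, fx=1.0, fy=1.0):
    """Estimate per-pixel surface normals from a range/depth map via
    finite differences. fx/fy are illustrative scale factors, not
    parameters taken from the patent."""
    dz_dy, dz_dx = np.gradient(depth)
    # Unnormalized normal ~ (-dz/dx, -dz/dy, 1); normalize per pixel.
    n = np.dstack((-dz_dx * fx, -dz_dy * fy, np.ones_like(depth)))
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    return n

def lambertian_relight(albedo, normals, light_dir):
    """Shade the diffuse albedo with a single directional light
    (Lambertian model) to preview the captured face data."""
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    shading = np.clip(normals @ l, 0.0, 1.0)   # H x W cosine term
    return albedo * shading[..., None]         # broadcast over RGB

# Toy example with synthetic arrays standing in for the simultaneously
# captured range map and diffuse reflectance described in the abstract.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
depth = np.hypot(xx - w / 2, yy - h / 2) / w          # fake range map
albedo = np.ones((h, w, 3)) * [0.8, 0.6, 0.5]         # fake skin albedo
relit = lambertian_relight(albedo, normals_from_depth(depth), [0.3, 0.3, 1.0])
print(relit.shape)  # (64, 64, 3)
```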

REFERENCES:
patent: 5631976 (1997-05-01), Bolle et al.
patent: 5793879 (1998-08-01), Benn et al.
patent: 6173068 (2001-01-01), Prokoski
patent: 6341878 (2002-01-01), Chiang
patent: 6400835 (2002-06-01), Lemelson et al.
patent: 6593925 (2003-07-01), Hakura et al.
patent: 6850872 (2005-02-01), Marschner et al.
Keith Waters, “A Muscle Model For Animating Three-Dimensional Facial Expression”, In Maureen C. Stone, editor, Computer Graphics (SIGGRAPH Proceedings 1987), vol. 21, pp. 17-24, Jul. 1987.
Debevec et al., “Acquiring the Reflectance Field of a Human Face,” ACM Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques, Jul. 2000, pp. 145-156.
Backman et al., “Polarized Light Scattering Spectroscopy for Quantitative Measurement of Epithelial Cellular Structures in Situ,” IEEE Journal on Selected Topics in Quantum Electronics, vol. 5 No. 4, Jul. 1999, pp. 1019-1026.
Yuille et al., “Shape and Albedo from Multiple Images Using Integrability,” 1997 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Jun. 1997, pp. 158-164.
Aaron Lee et al., “Displaced Subdivision Surfaces”, Submitted for Publication, Date unknown.
Volker Blanz et al., “A Morphable Model for the Synthesis of 3D Faces”, Proceedings of SIGGRAPH 1999, pp. 187-194, 1999.
Brian Guenter et al., “Making Faces”, Proceedings of SIGGRAPH 1998, pp. 55-67, 1998.
Pat Hanrahan et al., “Reflection For Layered Surfaces Due to Subsurface Scattering”, Proceedings of SIGGRAPH 1993, pp. 165-174, 1993.
Hugues Hoppe et al., “Piecewise Smooth Surface Reconstruction”, In Computer Graphics (SIGGRAPH 1994 Proceedings), pp. 295-302, Jul. 1994.
Yizhou Yu, “Inverse Global Illumination: Recovering Reflectance Models of Real Scenes From Photographs”, In Computer Graphics (SIGGRAPH 1999 Proceedings), pp. 215-224, Aug. 1999.
Eric P. F. Lafortune et al., “Non-linear Approximation of Reflection Functions”, In Computer Graphics (SIGGRAPH 1997 Proceedings), pp. 117-126, Aug. 1997.
Yuencheng Lee et al., “Realistic Modeling For Facial Animation”, Computer Graphics, 29(2), pp. 55-62, Jul. 1995.
Charles Loop, “Smooth Subdivision Surfaces Based on Triangles”, PhD Thesis, University of Utah, Aug. 1987.
Stephen R. Marschner et al., “Image-based BRDF Measurement Including Human Skin”, In Rendering Techniques 1999, (Proceedings of the Eurographic Workshop on Rendering), pp. 131-144, Jun. 1999.
Frederic Pighin et al., “Synthesizing Realistic Facial Expressions From Photographs”, In Computer Graphics (SIGGRAPH 1998 Proceedings), pp. 78-84, Jul. 1998.
