Method for tracking motion of a face

Image analysis – Applications – Target tracking or detecting



Details

Subclasses: C382S285000, C348S169000
Type: Reexamination Certificate
Status: active
Patent number: 07127081

ABSTRACT:
A method for tracking the motion of a person's face for the purpose of animating a 3-D face model of the same or another person is disclosed. The 3-D face model carries both the geometry (shape) and the texture (color) characteristics of the person's face. The shape of the face model is represented via a 3-D triangular mesh (geometry mesh), while the texture of the face model is represented via a 2-D composite image (texture image). Both the global motion and the local motion of the person's face are tracked. Global motion of the face involves the rotation and the translation of the face in 3-D. Local motion of the face involves the 3-D motion of the lips, eyebrows, etc., caused by speech and facial expressions. The 2-D positions of salient features of the person's face and/or markers placed on the person's face are automatically tracked in a time-sequence of 2-D images of the face. Global and local motion of the face are separately calculated using the tracked 2-D positions of the salient features or markers. Global motion is represented in a 2-D image by rotation and position vectors while local motion is represented by an action vector that specifies the amount of facial actions such as smiling-mouth, raised-eyebrows, etc.
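The decomposition described above, a global rigid pose (rotation and translation vectors) plus a local action vector fitted to whatever the rigid pose cannot explain, can be illustrated with a short least-squares sketch. The Python below is illustrative only, not the patented method: it assumes an orthographic camera (in the spirit of the Tomasi et al. factorization paper cited under REFERENCES), and every name in it (rodrigues, project, fit_action_vector, the neutral landmark set, the per-action displacement basis) is hypothetical.

```python
# Illustrative sketch only (not the patented method): fit an action
# vector to the 2-D residual left after removing the global rigid pose.
import numpy as np

def rodrigues(rvec):
    """Rotation matrix from an axis-angle rotation vector."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project(points_3d, rvec, tvec):
    """Orthographic projection of 3-D model points under the global pose
    (assumed camera model; the patent itself need not use orthography)."""
    return (points_3d @ rodrigues(rvec).T + tvec)[:, :2]

def fit_action_vector(tracked_2d, neutral_3d, action_basis, rvec, tvec):
    """Least-squares action vector (smiling-mouth, raised-eyebrows, ...)
    explaining the 2-D residual that global (rigid) motion leaves behind.
    action_basis: (k, n, 3) array of per-action 3-D displacement fields."""
    R = rodrigues(rvec)
    residual = tracked_2d - project(neutral_3d, rvec, tvec)    # (n, 2)
    # Rotate each action's displacement field and project it to 2-D.
    A = np.stack([(d @ R.T)[:, :2] for d in action_basis])     # (k, n, 2)
    coeffs, *_ = np.linalg.lstsq(A.reshape(len(action_basis), -1).T,
                                 residual.ravel(), rcond=None)
    return coeffs

# Toy check: 4 landmarks, one hypothetical "raised-eyebrows" action.
neutral = np.array([[0., 1., 0.], [1., 1., 0.], [0., 0., 0.], [1., 0., 0.]])
basis = np.array([[[0., .1, 0.], [0., .1, 0.], [0., 0., 0.], [0., 0., 0.]]])
rvec, tvec = np.zeros(3), np.zeros(3)
observed = project(neutral + 0.5 * basis[0], rvec, tvec)
print(fit_action_vector(observed, neutral, basis, rvec, tvec))  # ~[0.5]
```

In an actual tracker the global pose would itself be estimated first from the tracked 2-D features, so that only the residual motion is attributed to facial actions, consistent with the abstract's statement that global and local motion are calculated separately.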

REFERENCES:
patent: 4975960 (1990-12-01), Petajan
patent: 5280530 (1994-01-01), Trew et al.
patent: 5744953 (1998-04-01), Hansen
patent: 5774591 (1998-06-01), Black et al.
patent: 5802220 (1998-09-01), Black et al.
patent: 5805745 (1998-09-01), Graf
patent: 5807284 (1998-09-01), Foxlin
patent: 5907626 (1999-05-01), Toklu et al.
patent: 5923337 (1999-07-01), Yamamoto
patent: 5982909 (1999-11-01), Erdem et al.
patent: 6009210 (1999-12-01), Kang
patent: 6016148 (2000-01-01), Kang et al.
patent: 6020892 (2000-02-01), Dillon
patent: 6028960 (2000-02-01), Graf et al.
patent: 6031539 (2000-02-01), Kang et al.
patent: 6037949 (2000-03-01), DeRose et al.
patent: 6047078 (2000-04-01), Kang
patent: 6052132 (2000-04-01), Christian et al.
patent: 6064390 (2000-05-01), Sagar et al.
patent: 6175756 (2001-01-01), Ferre et al.
patent: 6204860 (2001-03-01), Singh
patent: 6272231 (2001-08-01), Maurer et al.
patent: 0 926 628 (1999-06-01), None
patent: 8293026 (1996-11-01), None
patent: WO 98 01830 (1998-01-01), None
patent: WO 9906962 (1999-02-01), None
patent: PCT/IB01/02735 (2002-06-01), None
patent: PCT/IB01/02736 (2002-07-01), None
patent: PCT/IB01/02363 (2002-08-01), None
Tomasi, Carlo, et al., “Shape and Motion From Image Streams Under Orthography: A Factorization Method”, International Journal of Computer Vision, vol. 9, No. 2, pp. 137-154, (1992).
Terzopoulos, et al., “Analysis and Synthesis of Facial Image Sequences Using Physical and Anatomical Models”, IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI), vol. 15, No. 6, pp. 569-579, (Jun. 1993).
Essa, Irfan A., et al., “A Vision System for Observing and Extracting Facial Action Parameters”, Proceedings of CVPR, Seattle, Washington, pp. 36-42 & 76-83, (Jun. 1994).
Smolic, Aljoscha, et al., “Real-Time Estimation of Long-Term 3-D Motion Parameters for SNHC Face Animation and Model-Based Coding Applications”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 9, No. 2, pp. 255-263, (Mar. 1999).
Yokoya, Naokazu, et al., “Motion Tracking of Deformable Objects by Active Contour Models Using Multiscale Dynamic Programming”, Journal of Visual Communication and Image Representation, vol. 4, pp. 382-391, (Dec. 1993).
Bascle, Benedicte, et al., “Tracking Complex Primitives in an Image Sequence”, IEEE International Conference on Pattern Recognition, Israel, pp. 426-431, (Oct. 1994).
Meyer, Francois G., et al., “Region-Based Tracking Using Affine Motion Models in Long Image Sequences”, CVGIP: Image Understanding, vol. 60, pp. 119-140, (Sep. 1994).
Antoszczyszyn, Paul M., et al., “Tracking of the Motion of Important Facial Features in Model-Based Coding”, Signal Processing, No. 66, pp. 249-260, (1998).
Petajan, Eric, Dr., “Very Low Bitrate Face Animation Coding in MPEG-4”, Lucent Technologies Technical Report, pp. 1-33.
Isard, Michael, et al., “Contour Tracking by Stochastic Propagation of Conditional Density”, Proceedings of European Conference on Computer Vision, Cambridge, UK, pp. 343-356, (1996).
Bascle, B., et al., “Separability of Pose and Expression in Facial Tracking and Animation”, Proceedings of 6th International Conference on Computer Vision, pp. 323-328, (1998).
Blake, Andrew, et al., “3D-Position, Attitude, and Shape Input Using Video Tracking of Hands and Lips”, Computer Graphics (SIGGRAPH 1994), pp. 185-192, (1994).
Toyama, Kentaro, “Prolegomena For Robust Face Tracking”, Microsoft Research Technical Report, MSR-TR-98-65, pp. 1-15, (Nov. 13, 1998).
Terzopoulos, D., et al., “Tracking with Kalman Snakes”, Active Vision, (Ed. A. Blake and A. Yuille), pp. 3-20, (1992).
Li-An Tang, et al., “Analysis-based facial expression synthesis”, Proceedings of the International Conference on Image Processing (ICIP), Austin, Nov. 13-16, 1994, Los Alamitos, IEEE Comp. Soc. Press, US, vol. 3, Conf. 1, Nov. 13, 1994, pp. 98-102, XP010146456, ISBN: 0-8186-6952-7, the whole document.
Cosi, P., et al., “Phonetic recognition by recurrent neural networks working on audio and visual information”, Speech Communication, Elsevier Science Publishers, Amsterdam, NL, vol. 19, No. 3, Sep. 1, 1996, pp. 245-252, XP004013654, ISSN: 0167-6393, p. 246, right-hand column, line 13 - p. 248, right-hand column, line 7.
Guenter, Brian; Grimm, Cindy; Wood, Daniel; Malvar, Henrique; Pighin, Fredrick, “Making Faces”, Computer Graphics Proceedings, SIGGRAPH 98: 25th International Conference on Computer Graphics and Interactive Techniques, Orlando, FL, USA, Jul. 19-24, 1998, pp. 55-66, XP002205956, New York, NY, USA, ACM, ISBN: 0-89791-999-8, p. 56, paragraph 2 - p. 59, paragraph 3.3.3.
Ebihara, K., et al., “Real-Time 3-D Facial Image Reconstruction for Virtual Space Teleconferencing”, Electronics & Communications in Japan, Part III: Fundamental Electronic Science, Scripta Technica, NY, US, vol. 82, No. 5, May 1999, pp. 80-90, XP000875659, ISSN: 1042-0967, p. 80, left-hand column, line 1 - p. 81, right-hand column, line 5.
Chen, T., “Technologies for building networked collaborative environments”, Proceedings of the 1999 International Conference on Image Processing (ICIP 99), Kobe, Japan, Oct. 24-28, 1999, Piscataway, NJ, USA, IEEE, Oct. 24, 1999, pp. 16-20, XP010368845, ISBN: 0-7803-5467-2, p. 18, paragraph 2.3 - p. 19, paragraph 2.4.
Lande, C., et al., “An MPEG-4 facial animation system driven by synthetic speech”, Multimedia Modeling, 1998 (MMM '98) Proceedings, Lausanne, Switzerland, Oct. 12-15, 1998, Los Alamitos, CA, USA, IEEE Comput. Soc., Oct. 12, 1998, pp. 203-212, XP010309519, ISBN: 0-8186-8911-0, p. 1, left-hand column, line 1 - p. 207, right-hand column, line 30.
Lavagetto, F., et al., “The Facial Animation Engine: Toward a High-Level Interface for the Design of MPEG-4 Compliant Animated Faces”, IEEE Transactions on Circuits and Systems for Video Technology, IEEE Inc., New York, US, vol. 9, No. 2, Mar. 1999, pp. 277-289, XP000805532, ISSN: 1051-8215, p. 278, left-hand column, line 11 - line 22; p. 279, left-hand column, line 23 - p. 282, left-hand column, line 39.
