3D face reconstruction from 2D images

Image analysis – Applications – Personnel identification

Reexamination Certificate

Details

U.S. Classification: C382S154000, C382S294000
Type: Reexamination Certificate
Status: active
Patent Number: 07856125

ABSTRACT:
A 3D face reconstruction technique using 2D images, such as photographs of a face, is described. Prior face knowledge or a generic face is used to extract sparse 3D information from the images and to identify image pairs. Bundle adjustment is carried out to determine more accurate 3D camera positions, image pairs are rectified, and dense 3D face information is extracted without using the prior face knowledge. Outliers are removed, e.g., by using tensor voting. A 3D surface is extracted from the dense 3D information and surface detail is extracted from the images.
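The abstract's outlier-removal step names tensor voting; the sketch below is only a simplified local-consensus filter in its spirit, not the patented algorithm. It keeps a 3D point when it lies close to the best-fit plane of its k nearest neighbours, which approximates the idea that points on a dense face surface should agree with their local neighbourhood. The function name, parameters, and synthetic data are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def filter_outliers(points, k=10, thresh=0.1):
    """Keep points close to the best-fit plane of their k nearest
    neighbours -- a crude local-consensus test standing in for the
    tensor-voting saliency criterion mentioned in the abstract."""
    keep = np.zeros(len(points), dtype=bool)
    for i, p in enumerate(points):
        dists = np.linalg.norm(points - p, axis=1)
        nbrs = points[np.argsort(dists)[1:k + 1]]  # skip the point itself
        centroid = nbrs.mean(axis=0)
        # The right singular vector with the smallest singular value is
        # the normal of the neighbourhood's best-fit plane.
        _, _, vt = np.linalg.svd(nbrs - centroid, full_matrices=False)
        normal = vt[-1]
        keep[i] = abs(np.dot(p - centroid, normal)) < thresh
    return points[keep]

# Synthetic check: 200 near-planar "surface" samples plus one gross
# outlier floating far off the sheet.
rng = np.random.default_rng(0)
surface = rng.uniform(-1.0, 1.0, (200, 3))
surface[:, 2] = 0.01 * rng.standard_normal(200)  # thin, noisy sheet
outlier = np.array([[0.0, 0.0, 2.0]])
cleaned = filter_outliers(np.vstack([surface, outlier]))
```

Because the outlier's nearest neighbours all lie on the sheet near z = 0, its distance to their fitted plane is about 2 and it is rejected, while the surface points (residuals on the order of the 0.01 noise) survive. A real tensor-voting implementation instead accumulates second-order tensor votes and thresholds on surface saliency.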

REFERENCES:
patent: 4710873 (1987-12-01), Breslow et al.
patent: 5327521 (1994-07-01), Savic et al.
patent: 5821943 (1998-10-01), Shashua
patent: 6141060 (2000-10-01), Honey et al.
patent: 6283858 (2001-09-01), Hayes, Jr. et al.
patent: 6313835 (2001-11-01), Gever et al.
patent: 6331861 (2001-12-01), Gever et al.
patent: 6350199 (2002-02-01), Williams et al.
patent: 6425825 (2002-07-01), Sitrick
patent: 6492990 (2002-12-01), Peleg et al.
patent: 6496598 (2002-12-01), Harman
patent: 6539354 (2003-03-01), Sutton et al.
patent: 6556196 (2003-04-01), Blanz et al.
patent: 6559845 (2003-05-01), Harvill et al.
patent: 6807290 (2004-10-01), Liu et al.
patent: 6816159 (2004-11-01), Solazzi
patent: 6894686 (2005-05-01), Stamper et al.
patent: 6919892 (2005-07-01), Cheiky et al.
patent: 6954498 (2005-10-01), Lipton
patent: 6999073 (2006-02-01), Zwern et al.
patent: 7003134 (2006-02-01), Covell et al.
patent: 7016824 (2006-03-01), Waupotitsch et al.
patent: 7027054 (2006-04-01), Cheiky et al.
patent: 7103211 (2006-09-01), Medioni et al.
patent: 7123263 (2006-10-01), Harvill
patent: 7137892 (2006-11-01), Sitrick
patent: 7184071 (2007-02-01), Chellappa et al.
patent: 7224357 (2007-05-01), Chen et al.
patent: 7285047 (2007-10-01), Gelb et al.
patent: 7355607 (2008-04-01), Harvill
patent: 7415152 (2008-08-01), Jiang et al.
patent: 7697787 (2010-04-01), Illsley
patent: 2002/0024516 (2002-02-01), Chen et al.
patent: 2003/0007700 (2003-01-01), Gutta et al.
patent: 2004/0051783 (2004-03-01), Chellappa et al.
patent: 2004/0070585 (2004-04-01), Papiernik et al.
patent: 2004/0208344 (2004-10-01), Liu et al.
patent: 2004/0223630 (2004-11-01), Waupotitsch et al.
patent: 2004/0223631 (2004-11-01), Waupotitsch et al.
patent: 2005/0111705 (2005-05-01), Waupotitsch et al.
patent: 2005/0162419 (2005-07-01), Kim et al.
patent: 2005/0226509 (2005-10-01), Maurer et al.
patent: 2005/0265583 (2005-12-01), Covell et al.
patent: 2006/0067573 (2006-03-01), Parr et al.
patent: 2006/0126924 (2006-06-01), Liu et al.
patent: 2008/0063263 (2008-03-01), Zhang et al.
patent: 2008/0152200 (2008-06-01), Medioni et al.
patent: 2008/0152213 (2008-06-01), Medioni et al.
patent: WO 01/63560 (2001-08-01), None
Faysal, et al., “Estimating 3D camera motion—structure”, Pattern Recognition Letters, 24:327-337, 2003 (available online Apr. 26, 2002).
Sun, et al., “3D reconstruction—scanner”, IEEE, 1051-4651/02, pp. 653-656, 2002.
Tong, et al., “Epipolar geometry—voting”, IEEE, 0-7695-1272-0/01, pp. I-926-I-933, 2001.
Blanz, V., et al., “A Morphable Model for the Synthesis of 3D Faces”, SIGGRAPH 1999, Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques, pp. 187-194, Jul. 1999.
Chowdhury, A., et al., “Face Reconstruction from Monocular Video Using Uncertainty Analysis and a Generic Model”, Computer Vision and Image Understanding, 91(1-2):188-213, Jul.-Aug. 2003.
DeCarlo, D., et al., “The Integration of Optical Flow and Deformable Models with Applications to Human Face Shape and Motion Estimation”, Proc. CVPR'96, pp. 231-238, 1996.
Fidaleo, D., et al., “An Investigation of Model Bias in 3D Face Tracking”, ICCV International Workshop on Analysis and Modeling of Faces and Gestures, Beijing, pp. 1-15, Oct. 2005.
Fua, P., “Using Model-Driven Bundle-Adjustment to Model Heads From Raw Video Sequences”, Proc. 7th International Conference on Computer Vision, vol. 1, pp. 46-53, Sep. 1999.
Ilic, S., et al., “Implicit Meshes for Surface Reconstruction”, IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(2):328-333, Feb. 2006.
Medioni, G., et al., “Generation of a 3D Face Model from One Camera”, Proc. IEEE Computer Society 16th International Conference on Pattern Recognition, vol. 3, pp. 667-671, Aug. 2002.
Medioni, G., et al., “Tensor Voting: Theory and Applications”, 12eme Congres Francophone AFRIF-AFIS . . . (RFIA), 10 pages, Feb. 2000.
Pollefeys, M., et al., “Visual Modeling with a Hand-Held Camera”, International Journal of Computer Vision, 59(3):207-232, Sep. 2004.
Romdhani, S., et al., “Efficient, Robust and Accurate Fitting of a 3D Morphable Model”, Proc. Ninth IEEE International Conference on Computer Vision (ICCV'03), vol. 1, pp. 59-66, Oct. 2003.
Shan, Y., et al., “Model-Based Bundle Adjustment with Application to Face Modeling”, Proc. Eighth IEEE International Conference on Computer Vision (ICCV'01), vol. 2, pp. 644-651, Jul. 2001.
International Search Report and Written Opinion, International Application No. PCT/US07/02786, mailed Apr. 17, 2008, 7 pages.
