Joint boosting feature selection for robust face recognition

Classification: Image analysis – Applications – Personnel identification

Details

US classification: C382S115000, C382S154000, C382S190000, C382S195000, C382S155000
Type: Reexamination Certificate
Status: active
Patent number: 07668346

ABSTRACT:
Methods and systems are provided for selecting the features that will be used to recognize faces. Three-dimensional models are used to synthesize a database of virtual face images. The virtual face images cover a wide range of appearance variations, including different poses, lighting conditions, and expression changes. A joint boosting algorithm identifies discriminative features by selecting features from the virtual images such that the selected features are independent of the other images in the database.
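
As an illustration of how boosting can double as feature selection, below is a minimal Python sketch of an AdaBoost-style loop that greedily picks one decision stump, and therefore one feature, per round. It is only a sketch under simplifying assumptions: the feature matrix, the labels, and the adaboost_select_features helper are hypothetical placeholders, not the patented joint boosting algorithm or its features computed from synthesized virtual face images.

import numpy as np

def adaboost_select_features(X, y, n_rounds=10):
    # Greedy AdaBoost-style selection of decision stumps over feature columns.
    # X: (n_samples, n_features) real-valued feature responses
    # y: (n_samples,) labels in {-1, +1}
    # Returns a list of (feature_index, threshold, polarity, alpha) tuples.
    n, d = X.shape
    w = np.full(n, 1.0 / n)                 # uniform initial sample weights
    selected = []
    for _ in range(n_rounds):
        best = None
        for j in range(d):                  # search every feature column
            for thr in np.unique(X[:, j]):  # candidate thresholds
                for pol in (1, -1):         # stump polarity
                    pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                    err = np.sum(w[pred != y])
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol, pred)
        err, j, thr, pol, pred = best
        err = min(max(err, 1e-10), 1.0 - 1e-10)   # guard against log(0)
        alpha = 0.5 * np.log((1.0 - err) / err)   # weight of the chosen stump
        w = w * np.exp(-alpha * y * pred)         # upweight misclassified samples
        w = w / w.sum()
        selected.append((j, thr, pol, alpha))
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder feature responses; in the patented setting these would be
    # features measured on 3D-model-synthesized virtual face images.
    X = rng.normal(size=(200, 40))
    y = np.where(X[:, 3] + 0.5 * X[:, 7] > 0.0, 1, -1)
    for idx, thr, pol, alpha in adaboost_select_features(X, y, n_rounds=5):
        print(f"feature {idx}: threshold={thr:.3f} polarity={pol} alpha={alpha:.3f}")

Decision stumps serve as the weak learners here because each stump commits to a single feature column, so the sequence of selected stumps doubles as a ranked list of discriminative features; the reweighting step steers later rounds toward features that complement the ones already chosen.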

REFERENCES:
patent: 6002782 (1999-12-01), Dionysian
patent: 6381346 (2002-04-01), Eraslan
patent: 7024033 (2006-04-01), Li et al.
patent: 7221809 (2007-05-01), Geng
patent: 2005/0063566 (2005-03-01), Beek et al.
patent: 2005/0105794 (2005-05-01), Fung
patent: 2006/0233426 (2006-10-01), Mariani
Paul Viola, Michael J. Jones, “Robust Real-Time Object Detection”, Cambridge Research Laboratory, Cambridge, Massachusetts, Feb. 2001, pp. 1-25.
