System and method for automatically detecting and correcting...

Image analysis – Color image processing – Color correction

Reexamination Certificate


Details

Status: active

Patent Number: 07155058

ABSTRACT:
A system and method automatically detects and corrects the occurrence of red-eye in digital photographs and images without user intervention. The system includes a face recognition and locator engine for locating human faces within the image and for generating a bounding box around each face. A red-eye locator engine analyzes the pixel data for each bounding box and computes one or more predefined metrics. The preferred metrics include color variation, redness, redness variation and glint. The red-eye locator engine also generates one or more detection masks based upon the computed metrics, and searches the detection mask for an occurrence of red-eye. A red-eye correction engine receives the detection mask including the detected occurrences of red-eye, and generates a correction mask. Pixels identified as being occurrences of red-eye are then de-saturated in accordance with the correction mask.
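The detection and correction stages described in the abstract can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the redness metric (red channel relative to green and blue), the detection threshold of 1.2, and the luminance-based de-saturation are all assumed formulations chosen for clarity; the patent's preferred metrics also include color variation, redness variation, and glint, which are omitted here.

```python
import numpy as np

def redness(rgb):
    # Illustrative per-pixel redness metric (assumed, not the patent's
    # exact formula): red channel relative to the green/blue channels.
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return r / (g + b + 1.0)

def detect_red_eye(face_region, threshold=1.2):
    # Build a binary detection mask over the face bounding box where
    # the redness metric exceeds a threshold (value is illustrative).
    return redness(face_region) > threshold

def desaturate_red_eye(face_region, mask):
    # Correction step: replace pixels flagged by the mask with their
    # luminance, i.e. de-saturate them as the abstract describes.
    out = face_region.astype(float).copy()
    lum = 0.299 * out[..., 0] + 0.587 * out[..., 1] + 0.114 * out[..., 2]
    for c in range(3):
        out[..., c][mask] = lum[mask]
    return out.astype(np.uint8)
```

In a full pipeline these functions would run only inside each face bounding box produced by the face locator, keeping the per-image cost proportional to the number and size of detected faces rather than the whole frame.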

REFERENCES:
patent: 5432863 (1995-07-01), Benati et al.
patent: 5748764 (1998-05-01), Benati et al.
patent: 5900973 (1999-05-01), Marcellin-Dibon et al.
patent: 6009209 (1999-12-01), Acker et al.
patent: 6016354 (2000-01-01), Lin et al.
patent: 6172714 (2001-01-01), Ulichney
patent: 6278491 (2001-08-01), Wang et al.
patent: 6292574 (2001-09-01), Schildkraut et al.
patent: 6631208 (2003-10-01), Kinjo et al.
patent: 6728401 (2004-04-01), Hardeberg
Gaubatz et al., Automatic Red-Eye Detection and Correction, IEEE 0-7803-7622-6/02, pp. I-804-I-807.
U.S. Appl. No. 09/992,795, filed Nov. 12, 2001, Michael J. Jones et al.
Crow, Franklin C., Summed-Area Tables for Texture Mapping, Computer Graphics, Jul. 1984, pp. 207-212, vol. 18, No. 3.
Xin, Z., Yanjun, Xu and Limin, DU, Locating facial features with color information, ICSP '98 Proceedings, vol. 2, (c) 1998, pp. 889-892.
Rowley, H., Baluja, S. and Kanade, T., Neural Network-Based Face Detection, IEEE Transactions on Pattern Analysis and Machine Intelligence, Jan. 1998, pp. 23-38, vol. 20, No. 1.
Schneiderman, H. and Kanade, T., A Statistical Method for 3D Object Detection Applied to Faces and Cars, IEEE International Conference on Computer Vision, (c) 2000.
Zhang, L. and Lenders, P., Knowledge-Based Eye Detection for Human Face Recognition, Fourth International Conference on Knowledge-Based Intelligent Engineering Systems & Allied Technologies, 2000 Proceedings, vol. 1, (c) 2000, pp. 117-120.
Rizon, M. and Kawaguchi, T., Automatic Eye Detection Using Intensity and Edge Information, TENCON 2000 Proceedings, vol. 2, IEEE, (c) 2000, pp. II-415-II-420.
Kawaguchi, T., Hidaka, D. and Rizon, M., Detection of Eyes from Human Faces by Hough Transform and Separability Filter, ICIP 2000 Proceedings, vol. 1, IEEE, (c) 2000, pp. 49-52.
Yang, M., Roth, D. and Ahuja, N., A SNoW-Based Face Detector, Advances in Neural Information Processing Systems 12, (c) 2000.
McClellan, J., Parks, T. and Rabiner, L., A Computer Program for Designing Optimum FIR Linear Phase Digital Filters, IEEE Transactions on Audio and Electroacoustics, vol. AU-21, No. 6, Dec. 1973, pp. 506-526.
