Method for defining chromaticity regions according to...

Image analysis – Color image processing – Color correction


Details

U.S. Classification: C345S589000, C358S518000
Type: Reexamination Certificate
Status: active
Patent Number: 07447355

ABSTRACT:
Image processing includes defining chromaticity regions according to different luminance levels. When the luminance of a pixel corresponds to one of the luminance levels and the chromaticity vector of the pixel falls within the chromaticity region corresponding to that level, the chromaticity vector of the pixel is adjusted.
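The abstract describes a per-pixel rule: determine which luminance level a pixel belongs to, test whether its chromaticity vector lies inside the region defined for that level, and only then adjust the vector. A minimal Python sketch of that flow is shown below; the level boundaries, circular region shapes, region centers, target chromaticities, and blend strength are illustrative assumptions for this sketch, not details taken from the patent.

```python
# Hypothetical illustration of luminance-dependent chromaticity adjustment.
# All numeric values below are assumptions made for this sketch.
import numpy as np

# One (luma_lo, luma_hi, region_center, region_radius, target_chroma) entry
# per luminance level, with chromaticity expressed as normalized (Cb, Cr).
LEVELS = [
    (0.00, 0.33, np.array([0.45, 0.55]), 0.08, np.array([0.47, 0.53])),
    (0.33, 0.66, np.array([0.40, 0.60]), 0.10, np.array([0.43, 0.57])),
    (0.66, 1.01, np.array([0.38, 0.62]), 0.12, np.array([0.40, 0.60])),
]

def adjust_pixel(luma, chroma, strength=0.5):
    """Adjust a pixel's chromaticity vector when it falls inside the
    chromaticity region defined for the pixel's luminance level."""
    for lo, hi, center, radius, target in LEVELS:
        if lo <= luma < hi:
            # Pixel belongs to this luminance level; test its chromaticity.
            if np.linalg.norm(chroma - center) <= radius:
                # Inside the region: pull the chromaticity toward the target.
                return chroma + strength * (target - chroma)
            break  # correct level found, but chromaticity lies outside the region
    return chroma  # leave the pixel unchanged

# Example: a mid-luminance pixel whose chromaticity lies inside its region.
print(adjust_pixel(0.5, np.array([0.41, 0.61])))  # -> approximately [0.42 0.59]
```

In a real implementation the region shape and the adjustment rule (for example, pulling skin-tone chromaticities toward a preferred value) would be tuned per application rather than fixed as above.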

REFERENCES:
patent: 6011595 (2000-01-01), Henderson et al.
patent: 6823083 (2004-11-01), Watanabe et al.
patent: 2003/0234756 (2003-12-01), Ku et al.
patent: 2004/0071343 (2004-04-01), Yamazoe et al.
patent: 2004/0165772 (2004-08-01), Russell et al.
patent: 2005/0044371 (2005-02-01), Braudaway et al.
patent: 2005/0185839 (2005-08-01), Matsubara
patent: 2005/0190205 (2005-09-01), Koyama
Kwok-Wai Wong et al., "An Efficient Color Compensation Scheme for Skin Color Segmentation," ISCAS '03 (IEEE International Symposium on Circuits and Systems, 2003), pp. 676-679 (submitted as part of the IDS).

