Video contrast adjusting method and system

Television – Image signal processing circuitry specific to television – Gray scale transformation

Details

U.S. Classification: 348/679 (C348S679000)

Type: Reexamination Certificate

Status: active

Patent Number: 7,605,872

ABSTRACT:
Methods and systems for adjusting the color-saturation contrast of a video signal. Statistical information based on chrominance or luminance measures of the signal is analyzed to derive a gain function. The gain function is further shaped by whether an exceptional situation is detected during the statistical analysis. The gain function is then applied to the video signal to modify the amplitude of its chrominance component.
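As a rough sketch of the general technique described above (not the patented method itself), the Python below derives a single gain from a chrominance statistic and applies it to the chroma amplitude of a YCbCr frame, with a guard for an exceptional situation such as a nearly monochrome frame. The function name, the target_mean and max_gain parameters, and the monochrome cutoff are all illustrative assumptions.

import numpy as np

def adjust_saturation_contrast(ycbcr, target_mean=64.0, max_gain=1.5):
    """Apply a statistics-driven gain to the chrominance of a YCbCr frame.

    ycbcr: H x W x 3 uint8 array (Y, Cb, Cr), chroma centered at 128.
    """
    # Work on the chroma channels relative to their neutral point.
    cb = ycbcr[..., 1].astype(np.float32) - 128.0
    cr = ycbcr[..., 2].astype(np.float32) - 128.0

    # Statistical measure: mean chroma magnitude as a saturation proxy.
    mean_sat = np.hypot(cb, cr).mean()

    # Exceptional situation: a nearly monochrome frame would otherwise
    # receive a huge gain and amplify chroma noise, so leave it alone.
    if mean_sat < 2.0:
        gain = 1.0
    else:
        # Gain function: pull the mean saturation toward a target level,
        # clamped so the correction never overshoots in either direction.
        gain = float(np.clip(target_mean / mean_sat, 1.0 / max_gain, max_gain))

    # Apply the gain to the chrominance amplitude only; luma is untouched.
    out = ycbcr.astype(np.float32)
    out[..., 1] = np.clip(cb * gain + 128.0, 0.0, 255.0)
    out[..., 2] = np.clip(cr * gain + 128.0, 0.0, 255.0)
    return out.astype(np.uint8)

Calling adjust_saturation_contrast(frame) on a uint8 YCbCr frame returns a frame with the same luma and rescaled chroma. The patent's gain function is presumably richer (for example, it may vary with luminance level); a single global gain is used here only to keep the sketch short.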

REFERENCES:
patent: 4982287 (1991-01-01), Lagoni
patent: 5315389 (1994-05-01), Izawa et al.
patent: 5808697 (1998-09-01), Fujimura et al.
patent: 5822453 (1998-10-01), Lee et al.
patent: 5959696 (1999-09-01), Hwang
patent: 6049626 (2000-04-01), Kim
patent: 6463173 (2002-10-01), Tretter
patent: 6728416 (2004-04-01), Gallagher
patent: 6982704 (2006-01-01), Aoki et al.
patent: 7050114 (2006-05-01), Stessen et al.
patent: 7286716 (2007-10-01), Kim
patent: 2004/0213457 (2004-10-01), Mori
patent: 2005/0163372 (2005-07-01), Kida et al.
patent: TW 230551 (2005-04-01), None
TW Office Action mailed Mar. 9, 2009.
English abstract of TW230551.
