Method for processing video pictures for false contours and...

Computer graphics processing and selective visual display system – Computer graphics processing – Attributes

Reexamination Certificate


Details

Type: Reexamination Certificate
Status: active
Number: 10958514

ABSTRACT:
The present invention relates to a method and an apparatus for processing video pictures, especially for compensating the dynamic false contour effect and dithering noise. The main idea of this invention is to divide the picture to be displayed into areas of at least two types, for example low-video-gradient areas and high-video-gradient areas; to allocate a different set of GCC (Gravity Center Coding) code words to each type of area, the set allocated to a given type being dedicated to reducing false contours and dithering noise in areas of that type; and to encode the video levels of each area of the picture with the allocated set of GCC code words. In this manner, the reduction of false contour effects and dithering noise in the picture is optimized area by area.
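The area-by-area encoding described in the abstract can be sketched as follows. This is a minimal illustration only: the gradient threshold, the code-word tables, and the function name are all assumptions for demonstration, since the abstract does not specify the actual GCC code-word sets (which in practice map each video level to a sub-field arrangement of the display).

```python
import numpy as np

# Placeholder code-word tables (assumptions): one set notionally tuned
# against dithering noise in low-gradient areas, one against dynamic
# false contours in high-gradient areas. Real GCC sets would map each
# video level to a sub-field code word; here they are just labels.
CODEWORDS_LOW = {v: ("low", v) for v in range(256)}
CODEWORDS_HIGH = {v: ("high", v) for v in range(256)}

GRADIENT_THRESHOLD = 16  # assumed threshold separating the two area types


def encode_with_gcc(picture: np.ndarray) -> np.ndarray:
    """Classify each pixel by its local video gradient, then encode its
    video level with the code-word set allocated to that area type."""
    gy, gx = np.gradient(picture.astype(float))
    grad = np.hypot(gx, gy)               # local gradient magnitude
    is_high = grad > GRADIENT_THRESHOLD   # area-type map

    encoded = np.empty(picture.shape, dtype=object)
    for idx in np.ndindex(picture.shape):
        table = CODEWORDS_HIGH if is_high[idx] else CODEWORDS_LOW
        encoded[idx] = table[int(picture[idx])]
    return encoded


# A flat region with one sharp vertical edge: pixels at the edge fall in
# the high-gradient area, pixels far from it in the low-gradient area.
img = np.zeros((4, 8), dtype=np.uint8)
img[:, 4:] = 200
codes = encode_with_gcc(img)
```

The point of the sketch is the dispatch step: the same video level is encoded differently depending on which area type the pixel belongs to, which is what lets each code-word set be optimized for its own artifact.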

REFERENCES:
patent: 5598482 (1997-01-01), Balasubramanian
patent: 2003/0164961 (2003-09-01), Daly
patent: EP 0978816 (2000-02-01)
patent: 1 256 924 (2002-11-01)
patent: 1 262 942 (2002-12-01)


Profile ID: LFUS-PAI-O-3816951
