Image correction method

Image analysis – Image enhancement or restoration – Intensity, brightness, contrast, or shading correction



Details

US classification: C358S461000 (class 358, subclass 461)
Type: Reexamination Certificate
Status: active
Application number: 10368498

ABSTRACT:
An image correction method is provided. First, an all-white document is scanned to form a scanned image comprising a plurality of image pixels, each having a gray level value. Next, the gray level values are tallied statistically so that each gray level value has a corresponding image pixel quantity. The maximum gray level value and the minimum gray level value are then selected, and a middle gray level value is obtained from them. Following that, whether the reference gray level value is greater or smaller than the middle gray level value is determined according to the document, the gray level values within the interval of (the reference gray level value ± a gray level value) are selected, and the selected gray level values are weight-averaged according to their corresponding image pixel quantities to obtain a corrected gray level value.

REFERENCES:
patent: 5469267 (1995-11-01), Wang
patent: 6075621 (2000-06-01), Takeuchi et al.
patent: 6674890 (2004-01-01), Maeda et al.
patent: 6791720 (2004-09-01), Hsieh
patent: 2001/0055415 (2001-12-01), Nozaki
patent: 2004/0047516 (2004-03-01), Tseng

