Image processing method and apparatus

Facsimile and static presentation processing – Facsimile – Specific signal processing circuitry

Patent


Details

US Class: 382/274; International Class: H04N 1/40
Status: Active
Number: 057575150

ABSTRACT:
An image reading apparatus performs shading correction with a simple arrangement. Prior to an original read operation, a prescanning operation is performed using a white reference original. Shading data is formed by a shading correction circuit and held both in an internal shading memory and in a memory in a shading data confirming unit. The shading data confirming unit compares the data in the two memories to detect whether the data has been destroyed. If the data has been destroyed, permanent shading data is used instead. When the image is a halftone image, the prescanning operation is performed again to form new shading data. As a result, the conventionally required prescanning mechanism for a reference white ground can be omitted, and even when the shading data is destroyed by an abnormal voltage or the like, processing can still be performed normally.
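The abstract describes a flat-field (shading) correction flow with a duplicated-memory integrity check. Below is a minimal Python sketch of that flow; it is an illustration only, not the patent's implementation, and every name in it (prescan_white_reference, ROM_SHADING_DATA, WHITE_TARGET, and so on) is a hypothetical stand-in. The integrity check is modeled as a simple comparison of the two copies, and the per-pixel correction uses the standard scale-by-white-reference formula.

    # Illustrative sketch (not the patent's actual implementation) of
    # shading correction with a duplicated-memory integrity check.

    WHITE_TARGET = 255  # assumed full-scale output for the white reference

    # Hypothetical permanent (ROM) shading data used as a fallback when
    # the RAM copies disagree, e.g. after corruption by an abnormal voltage.
    ROM_SHADING_DATA = [200] * 8

    def prescan_white_reference(width=8):
        """Stand-in for the prescan of a white reference original;
        returns one measured shading value per pixel column."""
        return [210, 208, 205, 207, 209, 206, 204, 211][:width]

    def form_shading_data():
        # Hold the shading data in two places, as the abstract describes:
        # the internal shading memory and the confirming unit's memory.
        data = prescan_white_reference()
        shading_memory = list(data)
        confirm_memory = list(data)
        return shading_memory, confirm_memory

    def select_shading_data(shading_memory, confirm_memory):
        # The confirming unit compares the two copies; on any mismatch the
        # data is treated as destroyed and the permanent data is used.
        if shading_memory == confirm_memory:
            return shading_memory
        return ROM_SHADING_DATA

    def correct_line(raw_line, shading):
        # Standard flat-field correction: scale each pixel by the ratio of
        # the white target to the shading value measured at that column.
        return [min(WHITE_TARGET, raw * WHITE_TARGET // max(s, 1))
                for raw, s in zip(raw_line, shading)]

    shading_mem, confirm_mem = form_shading_data()
    shading = select_shading_data(shading_mem, confirm_mem)
    print(correct_line([100, 120, 90, 105, 110, 95, 100, 115], shading))

Holding the data in two independent memories means a single corruption event (an abnormal voltage spike, for instance) is unlikely to damage both copies identically, so a mismatch between them is a usable signal that the fallback data should be selected.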

REFERENCES:
patent: 5062144 (1991-10-01), Murakami
patent: 5091789 (1992-02-01), Haneda et al.
patent: 5253083 (1993-10-01), Hirota
patent: 5325210 (1994-06-01), Takashima et al.
patent: 5398119 (1995-03-01), Suzuki
patent: 5455690 (1995-10-01), Ishikawa


