Color image reading apparatus having shading correction for plural color component signals

Facsimile and static presentation processing – Static presentation processing – Attribute control

Patent


Details

Classification: 358/461; H04N 1/46

Status: active

Number: 050776053

ABSTRACT:
A color image reading apparatus having a color sensor with plural sensor groups, the color sensor for separating an objective image into plural color components and for converting the plural color components into plural color component signals, and a shading corrector for performing shading correction for the plural color component signals. The shading corrector causes the plural sensor groups to substantially simultaneously read a reference plate and automatically corrects a shading correction state for each of the plural color component signals in accordance with data obtained from the plural sensor groups.
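The abstract does not spell out the correction arithmetic, but the scheme it describes, deriving a correction for each color component signal from a substantially simultaneous read of a reference plate by the plural sensor groups, corresponds to conventional flat-field shading correction. The sketch below only illustrates that general idea under a simple per-pixel gain/offset model; the names correct_shading, white_ref, dark_ref, and target_level are illustrative assumptions and do not come from the patent.

import numpy as np

def correct_shading(raw, white_ref, dark_ref=None, target_level=255.0):
    """Illustrative shading correction for one color component signal.

    raw          -- one scan line from one sensor group
    white_ref    -- the same sensor group's read of the reference plate
    dark_ref     -- optional read with the light source off (assumed zero if absent)
    target_level -- nominal output level assigned to the reference plate
    """
    raw = np.asarray(raw, dtype=np.float64)
    white = np.asarray(white_ref, dtype=np.float64)
    dark = np.zeros_like(white) if dark_ref is None else np.asarray(dark_ref, dtype=np.float64)

    # Per-pixel gain derived from the reference-plate read; guard against division by zero.
    denom = np.clip(white - dark, 1e-6, None)
    return np.clip((raw - dark) / denom * target_level, 0.0, target_level)

# Each sensor group (e.g. R, G, B) keeps its own correction state, computed from
# its own, substantially simultaneous, read of the reference plate:
#   corrected = {c: correct_shading(raw[c], white[c]) for c in ("R", "G", "B")}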

REFERENCES:
patent: 3194882 (1965-07-01), Hall
patent: 4058828 (1977-11-01), Ladd
patent: 4129853 (1978-12-01), Althauser et al.
patent: 4491963 (1985-01-01), Bellemare
patent: 4520395 (1985-05-01), Abe
patent: 4523229 (1985-06-01), Kanmoto
patent: 4524338 (1985-06-01), Abe et al.
patent: 4554583 (1985-11-01), Saitoh et al.
patent: 4695884 (1987-09-01), Anastassiou et al.


Profile ID: LFUS-PAI-O-1513628
