Method for reading images and apparatus therefor

Facsimile and static presentation processing – Facsimile – Picture signal generator

Patent


Details

358/474; 358/480; 358/487; 358/505; 358/506; 358/509; H04N 1/04

Patent

active

060439076

ABSTRACT:
The cyan density of an image is detected at the peak wavelength λ_C1 of the spectral absorption distribution of the developed cyan dye, over a detection range of 0-2, and at a wavelength λ_C2 offset from the peak, over a detection range of 1-1.75. Doubling the density detected in the 1-1.75 range yields a value equivalent to a density of 2-3.5 measured at the peak wavelength λ_C1. Combining this value with the density detected in the 0-2 range gives the cyan density of the image over an overall detection range of 0-3.5. The magenta and yellow densities are determined in the same way. As a result, a color image can be read accurately with an apparatus of simple structure.
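The range-extension scheme in the abstract can be sketched numerically: below the peak-wavelength sensor's 0-2 limit its reading is used directly; above it, the offset-wavelength reading (reliable over 1-1.75) is doubled to cover the 2-3.5 band. The function name, the 2.0 threshold, and the exact doubling factor below are illustrative assumptions based only on the ranges stated in the abstract, not on the patent's claimed implementation.

```python
def extended_cyan_density(d_peak, d_offset):
    """Combine two density readings into one 0-3.5 estimate (hypothetical sketch).

    d_peak   -- density measured at the peak wavelength λ_C1 (reliable over 0-2)
    d_offset -- density measured at the offset wavelength λ_C2 (reliable over 1-1.75)
    """
    if d_peak < 2.0:
        # Within the peak-wavelength sensor's reliable range: use it directly.
        return d_peak
    # Peak-wavelength reading is saturated; doubling the offset-wavelength
    # reading maps its 1-1.75 range onto the equivalent 2-3.5 band.
    return 2.0 * d_offset
```

The same two-wavelength measurement would, per the abstract, be repeated for the magenta and yellow dyes.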

REFERENCES:
patent: 5457007 (1995-10-01), Asami
patent: 5573894 (1996-11-01), Kodama et al.


Profile ID: LFUS-PAI-O-1330930
