Image reading apparatus for grouping sensors according to...

Facsimile and static presentation processing – Natural color facsimile – Color correction

Reexamination Certificate


Details

US Classification: C358S504000, C358S505000, C358S512000, C358S514000
Type: Reexamination Certificate
Status: active
Number: 06865000

ABSTRACT:
Disclosed is an image pickup device comprising plural picture elements, each including plural pixels arranged along the longitudinal direction of the image sensor. The picture elements are arranged so that the separating area between adjacent picture elements is wider than the center-to-center distance of mutually adjacent pixels within a picture element. This arrangement enables high-quality image capture without generating colored moiré fringes or false colors.
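The abstract's geometric condition (the separating area between picture elements must be wider than the intra-element pixel pitch) can be sketched numerically. The following Python snippet is an illustrative model only: the function names and all numeric values (pixel pitch, separation width, pixels per element) are assumptions for the example, not figures from the patent.

```python
# Illustrative sketch of the layout constraint described in the abstract.
# All numeric parameters below are assumed example values, not patent data.

def element_pixel_centers(element_origin, pixel_pitch, pixels_per_element):
    """Center positions of the pixels inside one picture element,
    spaced at a uniform pitch along the sensor's longitudinal direction."""
    return [element_origin + i * pixel_pitch for i in range(pixels_per_element)]

def layout_is_valid(pixel_pitch, separation_width):
    """The stated condition: the separating area between picture elements
    is wider than the center-to-center pitch of pixels within an element."""
    return separation_width > pixel_pitch

# Assumed example: 3 pixels per element (e.g. one per color channel),
# 7 µm pixel pitch, 10 µm separating area between picture elements.
print(element_pixel_centers(0.0, 7.0, 3))   # [0.0, 7.0, 14.0]
print(layout_is_valid(7.0, 10.0))           # True: 10 µm gap > 7 µm pitch
```

With these example numbers the layout satisfies the condition; shrinking the separation below the pixel pitch would violate it.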

REFERENCES:
patent: 4558357 (1985-12-01), Nakagawa et al.
patent: 4763189 (1988-08-01), Komatsu et al.
patent: 4855817 (1989-08-01), Watanabe
patent: 4870483 (1989-09-01), Nishigaki et al.
patent: 5003380 (1991-03-01), Hirota
patent: 5329149 (1994-07-01), Kawahara et al.
patent: 5416611 (1995-05-01), Tandon
patent: 5428463 (1995-06-01), Goto
patent: 5452001 (1995-09-01), Hosier et al.
patent: 5477345 (1995-12-01), Tse
patent: 5587814 (1996-12-01), Mihara et al.

