Image reading device and noise detecting method

Facsimile and static presentation processing – Static presentation processing – Attribute control

Reexamination Certificate


Details

Classification: C358S504000
Type: Reexamination Certificate
Status: active
Patent number: 07660018

ABSTRACT:
An image reading device compares pixel values of image data output when a document is not at the reading position with a threshold to generate first dust detection data; compares image data output while the document is passing the reading position with a threshold to generate second dust detection data; generates third dust detection data when dust is indicated as present at a matching position in the main-scanning direction in both the first and second dust detection data; and judges that noise is present when the third dust detection data indicates that dust is continuously present for at least a prescribed length in the sub-scanning direction.
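
The decision logic described in the abstract can be illustrated with a short sketch. The following Python/NumPy code is a minimal, hypothetical rendering of the described steps: the array layout, the threshold values, the polarity of the comparisons (dark pixels flagged as dust candidates), and the run-length parameter are all assumptions made for illustration and are not taken from the patent itself.

```python
import numpy as np

def detect_noise_columns(
    blank_lines: np.ndarray,      # shape (n_blank, width): lines read with no document at the reading position
    document_lines: np.ndarray,   # shape (n_doc, width): lines read while the document passes the reading position
    blank_threshold: int = 200,   # assumed: blank-background pixels darker than this suggest dust
    doc_threshold: int = 64,      # assumed: document pixels darker than this suggest dust
    min_run_length: int = 16,     # assumed "prescribed length" in the sub-scanning direction
) -> np.ndarray:
    """Return a boolean mask over main-scanning positions judged to carry noise."""
    # First dust detection data: dust candidates seen against the blank background,
    # collapsed to one flag per main-scanning position.
    first = (blank_lines < blank_threshold).any(axis=0)

    # Second dust detection data: dark pixels seen while the document passes,
    # kept per line so continuity can be checked afterwards.
    second = document_lines < doc_threshold

    # Third dust detection data: positions where both detections coincide.
    third = second & first[np.newaxis, :]

    # Judge noise where the third detection is continuous for at least
    # min_run_length lines in the sub-scanning direction.
    noise = np.zeros(third.shape[1], dtype=bool)
    for col in range(third.shape[1]):
        run = 0
        for flagged in third[:, col]:
            run = run + 1 if flagged else 0
            if run >= min_run_length:
                noise[col] = True
                break
    return noise
```

For example, with blank_lines captured before the sheet arrives and document_lines captured while it passes, the returned mask marks the main-scanning columns where a persistent vertical streak, such as one caused by dust on the reading glass, would appear.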

REFERENCES:
patent: 5266805 (1993-11-01), Edgar
patent: 7257270 (2007-08-01), Yamaguchi
patent: 2002/0176634 (2002-11-01), Ohashi
patent: 2005/0179954 (2005-08-01), Arai et al.
patent: 2005/0200904 (2005-09-01), Prakash
patent: 2005/0213838 (2005-09-01), Kuramoto
patent: 2006/0268345 (2006-11-01), Silverstein
patent: A 2005-64913 (2005-03-01), None

