Method and apparatus for image processing capable of...

Facsimile and static presentation processing – Static presentation processing – Attribute control



Details

Class: C358S003150
Type: Reexamination Certificate
Status: active
Application number: 10812964

ABSTRACT:
A novel image forming apparatus renders gray scale by performing at least one of the following operations: manipulating a plurality of dots arranged in a matrix, adjusting dot density on a single-dot basis, or adjusting dot size on a single-dot basis. The apparatus includes a dot status detector and a density adjuster. The dot status detector detects an occurrence in which a dot exists at a focus dot position while no dot exists at the positions immediately adjacent to it in the main scanning direction. When such an occurrence is detected, the density adjuster adjusts the writing level of the dot at the focus dot position so as to smooth the gray scale.
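The detect-then-adjust behavior described in the abstract can be illustrated in code. The Python sketch below is only an illustration of the general idea, not the patented implementation: it assumes the dot pattern is a binary matrix whose rows run along the main scanning direction, and the function name smooth_isolated_dots and the reduced writing level of 0.5 are hypothetical choices made for the example.

    import numpy as np

    def smooth_isolated_dots(dot_matrix, reduced_level=0.5, full_level=1.0):
        """Sketch of the isolated-dot smoothing idea from the abstract.

        dot_matrix: 2D array of 0/1 values; rows are assumed to run along
        the main scanning direction. Returns an array of writing levels:
        full_level for ordinary dots, reduced_level for dots that have no
        neighboring dot immediately to the left or right in the main
        scanning direction (the occurrence the dot status detector flags).
        """
        dots = np.asarray(dot_matrix, dtype=bool)
        levels = np.where(dots, full_level, 0.0)

        # Neighbor maps along the main scanning direction (row-wise shift).
        left = np.zeros_like(dots)
        right = np.zeros_like(dots)
        left[:, 1:] = dots[:, :-1]
        right[:, :-1] = dots[:, 1:]

        # Dot status detector: dot present, both horizontal neighbors absent.
        isolated = dots & ~left & ~right

        # Density adjuster: lower the writing level of isolated dots.
        levels[isolated] = reduced_level
        return levels

For a pattern such as [[0, 1, 0], [1, 1, 0]], only the single dot in the first row has no neighboring dot on either side in the scanning direction, so only its writing level is reduced; the paired dots in the second row keep the full level.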

REFERENCES:
patent: 5999273 (1999-12-01), Casey et al.
patent: 6014499 (2000-01-01), Sasaki
patent: 6026184 (2000-02-01), Fukushima
patent: 6486973 (2002-11-01), Sasaki
patent: 01-218173 (1989-08-01), None
patent: 09-051434 (1997-02-01), None
patent: 09-275489 (1997-10-01), None
patent: 10-075367 (1998-03-01), None


