Method for black trapping and under print processing

Facsimile and static presentation processing – Static presentation processing – Attribute control

Reexamination Certificate


Details

US Classes: C358S518000, C358S003260
Type: Reexamination Certificate
Status: active
Number: 07009735

ABSTRACT:
A method is provided for performing black trapping and under print processing on image data from a raster image processing (RIP) frame buffer. The image data comprises pixel data in a predetermined color space (e.g., RGB) and may include black rendering hints. If no black rendering hints are included, the method determines whether each pixel in the frame buffer is black or near black with respect to predetermined thresholds. Three further embodiments address the CMY under print value for a black pixel when all pixels in a context window surrounding that pixel are black: one embodiment applies no CMY under print, another applies a predefined CMY under print, and a third applies an adaptive CMY under print based on the last saved CMY under print value.
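The decision procedure described in the abstract — classify each pixel as black via thresholds, test a context window, then pick one of three under print behaviors — can be sketched as follows. This is an illustrative reading only, not the patent's implementation: the function names, the channel threshold of 32, and the 3x3 window size are all assumptions.

```python
def is_black(rgb, threshold=32):
    """Treat an RGB pixel as black when every channel falls below the
    (assumed) threshold; the patent speaks only of predetermined thresholds."""
    return all(channel < threshold for channel in rgb)

def window_all_black(frame, x, y, threshold=32):
    """Check whether every pixel in an assumed 3x3 context window around
    (x, y) is black; out-of-bounds neighbors are simply skipped."""
    h, w = len(frame), len(frame[0])
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                if not is_black(frame[ny][nx], threshold):
                    return False
    return True

def underprint_for(frame, x, y, mode, predefined=(0, 0, 0),
                   last_saved=(0, 0, 0)):
    """Choose a CMY under print for a black pixel per the three embodiments:
    'none' -> no under print, 'predefined' -> a fixed CMY value,
    'adaptive' -> the last saved CMY under print value.
    Returns None when the context window is not uniformly black (an edge
    pixel, which the method would instead handle by trapping)."""
    if not window_all_black(frame, x, y):
        return None
    if mode == "none":
        return (0, 0, 0)
    if mode == "predefined":
        return predefined
    if mode == "adaptive":
        return last_saved
    raise ValueError(f"unknown under print mode: {mode}")
```

For example, on a 3x3 frame of pure black pixels, `underprint_for(frame, 1, 1, "predefined", predefined=(50, 40, 40))` would select the fixed CMY value, while `mode="adaptive"` would echo back whatever CMY value was last saved.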

REFERENCES:
patent: 5313570 (1994-05-01), Dermer et al.
patent: 5923821 (1999-07-01), Birnbaum et al.
patent: 6594030 (2003-07-01), Ahlstrom et al.
patent: 6798540 (2004-09-01), Kritayakirana et al.


Profile ID: LFUS-PAI-O-3597411
