Z-buffer based interpenetrating object detection for...

Computer graphics processing and selective visual display system – Computer graphics processing – Three-dimension

Reexamination Certificate


Details

Type: Reexamination Certificate
Status: active
Patent number: 06760025

ABSTRACT:

FIELD OF INVENTION
The invention relates to computer graphics, and more particularly to antialiasing techniques and mechanisms for three-dimensional object representations.
BACKGROUND OF INVENTION
In computer-based graphics display processing, antialiasing techniques are generally used to remove or reduce jagged edges from characters, lines, and objects. Typically, antialiasing is applied to eliminate visibly jagged effects, particularly those arising from diagonally drawn edges. In the graphics processing of three-dimensional (3D) objects, antialiasing is also used to de-jag the edges of 3D objects.
However, in such 3D cases, antialiasing is conventionally applied either to known object edges, typically edges of 3D primitives such as triangles, without regard to particular graphic content or conditions, such as 3D object surfaces that apparently interpenetrate or intersect in space; or to an entire object. When only known object edges are antialiased, edges generated where 3D objects apparently interpenetrate or intersect in space are not antialiased. When antialiasing is applied to an entire object, conventional antialiasing techniques for 3D objects require significant buffer or other hardware usage and result in decreased processing performance.
Accordingly, there is a need for an improved antialiasing scheme and system for processing interpenetrating 3D objects.
SUMMARY OF INVENTION
The invention resides in a graphics processing system that compares z-buffer values of 3D objects to detect interpenetration. Pixels corresponding to objects with substantially the same z-buffer values are marked as interpenetrating in a tag buffer memory for antialiasing. The preferred antialiasing scheme may include over-sampling, area-based, blending, alpha-edge, or similar techniques.
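As an illustration only, not the patent's implementation, the comparison described above can be sketched in Python: two per-pixel depth buffers are compared, and a tag buffer marks pixels whose z-values are substantially the same. The `epsilon` tolerance and the buffer contents are assumptions made for the example.

```python
def detect_interpenetration(z_a, z_b, epsilon=0.05):
    """Build a tag buffer marking pixels where two objects' z-buffer
    values are substantially the same, i.e. candidate interpenetration
    pixels for antialiasing. epsilon is an assumed depth tolerance,
    not a value taken from the patent."""
    return [[abs(a - b) < epsilon for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(z_a, z_b)]

# A flat surface at depth 0.5 crossed by a depth ramp: only the pixel
# where the two depths coincide is tagged.
z_plane = [[0.5, 0.5, 0.5, 0.5, 0.5]]
z_ramp = [[0.1, 0.3, 0.5, 0.7, 0.9]]
tag = detect_interpenetration(z_plane, z_ramp)
# tag == [[False, False, True, False, False]]
```

In a full pipeline the tag buffer would be consulted during rasterization so that only the marked pixels receive the selected antialiasing treatment.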
Thus, antialiasing of 3D object interpenetration in computer graphics improves performance and reduces hardware and software cost. A z-buffer construct or other storage facilitates interpenetration antialiasing, particularly through early detection. Preferably, over- or super-sampling antialiasing is applied only to selected interpenetration elements, without processing the entire image output or display signal. Performance and implementation are improved by processing only at edges and/or interpenetrations.
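A minimal sketch of the selective over-sampling idea described above, under assumptions of our own: `shade(x, y)` is a hypothetical per-pixel shading function, and `factor` is an illustrative sampling rate. Only tagged pixels receive extra subsamples; untagged pixels keep a single center sample, which is how processing of the entire image is avoided.

```python
def antialias_tagged(shade, tag, factor=4):
    """Over-sample only tagged pixels: untagged pixels get one center
    sample, tagged pixels average factor x factor subsamples.
    `shade` and `factor` are illustrative assumptions, not the
    patent's specified implementation."""
    height, width = len(tag), len(tag[0])
    out = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if tag[y][x]:
                total = 0.0
                for sy in range(factor):
                    for sx in range(factor):
                        total += shade(x + (sx + 0.5) / factor,
                                       y + (sy + 0.5) / factor)
                out[y][x] = total / factor ** 2
            else:
                out[y][x] = shade(x + 0.5, y + 0.5)
    return out

# A vertical edge at x = 1.5: the tagged middle pixel is smoothed to an
# intermediate coverage value, while untagged pixels keep their
# single-sample values.
edge = lambda x, y: 1.0 if x > 1.5 else 0.0
out = antialias_tagged(edge, [[False, True, False]])
# out == [[0.0, 0.5, 1.0]]
```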


REFERENCES:
patent: 4590465 (1986-05-01), Fuchs
patent: 4783649 (1988-11-01), Fuchs et al.
patent: 4825391 (1989-04-01), Merz
patent: 4827445 (1989-05-01), Fuchs
patent: 5153937 (1992-10-01), Wobermin et al.
patent: 5509110 (1996-04-01), Latham
patent: 5561750 (1996-10-01), Lentz
patent: 5583974 (1996-12-01), Winner et al.
patent: 5594854 (1997-01-01), Baldwin et al.
patent: 5740345 (1998-04-01), Danielson et al.
patent: 5872902 (1999-02-01), Kuchkuda et al.
patent: 5977987 (1999-11-01), Duluk, Jr.
patent: 5990904 (1999-11-01), Griffin
Schilling et al., "A New Simple and Efficient Antialiasing with Subpixel Masks," ACM-0-89791-436-8/91/007/0133, 1991.
Hearn & Baker, Computer Graphics, 1994, Section 4-8.


