Texture addressing circuit and method

Computer graphics processing and selective visual display system – Computer graphics processing – Attributes

Reexamination Certificate


Details

Status: active

Patent number: 07057623

ABSTRACT:
A texture addressing circuit and method for calculating texture coordinates according to various texture addressing modes for texture maps of arbitrary size. The texture addressing circuit receives input texture coordinate values, each of which falls within one of a plurality of predefined input ranges, and calculates output texture coordinates from them. The circuit includes a plurality of coordinate calculation circuits that calculate a candidate output value for each input coordinate range and provide those values to a selection circuit, which selects the output texture coordinate based on the sign of the input texture coordinate and the signs of the calculated coordinate values.
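The abstract describes hardware that maps out-of-range input coordinates into a texture's valid range under several addressing modes. As a minimal software sketch of the general technique (not the patented circuit), the following hypothetical `address_texel` function shows three common addressing modes — wrap, clamp, and mirror — for a texture of arbitrary size; the mode names and function signature are illustrative assumptions, not taken from the patent.

```python
def address_texel(coord: int, size: int, mode: str) -> int:
    """Map an integer texel coordinate (possibly negative or >= size)
    into the valid range [0, size) for the given addressing mode.

    Illustrative sketch only; not the circuit claimed in the patent.
    """
    if mode == "wrap":
        # Repeat the texture: Python's % already yields a
        # non-negative result for negative coordinates.
        return coord % size
    if mode == "clamp":
        # Clamp to the nearest edge texel.
        return max(0, min(coord, size - 1))
    if mode == "mirror":
        # Reflect at each boundary; the pattern repeats with
        # period 2 * size.
        period = 2 * size
        m = coord % period
        return m if m < size else period - 1 - m
    raise ValueError(f"unknown addressing mode: {mode}")
```

For a 4-texel-wide texture, for example, coordinate 5 wraps to 1, clamps to 3, and mirrors to 2, while coordinate -1 wraps to 3, clamps to 0, and mirrors to 0.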

REFERENCES:
patent: 5230039 (1993-07-01), Grossman et al.
patent: 6295070 (2001-09-01), Wood
patent: 6366290 (2002-04-01), Dye et al.
patent: 6417860 (2002-07-01), Migdal et al.
patent: 6429873 (2002-08-01), Kacevas et al.
patent: 6452603 (2002-09-01), Dignam
patent: 6501482 (2002-12-01), Rosman et al.


Profile ID: LFUS-PAI-O-3660216
