System and method for mapping textures onto surfaces of...

Computer graphics processing and selective visual display system – Computer graphics processing – Graph generating

Reexamination Certificate


Details

C348S620000


active

06236405


FIELD OF THE INVENTION
This invention relates generally to the field of computer graphics and more particularly to the generation and processing of textures in computerized graphical images.
BACKGROUND
Mapping textures onto surfaces of computer-generated objects is a technique which greatly improves the realism of their appearance. For instance, many surfaces are characterized by surface roughness, which in a digitized image manifests itself in the form of local variations in brightness from one pixel to the next. Unfortunately, altering pixels in computer-generated images to generate surface textures imposes high computational demands and, even worse, tremendous memory bandwidth requirements on the graphics system. Tight cost constraints imposed upon the design of most products, in conjunction with ever-increasing user expectations, make the design of a powerful texture mapping unit a difficult task.
In the present specification, we use the term “texture” as a synonym for any image or structure to be mapped onto an object, unless explicitly stated otherwise. During the rasterization process, mapping images (textures) onto objects can be considered as the problem of determining a screen pixel's projection on the image (referred to herein as the pixel's “footprint”) and computing an average value which best approximates the correct pixel color. In real-time environments, where several tens of millions of pixels per second are issued by fast rasterizing units, hardware expenses for image mapping become substantial and algorithms must therefore be chosen and adapted very carefully.
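The footprint-based selection described above can be illustrated with a minimal sketch. The derivative-based level-of-detail estimate below is the standard mipmap selection rule, not a quote of the patented method; the function name and signature are illustrative:

```python
import math

def mipmap_level(dudx, dvdx, dudy, dvdy):
    """Estimate the mipmap level of detail from the screen-space
    derivatives of the texture coordinates (given in texels).
    The footprint's longest edge determines the level."""
    # Lengths of the footprint's edges along screen x and y.
    len_x = math.hypot(dudx, dvdx)
    len_y = math.hypot(dudy, dvdy)
    rho = max(len_x, len_y)          # footprint size in texels
    return max(0.0, math.log2(rho))  # level 0 = original image

# A pixel covering roughly 4x4 texels maps to level 2.
```

A real rasterizer would evaluate these derivatives per pixel, which is exactly why tens of millions of such computations per second make hardware support attractive.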
One approach is to create a set of prefiltered images, which are selected according to the level of detail (the size of the footprint) and used to interpolate the final pixel color. The most common method is to organize these maps as a mipmap, as proposed by L. Williams, “Pyramidal Parametrics”, Proceedings of SIGGRAPH '83, Computer Graphics, vol. 17, no. 3, July 1983, pp. 1-11. In a mipmap, the original image is denoted as level 0. In level 1, each entry holds an averaged value and represents an area of 2×2 texels. As used herein, the term “texel” (texture element) refers to a picture element (pixel) of the texture. This continues until the top level is reached, which has only one entry holding the average color of the entire texture. Thus, in a square mipmap, level n has one fourth the size of level n−1.
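The pyramid construction Williams describes can be sketched as follows; this is a minimal grayscale version, assuming a square image whose side is a power of two, with illustrative names:

```python
def build_mipmap(image):
    """Build a mipmap pyramid from a square image whose side is a
    power of two. `image` is a list of rows of grayscale values;
    each higher level averages 2x2 texels of the level below, so
    level n has one fourth the size of level n-1."""
    levels = [image]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        n = len(prev) // 2
        levels.append([
            [(prev[2*y][2*x] + prev[2*y][2*x+1] +
              prev[2*y+1][2*x] + prev[2*y+1][2*x+1]) / 4.0
             for x in range(n)]
            for y in range(n)])
    return levels  # levels[-1] is a 1x1 image: the average color
```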
Mipmapping in a traditional implementation either requires a parallel memory system or sequential accesses to the texture buffer and is therefore either expensive or slow. One way to reduce data traffic is image compression. Its application to texture mapping, however, is difficult since the decompression must be done at pixel frequency.
There is accordingly a need for a texture mapping system which implements mipmapping in a rapid and/or a cost efficient manner. There is a further need for a texture mapping system which provides significant image enhancement at high rendering speeds. Moreover, particularly with respect to systems where cost is of concern, there is a need for an efficient compression scheme which reduces the amount of data required to be stored and accessed by a texture mapping system.
SUMMARY
Embodiments of the present invention advantageously enhance computer-generated images by texture mapping in a rapid and efficient manner. In a principal aspect, the present invention provides a footprint assembly system which delivers significant image enhancement at high rendering speeds. Embodiments employing the principles of the footprint assembly system described herein achieve this efficiently by approximating the projection of a pixel on a texture by a number N of square mipmapped texels.
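As a rough illustration of the footprint-assembly idea (approximating an elongated footprint by N square samples placed along its major axis), the sketch below assumes a mipmapped lookup function `sample(u, v, level)`; the sampling pattern and the cap `max_n` are assumptions for this sketch, not the claimed implementation:

```python
import math

def footprint_assembly(sample, u, v, dudx, dvdx, dudy, dvdy, max_n=8):
    """Approximate an elongated pixel footprint by N square mipmapped
    samples along its major axis and average the results.
    `sample(u, v, level)` stands in for a mipmapped texture lookup."""
    len_x = math.hypot(dudx, dvdx)
    len_y = math.hypot(dudy, dvdy)
    if len_x >= len_y:
        major, du, dv, minor = len_x, dudx, dvdx, len_y
    else:
        major, du, dv, minor = len_y, dudy, dvdy, len_x
    minor = max(minor, 1.0)
    # N square blocks of side `minor` cover the major axis.
    n = min(max_n, max(1, round(major / minor)))
    level = max(0.0, math.log2(minor))  # square samples pick this level
    total = 0.0
    for i in range(n):
        t = (i + 0.5) / n - 0.5  # evenly spaced along the major axis
        total += sample(u + t * du, v + t * dv, level)
    return total / n
```

For an isotropic footprint this degenerates to a single mipmap sample; for a 4:1 footprint it averages four square samples, which is the essence of the enhancement over plain mipmapping.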
In another aspect, the present invention provides a data compression system to reduce memory storage and bandwidth requirements in a simple, yet fast manner, and to therefore reduce system costs. Still other aspects of the present invention provide novel hardware architectures for texture mapping. In one embodiment, a hardware texturing unit is provided to operate on compressed textures. In certain embodiments the textures may be compressed by way of the aforementioned novel compression system. Such an embodiment advantageously decreases system cost by reducing the amount of memory storage and bandwidth required to store texture maps. In a second embodiment, a hardware texturing unit is provided to operate on uncompressed textures. Such a unit advantageously incorporates certain interpolation techniques to provide high image quality in systems where high image quality is required.
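The summary does not spell out the compression scheme at this point, but a block-based scheme in the spirit of the cited Campbell et al. two-bit-per-pixel encoding can be sketched: each 4×4 texel block is reduced to two representative values plus one selection bit per texel, so decompression is a single lookup per texel and can plausibly run at pixel frequency. This is an illustrative stand-in, not the disclosed scheme:

```python
def compress_block(block):
    """Compress a flat list of 16 grayscale texels (a 4x4 block) into
    two representative values plus one selection bit per texel, in the
    spirit of block truncation coding. Texels above the block mean map
    to the high value, the rest to the low value."""
    mean = sum(block) / len(block)
    lo_vals = [t for t in block if t <= mean] or [mean]
    hi_vals = [t for t in block if t > mean] or [mean]
    lo = sum(lo_vals) / len(lo_vals)
    hi = sum(hi_vals) / len(hi_vals)
    bits = [1 if t > mean else 0 for t in block]
    return lo, hi, bits

def decompress_texel(lo, hi, bits, i):
    """Per-texel decompression: one bit selects one of two values,
    cheap enough to perform at pixel rate."""
    return hi if bits[i] else lo
```

At 1 bit per texel plus two shared values per block, such a scheme cuts both storage and memory traffic substantially, which is the cost argument the text makes.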
The hardware units described herein benefit from the integration of arithmetic units and large memory arrays on the same chip. This allows exploitation of the enormous transfer rates internal to a chip and provides an elegant solution to the memory access bottleneck of high-quality texture mapping. In addition to achieving higher texturing speed at lower system costs, such hardware units incorporate new functionality, such as detail mapping and footprint assembly, to produce higher quality images while still maintaining real-time rendering speeds. Other functions which may be integrated into certain units include environment and video mapping. Such hardware units may consequently take the form of extremely versatile texturing coprocessors.


REFERENCES:
patent: 4580134 (1986-04-01), Campbell et al.
patent: 5019908 (1991-05-01), Su
patent: 5461712 (1995-10-01), Chelstowski et al.
patent: 5606650 (1997-02-01), Kelly et al.
patent: 5651104 (1997-07-01), Cosman
patent: 5831624 (1998-11-01), Tarolli et al.
patent: 5903276 (1999-05-01), Shiraishi
P. Heckbert, “Color Image Quantization for Frame Buffer Display”, Proceedings of SIGGRAPH '82, Computer Graphics, vol. 16, No. 3, Jul. 1982, pp. 297-307.
L. Williams, “Pyramidal Parametrics”, Proceedings of SIGGRAPH '83, Computer Graphics, vol. 17, No. 3, Jul. 1983, pp. 1-11.
F. C. Crow, “Summed-Area Tables for Texture Mapping”, Proceedings of SIGGRAPH '84, Computer Graphics, vol. 18, No. 3, Jul. 1984, pp. 207-212.
A. Glassner, “Adaptive Precision in Texture Mapping”, Proceedings of SIGGRAPH '86, Computer Graphics, vol. 20, No. 4, Aug. 1986, pp. 297-306.
G. Campbell, et al., “Two Bit/Pixel Full Color Encoding”, SIGGRAPH '86 Conference Proceedings, Computer Graphics, vol. 20, No. 4, Aug. 1986, pp. 215-223.
M. F. Deering, et al., “FBRAM: A New Form of Memory Optimized for 3D Graphics”, Proceedings of SIGGRAPH '94, Jul. 1994, pp. 167-174.
G. Knittel, et al., “GRAMMY: High Performance Graphics Using Graphics Memories”, Proceedings of the International Workshop on High Performance Computing for Computer Graphics and Visualization, Swansea, Jul. 1995, pp. 33-48.
W. Strasser, et al., “High Performance Graphics Architectures”, 5th International Conference of Computer Graphics and Visualization, St. Petersburg, Russia, Jul. 1995.
A. Schilling, et al., “TEXRAM—A Smart Memory for Texturing”, IEEE Computer Graphics and Applications, vol. 16, No. 3, May 1996, pp. 32-41.
