Dirt map method and apparatus for graphic display system

Computer graphics processing and selective visual display system – Computer graphics processing – Three-dimension

Reexamination Certificate


Details

Classification: C345S582000 (U.S. Class 345/582)
Type: Reexamination Certificate
Status: active
ID: 07071937

ABSTRACT:
A method and system for map-based, per-pixel specularity modulation of a surface in a real-time 3D graphics renderer, using interpolated specularity function or environment-map values. One or more functional modules calculate a pair of specular light intensity or color values. Each specularity value represents the specular light reflected by the given pixel at an extreme surface reflectance characteristic: for example, one may represent reflection from a very smooth surface while the other represents reflection from a very rough surface. A specularity modulation, or dirt map, value is obtained either by a procedural calculation based on surface offset coordinates or by retrieval from a two-dimensional map stored in texture memory. The specularity modulation value is then used as a weight to interpolate between the pair of specularity values. The resulting interpolated specularity value is optionally scaled by the modulation value (or a derivative thereof) to produce a final specularity value. This final specularity intensity or color value is passed to a lighting unit that modulates the pixel color to include the given specular light.
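The core operation the abstract describes, interpolating between two extreme specularity values using a per-pixel dirt-map weight, can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name, the direction of the blend (0 = smooth, 1 = rough), and the choice of scaling factor in the optional final step are all assumptions made for clarity.

```python
def dirt_map_specularity(spec_smooth, spec_rough, dirt, scale_by_dirt=False):
    """Blend two extreme specular intensities using a dirt-map value.

    spec_smooth  -- specular intensity for a very smooth surface
    spec_rough   -- specular intensity for a very rough surface
    dirt         -- per-pixel modulation value in [0, 1]
                    (procedural or fetched from a 2D texture map)
    """
    # Weighted interpolation between the two extremes:
    # dirt = 0 gives the smooth-surface response,
    # dirt = 1 gives the rough-surface response.
    spec = (1.0 - dirt) * spec_smooth + dirt * spec_rough
    if scale_by_dirt:
        # Optional final scaling by the modulation value (the abstract
        # allows the value itself or a derivative of it; using it
        # directly here is an illustrative choice).
        spec *= dirt
    return spec
```

In a real renderer this blend would run per pixel (or per fragment), with `dirt` sampled from texture memory using the surface offset coordinates, and the result forwarded to the lighting stage.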

REFERENCES:
patent: 4855934 (1989-08-01), Robinson
patent: 5253339 (1993-10-01), Wells et al.
patent: 5561746 (1996-10-01), Murata et al.
patent: 5638499 (1997-06-01), O'Connor et al.
patent: 5659671 (1997-08-01), Tannenbaum et al.
patent: 5673374 (1997-09-01), Sakaibara et al.
patent: 5808619 (1998-09-01), Choi et al.
patent: 5835220 (1998-11-01), Kazama et al.
patent: 5936613 (1999-08-01), Jaeger et al.
patent: 6175367 (2001-01-01), Parikh et al.
patent: 6226005 (2001-05-01), Laferriere
patent: 6226006 (2001-05-01), Collodi
patent: 6234901 (2001-05-01), Nagoshi et al.
patent: 6251011 (2001-06-01), Yamazaki
patent: 6290604 (2001-09-01), Miyamoto et al.
patent: 6342885 (2002-01-01), Knittel et al.
patent: 6515674 (2003-02-01), Gelb et al.
patent: 6654013 (2003-11-01), Malzbender et al.
Moller et al., "Real-Time Rendering," A K Peters, Ltd., 1999, chapters 4 and 5.
Foley et al.; “Computer Graphics: Principles and Practice” pp. 736-744, 866-869, 1996.
PCT Search Report; Form PCT/ISA/220, mailed Jul. 28, 1999, pp. 1-4.
Printout of Evans & Sutherland E&S Harmony Image home page at website http://www.es.com/image-generators/harmony.html, Apr. 6, 1999, 6 pages.
Mortensen, Zach, "Otmphong.doc," Mar. 30, 1995, reprinted from Internet.
Bishop and Weimer, "Fast Phong Shading," 1986, pp. 103-105, reprinted from Internet.
Dana et al., "Reflectance and Textures of Real-World Surfaces," ACM Transactions on Graphics, vol. 18, no. 1, Jan. 1999, pp. 1-34.
Gregory J. Ward, "Measuring and Modeling Anisotropic Reflection," Computer Graphics, 26, 2, Jul. 1992.
Westin et al., "Predicting Reflectance Functions from Complex Surfaces," Computer Graphics, 26, 2, Jul. 1992.
Jos Stam, "Diffraction Shaders," Alias|wavefront, SIGGRAPH 1999.
Becker et al., "Smooth Transitions between Bump Rendering Algorithms," undated.


