Method and apparatus for projective texture mapping rendered from arbitrarily positioned and oriented light source

Patent


Details

U.S. Class: 395/130; International Class: G06T 11/40

Type: Patent

Status: active

Patent number: 5805782

ABSTRACT:
A method and apparatus for generating interactive computer graphics images using projective texture mapping. The projective texture mapping of the present invention maps a texture map onto a scene rendered from the perspective of an arbitrarily positioned and oriented projection light source. The visual effect is as if the texture map were a slide projected onto the objects that make up the scene. During this process, homogeneous texture map coordinates are generated for the corresponding vertex points of each geometric primitive. The vertex points are defined in terms of a world coordinate system. The homogeneous texture map coordinates of the vertex points are generated using transformation matrices from world coordinates to screen/clip coordinates and from world coordinates to light source/texture map coordinates. Texture map coordinates for the remaining points of the geometric primitives are derived by interpolating the vertex homogeneous texture coordinates. During a span iteration process, final texture map coordinates are generated by dividing the two spatial homogeneous coordinates by the depth homogeneous coordinate.
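The per-vertex and per-span steps described in the abstract can be sketched as follows. This is a minimal illustration, not the patent's implementation: the matrix values and all names (`light_matrix`, `vertex_texcoords`, `span_iterate`) are hypothetical, standing in for the world-to-light/texture transform and the span-iteration stage.

```python
# Sketch of projective texture coordinate generation, assuming a
# hypothetical world -> light/texture transform. Not the patented method
# itself, only the coordinate math the abstract describes.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

# Illustrative light-projection matrix: scales world x, y into texture
# space and produces a homogeneous depth term q from distance along -z.
light_matrix = [
    [0.5, 0.0,  0.0, 0.0],
    [0.0, 0.5,  0.0, 0.0],
    [0.0, 0.0,  1.0, 0.0],
    [0.0, 0.0, -1.0, 0.0],   # q = -z_world (depth from the light)
]

def vertex_texcoords(world_pt):
    """Per-vertex step: homogeneous texture coordinates (s, t, q)."""
    x, y, z, w = mat_vec(light_matrix, list(world_pt) + [1.0])
    return (x, y, w)  # keep s, t, q; drop the third row for a 2D texture

def span_iterate(tc0, tc1, steps):
    """Per-span step: interpolate (s, t, q) linearly, divide per pixel."""
    out = []
    for i in range(steps + 1):
        a = i / steps
        s = tc0[0] + a * (tc1[0] - tc0[0])
        t = tc0[1] + a * (tc1[1] - tc0[1])
        q = tc0[2] + a * (tc1[2] - tc0[2])
        out.append((s / q, t / q))  # perspective-correct texel address
    return out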

REFERENCES:
patent: 4586038 (1986-04-01), Sims et al.
patent: 4709231 (1987-11-01), Sakaibara et al.
patent: 4727365 (1988-02-01), Bunker et al.
patent: 4807158 (1989-02-01), Blanton et al.
patent: 4821212 (1989-04-01), Heartz
patent: 4845651 (1989-07-01), Aizawa et al.
patent: 4928250 (1990-05-01), Greenberg et al.
patent: 4943938 (1990-07-01), Aoshima et al.
patent: 5025405 (1991-06-01), Swanson
patent: 5083287 (1992-01-01), Obata et al.
patent: 5239624 (1993-08-01), Cook et al.
patent: 5268996 (1993-12-01), Steiner et al.
patent: 5282262 (1994-01-01), Kurashige
patent: 5295199 (1994-03-01), Shino
patent: 5305430 (1994-04-01), Glassner
patent: 5307450 (1994-04-01), Grossman
patent: 5317689 (1994-05-01), Nack et al.
patent: 5321797 (1994-06-01), Morton
patent: 5325471 (1994-06-01), Inoue
patent: 5325472 (1994-06-01), Horiuchi et al.
patent: 5333245 (1994-07-01), Vecchione
patent: 5361386 (1994-11-01), Watkins et al.
patent: 5363477 (1994-11-01), Kuragano et al.
patent: 5367615 (1994-11-01), Economy et al.
patent: 5469535 (1995-11-01), Jarvis et al.
patent: 5537638 (1996-07-01), Morita et al.
patent: 5566283 (1996-10-01), Modegi et al.
"Computer Graphics: Principles and Practice, Second Edition", Foley et al., Addison-Wesley Publishing Company, pp. 745-756.
"Casting Curved Shadows on Curved Surfaces", Williams, Proceedings of SIGGRAPH '78, pp. 270-274.
"Rendering Antialiased Shadows with Depth Maps", Reeves, Salesin, and Cook, Proceedings of SIGGRAPH '87, pp. 283-291.
"The RenderMan Companion", Steve Upstill, Addison-Wesley Publishing Company, 1990, Plate 10.
"High-Performance Polygon Rendering", Kurt Akeley & Tom Jermoluk, Computer Graphics, vol. 22, No. 4, Aug. 1988, pp. 239-246.
"Solid Texturing of Complex Surfaces", Peachey, Darwyn R., SIGGRAPH, vol. 19, No. 3, pp. 279-286 (Jul. 22-26, 1985).
"Fast Shadows and Lighting Effects Using Texture Mapping", Segal, Mark et al., Computer Graphics, vol. 26, No. 2, pp. 249-252 (Jul. 1992).


Profile ID: LFUS-PAI-O-1292427
