Three-dimensional location-based texture transfers

Computer graphics processing and selective visual display system – Computer graphics processing – Attributes

Reexamination Certificate


Details

C345S420000, C345S428000, C345S552000, C345S581000, C382S285000, C382S293000, C382S300000, C463S032000

active

08040355

ABSTRACT:
Textures are transferred between different object models using a point cloud. In a first phase, a point cloud in 3-D space is created to represent a texture map as applied to a first, or “source,” object model. In a second phase, the value of a target texel in a texture map associated with a second, or “target,” object model is determined by identifying the 3-D location on the surface defined by the target object model that maps to the target texel’s location, and assigning a value based on the nearest point (or points) to that location in the 3-D point cloud. To the extent that differences between the source and target object models are minor, the texture transfer can be accomplished without loss of information or manual cleanup.
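The two-phase transfer described in the abstract can be sketched as follows. This is a minimal illustration under simplifying assumptions (brute-force nearest-neighbor search, dictionary-based texture maps, and surface functions mapping texel coordinates to 3-D positions); the function and parameter names are hypothetical and do not come from the patent itself:

```python
def build_point_cloud(source_surface, source_texture):
    """Phase 1: for each source texel, record its 3-D position on the
    source model's surface together with the texel's value, producing
    a point cloud that represents the applied texture map."""
    cloud = []
    for (u, v), value in source_texture.items():
        position = source_surface(u, v)  # (x, y, z) on the source model
        cloud.append((position, value))
    return cloud

def transfer_texture(cloud, target_surface, target_texels):
    """Phase 2: for each target texel, find the 3-D surface location it
    maps to and assign the value of the nearest point in the cloud."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    target_texture = {}
    for (u, v) in target_texels:
        location = target_surface(u, v)
        _, value = min(cloud, key=lambda pv: dist2(pv[0], location))
        target_texture[(u, v)] = value
    return target_texture
```

For example, with identical source and target surfaces the transfer reproduces the source texture exactly; a real implementation would use a spatial index (e.g. a k-d tree) rather than the linear scan shown here, and could blend several nearest points instead of copying one.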

REFERENCES:
patent: 5856829 (1999-01-01), Gray et al.
patent: 6434278 (2002-08-01), Hashimoto
patent: 2006/0017741 (2006-01-01), Sekine et al.
patent: 2006/0232583 (2006-10-01), Petrov et al.
