Occlusion reducing transformations for three-dimensional...

Computer graphics processing and selective visual display systems – Computer graphics processing – Three-dimension

Reexamination Certificate

Details

U.S. Classification: 345/418, 345/427
Type: Reexamination Certificate
Status: active
Application serial number: 10884978

ABSTRACT:
In a data processing system that executes a program of instructions, a method for generating a detail-in-context presentation of a three-dimensional information representation, comprising the steps of: selecting an object-of-interest in the information representation; selecting a viewpoint; selecting a path from the viewpoint to the object-of-interest; and displacing objects in the information representation away from the path to locations where they remain visible from the viewpoint yet do not occlude the object-of-interest, thereby generating the detail-in-context presentation.
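
The displacement the abstract describes is essentially a radial push away from the sight line: objects between the viewpoint and the object-of-interest are moved just far enough off the viewing path that they stop occluding the target, yet remain in view as context. The following is a minimal sketch of one such displacement, assuming point-like objects, a straight viewpoint-to-target path, and a linear falloff; the function and parameter names are hypothetical and do not come from the patent.

    # A minimal sketch, assuming point-like objects, a straight sight line,
    # and a linear radial falloff. Names and parameters are illustrative,
    # not taken from the patent.
    import numpy as np

    def clear_sight_line(points, viewpoint, target, clear_r=1.0, falloff=3.0):
        """Push points near the viewpoint-to-target sight line radially
        outward so the target is unoccluded while the moved points stay
        visible as nearby context."""
        points = np.asarray(points, dtype=float)
        viewpoint = np.asarray(viewpoint, dtype=float)
        target = np.asarray(target, dtype=float)

        axis = target - viewpoint
        length = np.linalg.norm(axis)
        axis_hat = axis / length
        outer = clear_r + falloff              # beyond this radius nothing moves

        out = points.copy()
        for i, p in enumerate(points):
            t = np.dot(p - viewpoint, axis_hat)   # position along the sight line
            if t <= 0.0 or t >= length:
                continue                          # cannot occlude the target
            foot = viewpoint + t * axis_hat       # nearest point on the line
            radial = p - foot
            d = np.linalg.norm(radial)
            if d >= outer:
                continue                          # already clear of the path
            if d < 1e-9:                          # exactly on the line: pick a normal
                radial = np.cross(axis_hat, np.array([0.0, 0.0, 1.0]))
                if np.linalg.norm(radial) < 1e-9:
                    radial = np.cross(axis_hat, np.array([0.0, 1.0, 0.0]))
            r_hat = radial / np.linalg.norm(radial)
            # Monotone remap: d = 0 maps to clear_r, d = outer maps to outer,
            # so the cylinder around the path empties while the relative
            # ordering of displaced objects is preserved.
            out[i] = foot + r_hat * (clear_r + d * (falloff / outer))
        return out

    # Example: scatter 200 points around the origin and clear a view of it.
    pts = np.random.uniform(-5.0, 5.0, size=(200, 3))
    unoccluded = clear_sight_line(pts, viewpoint=[0.0, 0.0, 12.0],
                                  target=[0.0, 0.0, 0.0])

With a linear remap like this, objects originally on the path end up at the clearing radius and objects near the outer boundary barely move, so the transition into the undistorted surroundings is continuous.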

REFERENCES:
patent: 5689628 (1997-11-01), Robertson
patent: 5999879 (1999-12-01), Yano
patent: 6160553 (2000-12-01), Robertson et al.
patent: 6842175 (2005-01-01), Schmalstieg et al.
Keahey, T.A., “The Generalized Detail-In-Context Problem”, Proceedings of the IEEE Symposium on Information Visualization 1998, Oct. 20-21, 1998, pp. 44-51, 152.
Carpendale, M.S.T., et al., “3-Dimensional Pliable Surfaces: For the Effective Presentation of Visual Information”, Proceedings of the 8th Annual ACM Symposium on User Interface Software and Technology, Dec. 1995.
Carpendale, M.S.T., Cowperthwaite, D.J., and Fracchia, F.D., “Extending Distortion Viewing from 2D to 3D”, IEEE Computer Graphics and Applications, vol. 17, no. 4, pp. 42-51, Jul./Aug. 1997.
Viega, J., Conway, M.J., Williams, G., and Pausch, R., “3D Magic Lenses”, Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology, pp. 51-58, ACM Press, New York, NY, 1996.
Cowperthwaite, D. J., “Occlusion Resolution Operators for Three-Dimensional Detail-In-Context” (Burnaby, British Columbia: Simon Fraser University, 2000).
Carpendale, M.S.T., “A Framework for Elastic Presentation Space” (Burnaby, British Columbia: Simon Fraser University, 1999).
Carpendale, M.S.T., et al., “Exploring Distinct Aspects of the Distortion Viewing Paradigm”, Technical Report TR 97-08, Simon Fraser University, Burnaby, BC, Sep. 1997.
Cowperthwaite, D.J., et al., “Visual Access for 3D Data”, Proceedings of ACM CHI 96 Conference, pp. 175-176, 1996.
Keahey, T.A., “Visualization of High-Dimensional Clusters Using Nonlinear Magnification”, Technical Report LA-UR-98-2776, Los Alamos National Laboratory, 1998.
Tigges, M., et al., “Generalized Distance Metrics for Implicit Surface Modeling”, Proceedings of the Tenth Western Computer Graphics Symposium, Mar. 1999.
Bossen, F.J., “Anisotropic Mesh Generation With Particles”, Technical Report CMU-CS-96-134, CS Dept, Carnegie Mellon University, May 1996.
Bossen, F.J., et al., “A Pliant Method for Anisotropic Mesh Generation”, 5th Intl. Meshing Roundtable, pp. 63-74, Oct. 1996.
Wilson, et al., “Direct Volume Rendering Via 3D Textures”, Technical Report UCSC-CRL-94-19, University of California, Santa Cruz, Jack Baskin School of Engineering, Jun. 1994.
