Method and system for classifying user objects in a three-dimensional environment

Computer graphics processing and selective visual display systems – Computer graphics processing – Three-dimensional

Patent


Details

Classification: 345/418; G06T 17/00

Type: Patent

Status: active

Application number: 061115819

ABSTRACT:
A method and system for classifying user objects in a three-dimensional environment on a display of a computer system is disclosed. The method and system comprise providing a set of standardized classes of user objects and defining the standardized classes based upon a user's needs. The classification is directed toward objects relevant to the tasks of organizing the 3D environment, navigating through the 3D environment, and performing useful work within it. The distinction between classes of objects in the classification is based on user needs and is reflected in the properties and behaviors of objects as perceived by the users.
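The abstract describes a task-based taxonomy in which an object's class determines the behaviors a user perceives. A minimal sketch of that idea follows; the class names, behavior lists, and all identifiers are hypothetical illustrations, since the patent does not specify an implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto


class ObjectClass(Enum):
    # Hypothetical standardized classes, one per user task named in the abstract.
    CONTAINER = auto()  # organizing the 3D environment
    LANDMARK = auto()   # navigating through the 3D environment
    TOOL = auto()       # performing useful work


@dataclass
class UserObject:
    name: str
    object_class: ObjectClass

    def behaviors(self) -> list[str]:
        # The behaviors a user perceives follow from the object's class,
        # not from the individual object (illustrative values only).
        return {
            ObjectClass.CONTAINER: ["open", "close", "group"],
            ObjectClass.LANDMARK: ["fly_to", "orient"],
            ObjectClass.TOOL: ["activate", "apply"],
        }[self.object_class]


desk = UserObject("desk", ObjectClass.CONTAINER)
print(desk.behaviors())  # ['open', 'close', 'group']
```

The sketch shows only the structural point of the claim: class membership, decided by user need, fixes an object's perceived properties and behaviors.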

REFERENCES:
patent: 5261044 (1993-11-01), Dev et al.
patent: 5276785 (1994-01-01), Mackinlay et al.
patent: 5751931 (1998-05-01), Cox et al.
Mackinlay, Jock D.; Card, Stuart K.; Robertson, George G.; Rapid Controlled Movement Through a Virtual 3D Workspace, Computer Graphics, vol. 24, No. 4, Aug. 1990.
