Computer graphics processing and selective visual display system – Display peripheral interface input device
Reexamination Certificate
1997-06-06
2001-06-12
Nguyen, Chanh (Department: 2675)
Computer graphics processing and selective visual display system
Display peripheral interface input device
C345S161000
active
06246390
ABSTRACT:
BACKGROUND OF THE INVENTION
The present invention relates generally to interface devices between humans and computers, and more particularly to computer input devices having three-dimensional input.
Virtual reality computer systems provide users with the illusion that they are part of a “virtual” environment. A virtual reality system will typically include a computer processor, such as a personal computer or workstation, specialized virtual reality software, and virtual reality I/O devices such as head mounted displays, sensor gloves, three dimensional (“3D”) pointers, etc.
One common use for virtual reality computer systems is for training. In many fields, such as aviation and vehicle and systems operation, virtual reality systems have been used successfully to allow a user to learn from and experience a realistic “virtual” environment. The appeal of using virtual reality computer systems for training relates, in part, to the ability of such systems to allow trainees the luxury of confidently operating in a highly realistic environment and making mistakes without “real world” consequences. Thus, for example, a trainee pilot or automobile driver can learn to operate a vehicle using a virtual reality simulator without concern for accidents that would cause injury, death and/or property damage in the real world. Similarly, operators of complex systems, e.g., nuclear power plants and weapons systems, can safely practice a wide variety of training scenarios that would risk life or property if performed in reality.
For example, a virtual reality computer system can allow a doctor-trainee or other human operator or user to “manipulate” a scalpel or probe within a computer-simulated “body”, and thereby perform medical procedures on a virtual patient. In this instance, the I/O device, which is typically a 3D pointer, stylus, or the like, is used to represent a surgical instrument such as a scalpel or probe. As the “scalpel” or “probe” moves within a provided space or structure, results of such movement are updated and shown in a body image displayed on the screen of the computer system, so that the operator can gain the experience of performing such a procedure without practicing on an actual human being or a cadaver.
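The update path from tracked instrument motion to the displayed body image can be pictured as a simple polling loop: read the probe's physical pose, map it into the simulated body's coordinate frame, and redraw the virtual instrument. The sketch below is a minimal illustration only, not the implementation described here; `read_probe_pose`, `render_body_image`, and the uniform scaling are hypothetical placeholders for whatever sensor and graphics interfaces a given system provides.

```python
# Hypothetical sketch: map tracked probe motion into the displayed body image.
# read_probe_pose() and render_body_image() stand in for a real sensor API and
# graphics pipeline; they are assumptions, not part of this disclosure.
from dataclasses import dataclass


@dataclass
class Pose3D:
    x: float
    y: float
    z: float


def update_virtual_instrument(read_probe_pose, render_body_image, scale=1.0):
    """Poll the tracked probe and redraw the virtual scalpel/probe."""
    pose = read_probe_pose()                  # physical position from sensors
    screen_pose = Pose3D(pose.x * scale,      # simple uniform mapping into the
                         pose.y * scale,      # simulated body's coordinate frame
                         pose.z * scale)
    render_body_image(instrument_at=screen_pose)
```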
In other applications, virtual reality computer systems allow a user to handle and manipulate the controls of complicated and expensive vehicles and machinery. For example, a pilot or astronaut in training can operate a fighter aircraft or spacecraft by manipulating controls such as a control joystick and other buttons and view the results of controlling the aircraft on a virtual reality simulation of the aircraft flying. In yet other applications, a user can manipulate objects and tools in the real world, such as a stylus, and view the results of the manipulation in a virtual reality world with a “virtual stylus” viewed on a screen, in 3-D goggles, etc.
For virtual reality systems to provide a realistic (and therefore effective) experience for the user, sensory feedback and manual interaction should be as natural as possible. As virtual reality systems become more powerful and as the number of potential applications increases, there is a growing need for specific human/computer interface devices which allow users to interface with computer simulations with tools that realistically emulate the activities being represented within the virtual simulation. Such procedures as laparoscopic surgery, catheter insertion, and epidural analgesia should be realistically simulated with suitable human/computer interface devices if the doctor is to be properly trained. Similarly, a user should be provided with a realistic interface for manipulating controls or objects in a virtual reality simulation to gain useful experience.
While the state of the art in virtual simulation and medical imaging provides a rich and realistic visual feedback, there is a great need for new human/computer interface tools which allow users to perform natural manual interactions with the computer simulation. For medical simulation, there is a strong need to provide doctors with a realistic mechanism for performing the manual activities associated with medical procedures while allowing a computer to accurately keep track of their actions. There is also a need in other simulations to provide virtual reality users with accurate and natural interfaces for their particular tasks.
In addition to sensing and tracking a user's manual activity and feeding such information to the controlling computer to provide a 3D visual representation to the user, a human interface mechanism should also provide force or tactile (“haptic”) feedback to the user. The need for the user to obtain realistic tactile information and experience tactile sensation is extensive in many kinds of simulation. For example, in medical/surgical simulations, the “feel” of a probe or scalpel simulator is important as the probe is moved within the simulated body. It would be invaluable to a medical trainee to learn how an instrument moves within a body, how much force is required depending on the operation performed, the space available in a body to manipulate an instrument, etc. In simulations of vehicles or equipment, force feedback for controls such as a joystick can be necessary to realistically teach a user the force required to move the joystick when steering in specific situations, such as in a high acceleration environment of an aircraft. In virtual world simulations where the user can manipulate objects, force feedback is necessary to realistically simulate physical objects; for example, if a user touches a pen to a table, the user should feel the impact of the pen on the table. An effective human interface not only acts as an input device for tracking motion, but also as an output device for producing realistic tactile sensations. A “high bandwidth” interface system, which is an interface that accurately responds to signals having fast changes and a broad range of frequencies as well as providing such signals accurately to a control system, is therefore desirable in these and other applications.
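As a rough illustration of how such an interface acts simultaneously as an input device (tracking motion) and an output device (producing tactile sensations), the following sketch shows a simplified haptic servo loop that renders contact with a single virtual wall as a spring force. It is a toy model under stated assumptions, not the apparatus described here; the stiffness value, the 1 kHz update rate, and the `read_position`/`apply_force` hooks are all hypothetical.

```python
import time


def haptic_loop(read_position, apply_force, wall_x=0.0, stiffness=500.0,
                rate_hz=1000, duration_s=1.0):
    """Toy haptic servo loop: push back when the probe penetrates a virtual wall.

    read_position() -> float and apply_force(float) are hypothetical hooks for a
    real sensor/actuator interface; positions are in meters, stiffness in N/m.
    The virtual wall occupies the region x < wall_x.
    """
    period = 1.0 / rate_hz
    t_end = time.time() + duration_s
    while time.time() < t_end:
        x = read_position()               # track the user's motion (input role)
        penetration = wall_x - x          # depth of penetration into the wall
        force = stiffness * penetration if penetration > 0 else 0.0
        apply_force(force)                # render the contact force (output role)
        time.sleep(period)                # approximate a fixed-rate update loop
```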
There are a number of devices that are commercially available for interfacing a human with a computer for virtual reality simulations. There are, for example, 2-dimensional input devices such as mice, trackballs, and digitizing tablets. However, 2-dimensional input devices tend to be awkward and inadequate to the task of interfacing with 3-dimensional virtual reality simulations.
Other 3-dimensional interface devices are available. A 3-dimensional human/computer interface tool sold under the trademark Immersion PROBE™ is marketed by Immersion Human Interface Corporation of Santa Clara, Calif., and allows manual control in 3-dimensional virtual reality computer environments. A pen-like stylus allows for dexterous 3-dimensional manipulation, and the position and orientation of the stylus is communicated to a host computer. The Immersion PROBE has six degrees of freedom which convey spatial coordinates (x, y, z) and orientation (roll, pitch, yaw) of the stylus to the host computer.
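To make the six-degree-of-freedom report concrete, a stylus pose of this kind is commonly represented as three position coordinates plus three orientation angles and streamed to the host computer. The sketch below shows one plausible encoding; the field layout, units, and wire format are illustrative assumptions, not the Immersion PROBE's actual protocol.

```python
import struct
from dataclasses import dataclass


@dataclass
class StylusPose:
    """Six degrees of freedom: spatial coordinates plus orientation angles."""
    x: float      # position, e.g. millimeters (units assumed for illustration)
    y: float
    z: float
    roll: float   # orientation, e.g. radians
    pitch: float
    yaw: float

    def pack(self) -> bytes:
        # Pack as six little-endian 32-bit floats for transmission to the host.
        # This wire format is a hypothetical example, not Immersion's protocol.
        return struct.pack("<6f", self.x, self.y, self.z,
                           self.roll, self.pitch, self.yaw)


# Example: encode one pose sample before sending it to the host computer.
sample = StylusPose(x=12.5, y=-3.0, z=40.2, roll=0.1, pitch=0.0, yaw=1.57)
payload = sample.pack()   # 24 bytes: x, y, z, roll, pitch, yaw
```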
While the Immersion PROBE is an excellent 3-dimensional interface tool, it may be inappropriate for certain virtual reality simulation applications. For example, in some of the aforementioned medical simulations three or four degrees of freedom of a 3-dimensional human/computer interface tool are sufficient and, often, more desirable than five or six degrees of freedom because they more accurately mimic the real-life constraints of the actual medical procedure. More importantly, the Immersion PROBE does not provide force feedback to a user and thus does not allow a user to experience an entire sensory dimension in virtual reality simulations.
In typical multi-degree of freedom apparatuses that include force feedback, there are several disadvantages. Since actuators which supply force feedback tend to be heavier and larger than sensors, they would provide inertial constraints if added to a device such as the Immersion PROBE. There is also the problem of coupled actuators.
Immersion Corporation
Nguyen Chanh
Riegel James R.
Tucker Guy V.