Ray based interaction system

Data processing: structural design, modeling, simulation, and emulation – Electrical analog simulator – Of electrical device or system

Details

Classification codes: C703S006000, C345S184000
Type: Reexamination Certificate
Status: active
Patent number: 06704694

FIELD OF THE INVENTION
The present invention relates generally to haptic rendering and more particularly to a ray-based haptic rendering technique for touching and feeling arbitrary three-dimensional (3D) polyhedral objects in virtual environments (VEs).
BACKGROUND OF THE INVENTION
As is known in the art, advances in virtual reality and robotics have enabled the human tactual system to be stimulated in a controlled manner through force-feedback devices, also referred to as haptic interfaces. A haptic interface is a device that enables manual interaction with virtual environments or teleoperated remote systems. Such systems are typically used for tasks that are usually performed using hands in the real world.
Force-feedback devices generate computer-controlled forces to convey to the user a sense of natural feel of the virtual environment and objects within it. In this regard, haptic rendering can be defined as the process of displaying computer-controlled forces on the user to make the user sense the tactual feel of virtual objects.
Haptic interface systems typically include an end effector or probe with which the user interacts with the haptic interface system. Conventional haptic interface systems represent or model the probe as a single point in a virtual environment. Such haptic systems are thus referred to as point-based systems.
Through the single-point representation of the probe, the user is able to explore the shape, surface details, and material properties of the virtual objects. Since the probe is represented as a point, the net force is the only haptic feedback that can be sent to the user. For exploring the shape and surface properties of objects in virtual environments (VEs), point-based techniques provide users with force feedback similar to what they would experience when exploring the objects in real environments with the tip of a stick. While point-based haptic systems are computationally efficient, they only enable the user to feel interaction forces, not torques. Thus, one problem with point-based techniques is that they fail to simulate tool-object interactions that involve multiple constraints.
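For concreteness, the following is a minimal sketch of the penalty (spring-law) computation a point-based renderer typically performs once the closest surface point and outward normal are known; the function name, stiffness value, and calling convention are illustrative assumptions rather than details taken from the present invention.

```python
import numpy as np

def point_based_force(probe_pos, surface_point, surface_normal, stiffness=500.0):
    """Penalty-style force for a point probe penetrating a virtual surface.

    probe_pos      : 3-vector, current haptic interface point
    surface_point  : 3-vector, closest point on the object's surface
    surface_normal : 3-vector, outward unit normal at that surface point
    stiffness      : spring constant in N/m (illustrative value)
    """
    penetration = np.dot(surface_point - probe_pos, surface_normal)
    if penetration <= 0.0:            # probe is outside the object: no contact
        return np.zeros(3)
    # Hooke's-law restoring force pushes the probe back out along the normal.
    return stiffness * penetration * surface_normal

# Example: probe 2 mm below a horizontal surface at z = 0 feels 1 N upward.
f = point_based_force(np.array([0.0, 0.0, -0.002]),
                      np.array([0.0, 0.0, 0.0]),
                      np.array([0.0, 0.0, 1.0]))
print(f)   # [0. 0. 1.]
```

Because the probe is a single point, the result is only a net force; there is no lever arm from which a torque could be derived, which is precisely the limitation noted above.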
To simulate real tool-object interactions, the computational model of the simulated tool cannot be reduced to a single point since the simulated tool must be able to contact multiple objects and/or different points of the same object simultaneously as does a real tool. Moreover, the resulting reaction torques have to be computed and reflected to the user to make the simulation of haptic interactions more realistic.
Several haptic rendering techniques have been developed to render 3-D objects. Just as in computer graphics, the representation of 3-D objects can be either surface-based or volume-based for the purposes of computer haptics. While the surface models are based on parametric or polygonal representations, volumetric models are made of voxels.
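As a purely illustrative sketch of the two representation families, a polygonal surface model can be stored as vertex and face arrays, while a volumetric model can be stored as an occupancy grid of voxels; the placeholder object and grid resolution below are assumptions made only for illustration.

```python
import numpy as np

# Surface-based representation: a polygonal (triangle) mesh, stored as an
# array of vertex positions and an array of triangles indexing into it.
# (A unit tetrahedron is used here purely as a placeholder object.)
vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
triangles = np.array([[0, 1, 2],
                      [0, 1, 3],
                      [0, 2, 3],
                      [1, 2, 3]])

# Volume-based representation: a regular grid of voxels, each cell simply
# recording whether it is occupied by the object (a boolean occupancy grid).
resolution = 32                      # voxels per axis, illustrative value
voxels = np.zeros((resolution, resolution, resolution), dtype=bool)
voxels[8:24, 8:24, 8:24] = True      # mark a solid block as occupied
```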
Although a single point is not sufficient for simulating the force and torque interactions between three-dimensional (3-D) objects, one approach to allow such simulation is to use a group of points. For example, voxel-based approaches for six degree-of-freedom (6-DOF) haptic rendering have been proposed. In this approach, static objects in a scene are divided into voxels and the probe is modeled as a set of surface points. Then, multiple collisions are detected between the surface points of the probe and each voxel of the static object to reflect forces based on a tangent-plane force model. A tangent plane, whose normal is along the direction of the collided surface point, is constructed at the center of each collided voxel. Then, the net force and torque acting on the probing object are obtained as the summation of all force/torque contributions from such point-voxel intersections. Although this approach enables 6-DOF haptic interactions with static rigid objects, its extension to dynamic and deformable objects would significantly reduce the haptic update rate because of the computational load. Moreover, it is difficult to render thin or small objects with this approach.
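The summation described above can be sketched as follows, assuming a linear penalty force per collided voxel; the tuple layout of the collision list, the stiffness value, and the function name are illustrative assumptions and not the published algorithm's actual data structures.

```python
import numpy as np

def voxel_net_wrench(collisions, probe_center, stiffness=300.0):
    """Sum per-voxel force contributions into a net force and torque.

    collisions   : list of (voxel_center, plane_normal, penetration) tuples,
                   one entry per collided voxel of the static object
    probe_center : 3-vector, reference point on the probing object
    stiffness    : penalty stiffness in N/m (illustrative value)
    """
    net_force = np.zeros(3)
    net_torque = np.zeros(3)
    for voxel_center, plane_normal, penetration in collisions:
        # Per-voxel force along the tangent plane's normal, scaled by how far
        # the probe's surface point has penetrated past that plane.
        f = stiffness * penetration * np.asarray(plane_normal, dtype=float)
        net_force += f
        # Torque contribution: lever arm from the probe's reference point
        # to the collided voxel's center, crossed with the per-voxel force.
        net_torque += np.cross(np.asarray(voxel_center, dtype=float) - probe_center, f)
    return net_force, net_torque

# Example: two collided voxels on opposite sides of the probe's center.
force, torque = voxel_net_wrench(
    [(np.array([0.01, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]), 0.001),
     (np.array([-0.01, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]), 0.002)],
    probe_center=np.zeros(3))
```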
Another problem with conventional point-based and voxel-based systems is that they do not allow realistic representations of side collisions, such as a collision between an object and a portion of the probe other than an end point. Such systems also fail to provide realistic representations of tools having a length, such as surgical instruments.
It would, therefore, be desirable to provide a haptic system in which the probe is not modeled as a single point. It would also be desirable to provide a haptic system which computes forces due to collisions between different portions of the probe and one or more virtual objects. It would further be desirable to provide a haptic system which computes and displays reaction forces and torques. It would be still further desirable to provide a ray-based haptic rendering technique that enables the user to touch and feel objects along the length of a probe.
SUMMARY OF THE INVENTION
A force-feedback system includes a probe modeled as a line segment, or ray, in a virtual environment.
With this arrangement, a ray-based force-feedback system is provided which can represent interactions between an object and any portion of the probe, including but not limited to an end point. The system can thus provide a realistic representation of a tool having a length, such as a surgical instrument. By connecting a pair of force-feedback devices, the ray-based force-feedback system of the present invention exposes the user to torques in addition to forces, both of which are essential in simulating tool-object interactions. By allowing a user to explore an object with a probe having a length, the ray-based rendering technique of the present invention provides better haptic perception of some 3D objects than existing point-based techniques.
The ray-based haptic rendering technique of the present invention enables the user to touch and feel convex polyhedral objects with a line segment model of the probe. The ray-based haptic rendering technique of the present invention not only computes the forces due to collisions between the probe and virtual objects, but also the torques that are required to be displayed in simulating many tool-handling applications. Since the real-time simulation of haptic interactions (force/torque) between a 3D tool and objects is computationally quite expensive, ray-based rendering can be considered as an intermediate step towards achieving this goal by simplifying the computational model of the tool. By modeling the probe as a line segment and utilizing the ray-based rendering technique of the present invention, users have a more rapid haptic perception of 3D convex objects than when the probe is modeled as a point.
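The present text does not prescribe a particular collision-detection routine, but touching convex polyhedral objects with a line-segment probe naturally reduces to segment-versus-convex-polyhedron queries. The sketch below uses standard Cyrus-Beck style clipping of the segment against the object's face half-spaces; the half-space representation and function name are assumptions made for illustration.

```python
import numpy as np

def clip_segment_to_convex(p0, p1, planes):
    """Clip the probe segment p0->p1 against a convex polyhedron.

    planes : list of (point_on_face, outward_unit_normal) pairs, one per face,
             so the interior is where dot(normal, x - point_on_face) <= 0.
    Returns (t_enter, t_exit) parameters of the sub-segment that lies inside
    the object, or None if the probe misses it entirely.
    """
    p0 = np.asarray(p0, dtype=float)
    p1 = np.asarray(p1, dtype=float)
    d = p1 - p0
    t_enter, t_exit = 0.0, 1.0
    for q, n in planes:
        n = np.asarray(n, dtype=float)
        denom = np.dot(n, d)
        dist = np.dot(n, p0 - np.asarray(q, dtype=float))  # signed distance of p0
        if abs(denom) < 1e-12:
            if dist > 0.0:           # parallel to this face and outside it
                return None
            continue
        t = -dist / denom
        if denom < 0.0:              # segment is entering this half-space
            t_enter = max(t_enter, t)
        else:                        # segment is leaving this half-space
            t_exit = min(t_exit, t)
        if t_enter > t_exit:         # no overlap left: segment misses the object
            return None
    return t_enter, t_exit

# Example: a unit cube and a probe passing straight through it along x.
cube = [((0, 0, 0), (-1, 0, 0)), ((1, 0, 0), (1, 0, 0)),
        ((0, 0, 0), (0, -1, 0)), ((0, 1, 0), (0, 1, 0)),
        ((0, 0, 0), (0, 0, -1)), ((0, 0, 1), (0, 0, 1))]
print(clip_segment_to_convex([-0.5, 0.5, 0.5], [1.5, 0.5, 0.5], cube))  # (0.25, 0.75)
```

The returned parameters bracket the portion of the probe inside the object; its endpoints serve as candidate contact points from which penetration depths, forces, and torques can then be derived.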
Haptic systems which model a probe and objects using ray-based techniques have several advantages over conventional haptic systems which model probes using point-based techniques. First of all, side collisions between the simulated tool and the 3D objects can be detected. Thus, a user can rotate the haptic probe around the corner of an object in continuous contact and get a better sense of the object's shape. Second, ray-based rendering provides a basis for displaying torques to the user. Using the ray-based rendering technique, one can compute the contact points, the depth of penetration, and the distances from the contact points to both ends of the probe. Then, this information can be used to determine the forces and torques that will be displayed to the user. Third, the ray that represents the probe can be extended to detect collisions with multiple layers of an object. This is especially useful in haptic rendering of compliant objects (e.g., soft tissue) or layered surfaces (e.g., the earth's soil) where each layer has different material properties and the forces/torques depend on the probe orientation. Fourth, it enables the user to touch and feel multiple objects at the same time. If the task involves the sim
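As an illustration of how the contact point, contact force, and the distances to the probe's two ends might be turned into commands for a pair of force-feedback devices, here is a minimal sketch assuming a simple rigid-lever split of each contact force between the ends; the split rule and function name are assumptions for illustration, not the method claimed here.

```python
import numpy as np

def end_point_loads(contact_point, contact_force, probe_start, probe_end):
    """Distribute a contact force acting along a segment probe to its two ends.

    Treating the probe as a rigid lever, the force is split between the ends
    in inverse proportion to the contact point's distance from each end.
    """
    contact_point = np.asarray(contact_point, dtype=float)
    contact_force = np.asarray(contact_force, dtype=float)
    probe_start = np.asarray(probe_start, dtype=float)
    probe_end = np.asarray(probe_end, dtype=float)

    length = np.linalg.norm(probe_end - probe_start)
    s = np.linalg.norm(contact_point - probe_start) / length  # 0 at start, 1 at end
    force_at_start = (1.0 - s) * contact_force
    force_at_end = s * contact_force
    # Torque about the probe's start (handle) point, for devices that can
    # render torque directly instead of a second force.
    torque_about_start = np.cross(contact_point - probe_start, contact_force)
    return force_at_start, force_at_end, torque_about_start

# Example: a 1 N upward contact force a quarter of the way along a 0.2 m probe.
loads = end_point_loads([0.05, 0.0, 0.0], [0.0, 0.0, 1.0],
                        [0.0, 0.0, 0.0], [0.2, 0.0, 0.0])
```

The split is chosen so that the two end loads sum to the contact force and produce the same torque about either end as the original contact force, which is what makes the reflected interaction feel like a rigid tool.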
