Interface apparatus for dynamic positioning and orientation...

Computer graphics processing and selective visual display system – Computer graphics processing – Three-dimension

Reexamination Certificate


Details

C345S581000, C700S245000, C700S264000, C382S153000, C382S154000

Reexamination Certificate

active

06642922

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to an apparatus that provides an interface for the operations required in each step of designing, developing, and controlling a robot.
2. Description of the Related Art
Recently, various robots that perform tasks in place of people, such as industrial robots and experimental/laboratory robots, have been developed for practical use in many fields. Among these robots, an arm-type robot equipped with a manipulator (robot arm) is distinguished by its ability to perform manual operations.
An end effector is provided at the tip of the manipulator. It acts directly on a work object, for example by holding or moving it. A typical end effector is a gripper (robot hand) for holding an object.
A computer system for robot simulation provides functions for designing a robot, running simulation operations, visualizing the results, etc. The simulated phenomena include kinematics, dynamics, control, etc. References on robot simulation include 'Basic Robot Engineering Control' (by Kensuke Hasegawa and Ryosuke Masuda, published by Shokodo) and 'Robotics' (by Shigeki Tohyama, published by Daily Industry News Press).
The motion simulation of an arm-type robot is performed mainly by two methods: a forward kinematics simulation and an inverse kinematics simulation. In the forward kinematics method, the rotation angles of the manipulator joints are given as input data, and the position and orientation of the end effector are output. Conversely, in the inverse kinematics method, the position and orientation of the end effector are given, and the joint rotation angles are output.
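The two directions can be illustrated with a minimal sketch for a planar two-link arm. This is not the apparatus of the invention, only a textbook example; the link lengths and function names are illustrative assumptions.

```python
import math

# Illustrative link lengths for a hypothetical planar two-link arm.
L1, L2 = 1.0, 1.0

def forward_kinematics(theta1, theta2):
    """Forward kinematics: joint angles (rad) -> end-effector position (x, y)."""
    x = L1 * math.cos(theta1) + L2 * math.cos(theta1 + theta2)
    y = L1 * math.sin(theta1) + L2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y):
    """Inverse kinematics: end-effector position -> joint angles.

    Returns the elbow-down solution; a second (elbow-up) solution also
    exists, which is one reason inverse kinematics is the harder direction.
    """
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp against rounding
    k1 = L1 + L2 * math.cos(theta2)
    k2 = L2 * math.sin(theta2)
    theta1 = math.atan2(y, x) - math.atan2(k2, k1)
    return theta1, theta2
```

Running the forward map and then the inverse map on its output recovers the original joint angles, which is a convenient consistency check for both routines.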
The position and orientation of the end effector, and the joint rotation angles, are defined in coupled coordinate systems in three-dimensional space. Each parameter (position, orientation, or rotation angle) is expressed relative to its parent frame in the chain of coordinate systems. A simulation result computed from such input data is normally visualized in three-dimensional computer graphics (three-dimensional CG) for a visual check.
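The "coupled coordinate systems" idea is conventionally realized by chaining homogeneous transforms, each relative to its parent frame. The sketch below, again for an assumed planar two-link arm, composes the relative frames base→joint 1→joint 2→effector; it is a standard-textbook construction, not the invention's own representation.

```python
import math

def rot(theta):
    """3x3 planar homogeneous transform: rotation about the out-of-plane axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def trans(dx, dy):
    """3x3 planar homogeneous transform: pure translation."""
    return [[1.0, 0.0, dx], [0.0, 1.0, dy], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def end_effector(theta1, theta2, l1=1.0, l2=1.0):
    """Compose the relative (coupled) frames and read off the effector position."""
    t = matmul(matmul(matmul(rot(theta1), trans(l1, 0.0)), rot(theta2)),
               trans(l2, 0.0))
    return t[0][2], t[1][2]  # translation column of the composed transform
```

Because each transform is relative to the previous frame, changing one joint angle automatically moves every frame downstream of it, which is exactly the coupling the text describes.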
However, the conventional technology has the following problem.
When a robot is positioned in a simulation, the operator must input either the joint rotation angles or the position and orientation of the end effector. To determine this input data, the rotation angle and moving distance in each coordinate system must be set. However, it is difficult for an operator to predict the resulting rotation and movement in three-dimensional space.
To make the motion easier to predict, a coordinate system representing the position and orientation of the end effector is generally defined, and a line parallel to one of its coordinate axes, or a plane parallel to the plane spanned by two of its axes, is set. By restricting the movable range of the end effector to that line or plane, positioning in a three-dimensional space is reduced to a one- or two-dimensional problem.
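The dimensional reduction amounts to projecting a 3-D target onto the constraint plane and re-expressing it in two in-plane coordinates. A minimal sketch of that geometry, with hypothetical function names (the patent does not specify an implementation):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def project_to_plane(p, p0, n):
    """Project point p onto the plane through p0 with unit normal n."""
    d = dot(sub(p, p0), n)
    return [pi - d * ni for pi, ni in zip(p, n)]

def plane_coords(q, p0, u_axis, v_axis):
    """Express an in-plane point q as 2-D coordinates along two
    orthonormal in-plane axes; this is the 1-D/2-D reduction."""
    d = sub(q, p0)
    return dot(d, u_axis), dot(d, v_axis)
```

For the plane z = 0 through the origin, projecting (1, 2, 3) simply drops the z component, and the in-plane coordinates along the x and y axes are (1, 2).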
With the positioning method described above, it is relatively easy to predict movement of the end effector along the predetermined axis-parallel line or within the predetermined axis-parallel plane. For any other movement, however, all position and orientation parameters must be set and input in the full three-dimensional space.
Therefore, with this method it is difficult to predict positions outside the predetermined coordinate system, and a plurality of applicable coordinate systems must be set in advance. To support arbitrary positioning in three-dimensional space, however, an impractically large number of coordinate systems would have to be prepared, with every rotation of the coordinate system taken into account.
Moreover, although a simulation result is rendered in three-dimensional CG on the screen, the display window itself is a two-dimensional plane, so it is not well suited to displaying a positioning result in three-dimensional space.
SUMMARY OF THE INVENTION
The present invention aims at providing an interface apparatus that lets an operator predict the movement of a robot when the robot is arbitrarily positioned in three-dimensional space, thereby improving operational efficiency.
According to the first aspect of the present invention, the interface apparatus includes a plane designation unit and a display unit. The plane designation unit designates an arbitrary plane in a three-dimensional space. The display unit displays the image of a robot on the designated plane in the graphics representation.
When a specific portion of a robot is operated, the plane designation unit specifies a plane containing the current position and the target position of the portion, and the display unit displays the plane.
According to the second aspect of the present invention, the interface apparatus includes a mapping unit, a display unit, a plane designation unit, and a change unit. The mapping unit associates a plane in a three-dimensional space with a display plane. The display unit displays an image of the robot on the plane in the three-dimensional space on the display plane in the graphics representation. The plane designation unit designates a parameter indicating the position and orientation of the plane in the three-dimensional space. The change unit changes the position and orientation of the plane by changing the parameter.
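One way to picture the second aspect is a small mapping object: a plane in 3-D space is parameterized by an origin and an orientation angle, in-plane display coordinates map to world points and back, and a parameter setter plays the role of the change unit. The class, its parameterization (a single yaw angle for a vertical plane), and all names are hypothetical illustrations, not the patent's claimed structure.

```python
import math

class PlaneMapping:
    """Hypothetical sketch: a vertical plane in 3-D space parameterized by
    an origin and a rotation (yaw) about the vertical axis. Display
    coordinates (u, v) are distances along the plane's horizontal and
    vertical in-plane axes."""

    def __init__(self, origin, yaw):
        self.origin = list(origin)
        self.yaw = yaw

    def to_world(self, u, v):
        """Mapping unit, display -> 3-D: place (u, v) into the plane."""
        c, s = math.cos(self.yaw), math.sin(self.yaw)
        ox, oy, oz = self.origin
        return [ox + c * u, oy + s * u, oz + v]

    def to_display(self, p):
        """Mapping unit, 3-D -> display: coordinates of an in-plane point."""
        c, s = math.cos(self.yaw), math.sin(self.yaw)
        dx, dy, dz = (p[i] - self.origin[i] for i in range(3))
        return [c * dx + s * dy, dz]

    def set_parameters(self, origin=None, yaw=None):
        """Change unit: reposition/reorient the plane by changing parameters."""
        if origin is not None:
            self.origin = list(origin)
        if yaw is not None:
            self.yaw = yaw
```

Because the display is driven entirely by the plane's parameters, changing them through `set_parameters` immediately re-aims the two-dimensional view, which mirrors how the change unit and display unit cooperate in the text.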


REFERENCES:
patent: 3892051 (1975-07-01), Bunker
patent: 4831548 (1989-05-01), Matoba et al.
patent: 4987527 (1991-01-01), Hamada et al.
patent: 5046022 (1991-09-01), Conway et al.
patent: 5253189 (1993-10-01), Kramer
patent: 5488689 (1996-01-01), Yamato et al.
patent: 5495410 (1996-02-01), Graf
patent: 5581666 (1996-12-01), Anderson
patent: 5771310 (1998-06-01), Vannah
patent: 6023276 (2000-02-01), Kawai et al.
patent: 6104412 (2000-08-01), Tsutsuguchi et al.
patent: 6226567 (2001-05-01), Kaneko et al.
patent: 6556206 (2003-04-01), Benson et al.
Kensuke Hasegawa et al., Principles of Robotics: Modelling, Control and Sensing, Nov. 10, 1995, pp. 52-56.
