Method and system for remote control of mobile robot

Data processing: generic control systems or specific application – Specific application – apparatus or process – Robot control

Reexamination Certificate

Details

Cross-reference classifications: C700S245000, C700S248000, C318S628000, C600S595000, C434S262000, C434S267000, C345S156000, C345S184000

Type: Reexamination Certificate
Status: active
Patent number: 06535793

ABSTRACT:

FIELD OF THE INVENTION
The present invention relates generally to the remote control of a mobile robot, and deals more particularly with methods of and systems for tele-operating a robot with an intuitive graphical interface.
BACKGROUND
This invention has utility with the remote control of a wide variety of tele-operated robots and vehicles. While the description provided herein describes the methods and systems of the present invention in relationship to a specific mobile robot, the invention is not so limited. One of skill in the art will recognize that the methods and systems described herein have broad applicability for the remote control of robotic devices.
As robots become increasingly common and capable, there will be an increasing need for an intuitive method of and system for remotely controlling the robots. For example, users may have remote access to robots with which they are otherwise unfamiliar. Just as a licensed driver feels comfortable operating a rental car she has never before encountered, so should she feel comfortable remotely operating an unfamiliar robot; to date, this has not been possible.
While in many situations a robot can be operated locally with the user in visual contact with the robot, in many other situations it is advantageous to have the robot tele-operated. For example, in situations where the robot must operate in hazardous or dangerous conditions—e.g., the transport of hazardous materials, search and rescue operations, military and law enforcement applications—tele-operation of the robot is particularly beneficial.
In some existing systems, a camera is carried by a robot and pictures of the view seen by the camera are transmitted by a communications link to a remote control station and reproduced there on a display screen to give the operator some visual information on the vehicle's environment. In yet other existing systems, users painstakingly build maps or detailed floor plans of the robot's environment in order to remotely navigate. Because of compounding errors generated by such systems, these systems are often inadequate.
The most difficult systems to use are interfaces in which the user specifies a velocity, effectively using a joystick-like interface. This approach suffers over communications lines, since there is a time lag between when a picture is taken and when a user can see it, and again there is a time lag between when a user stops a movement with a joystick and when that command is received by the robot. Typically this kind of interface suffers from “overshooting,” where the user stops commanding a motion when they see the image that shows the robot at its desired location. However, since that image has aged, the robot has already overshot the desired location. Since the command to stop moving also takes time to reach the robot, the robot continues to overshoot while that command is in transit.
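To make the effect concrete, the overshoot grows linearly with link latency: a robot moving at constant speed keeps travelling for roughly the image delay plus the command delay after the operator releases the joystick. The sketch below is illustrative only; the speed and latency figures are assumptions, not values from the patent.

```python
# Minimal illustration of joystick-interface overshoot under link latency.
# Speed and latency values are assumed for the example, not taken from the patent.

def estimate_overshoot(speed_m_per_s: float,
                       image_latency_s: float,
                       command_latency_s: float) -> float:
    """Distance the robot travels past the operator's intended stop point:
    it keeps moving while the 'arrived' image is in transit to the operator
    and while the stop command is in transit back to the robot."""
    return speed_m_per_s * (image_latency_s + command_latency_s)

# A robot driving at 0.5 m/s over a link with 400 ms delay in each direction
# overshoots by roughly 0.4 m.
print(estimate_overshoot(0.5, 0.4, 0.4))  # -> 0.4
```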
One solution to the overshooting problem is the inclusion of simple, clickable arrows on an interface to command a fixed amount of movement by specifying travel time or distance. This simple interface has the desirable characteristic that it provides an absolute motion command to the robot which will not suffer from time lag issues; however, this interface provides limited functionality.
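The distinction can be sketched with two hypothetical command types (the names below are not from the patent): a fixed-distance command ends in the same place regardless of latency, whereas a velocity command keeps the robot moving until a separate stop arrives.

```python
# Hedged sketch contrasting a lag-sensitive velocity command with an absolute
# fixed-distance command of the kind issued by clickable arrows.
# The class and function names are hypothetical, not the patent's API.

from dataclasses import dataclass

@dataclass
class VelocityCommand:
    speed_m_per_s: float      # robot moves until a separate "stop" arrives

@dataclass
class FixedMoveCommand:
    distance_m: float         # robot moves exactly this far, then stops itself

def distance_past_intended_stop(cmd, stop_delay_s: float) -> float:
    """Extra travel beyond the operator's intended stop point, given how
    late the stop command reaches the robot."""
    if isinstance(cmd, FixedMoveCommand):
        return 0.0                               # end point is latency-independent
    return cmd.speed_m_per_s * stop_delay_s      # overshoot grows with latency

print(distance_past_intended_stop(FixedMoveCommand(1.0), stop_delay_s=0.8))  # 0.0
print(distance_past_intended_stop(VelocityCommand(0.5), stop_delay_s=0.8))   # 0.4
```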
Yet another possible solution includes using fixed cameras that point to an immovable target and then allowing a user to select locations for a robot to move to within the fixed image. This solution lacks the ability to arbitrarily position and rotate the camera in three-dimensional space. In addition, this solution requires placing cameras in all locations to which the robot can travel, and therefore is an inflexible and expensive solution.
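For reference, the fixed-camera approach reduces navigation to mapping a clicked pixel onto a known patch of floor. The sketch below assumes an overhead camera looking straight down at a calibrated rectangle of floor; all names and numbers are illustrative, and the mapping is only valid for that one camera pose, which is the inflexibility noted above.

```python
# Illustrative pixel-to-floor mapping for a single fixed, downward-looking camera.
# Assumes the camera has been calibrated to a known rectangle of floor;
# none of these names or values come from the patent.

def pixel_to_floor(px: float, py: float,
                   image_w: int, image_h: int,
                   floor_w_m: float, floor_h_m: float) -> tuple[float, float]:
    """Map a clicked image pixel to floor coordinates in metres.
    Valid only for this one camera pose; repositioning or rotating the
    camera requires recalibration, and areas outside the view need
    additional cameras."""
    x = (px / image_w) * floor_w_m
    y = (py / image_h) * floor_h_m
    return (x, y)

# A click at the centre of a 640x480 image covering a 4 m x 3 m floor patch.
print(pixel_to_floor(320, 240, 640, 480, 4.0, 3.0))  # -> (2.0, 1.5)
```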
Because existing systems are often difficult to control, additional solutions have been proposed. For example, in U.S. Pat. No. 6,108,031, a user is given “virtual reality” glasses (or a headset) to allow three-dimensional information to be transmitted to the user. Using this enhanced visual information, the user then remotely manipulates the vehicle using a control box.
There are, however, limitations to these methods of remotely controlling a robot or vehicle. As mentioned above, in many of these cases, it is assumed that real-time visual information is being transmitted from the camera to the user and that the user is able to transmit real-time control information back to the robot. For certain types of communication links, however, such real-time communication is not possible. Specifically, Internet connections can vary dramatically by the speed of the connection (e.g. DSL, cable modem, dial-up connections) and by the current level of Internet traffic. Therefore, for Internet-connected robots, such real-time transmission cannot be guaranteed.
SUMMARY OF THE INVENTION
The object of the invention is, therefore, to provide a method for the intuitive tele-operation of a robot.
Another object of the invention is to provide an intuitive user interface for remotely-controlling a robot.
Yet another object of the invention is to provide a method and system for remotely controlling a robot particularly suited for systems with asynchronous communication.
It is an object of the invention to provide additional information to the user in a graphical overlay to improve navigation of a remotely controlled robot.
Other objects and advantages of the invention will be apparent from the following description of a preferred embodiment of the invention and from the accompanying drawings and claims.


REFERENCES:
patent: 4202037 (1980-05-01), Glaser et al.
patent: 5471560 (1995-11-01), Allard et al.
patent: 5511147 (1996-04-01), Abdel-Malek
patent: 5652849 (1997-07-01), Conway et al.
patent: 5675229 (1997-10-01), Thorne
patent: 5984880 (1999-11-01), Lander et al.
patent: 6088020 (2000-07-01), Mor
patent: 6108031 (2000-08-01), King et al.
patent: 6113395 (2000-09-01), Hon
patent: 2001/0020200 (2001-09-01), Das et al.
patent: 2001/0025118 (2001-09-01), Shahidi
patent: 34 04 202 (1987-05-01), None
patent: 2 128 842 (1984-05-01), None
patent: 11149315 (1999-06-01), None
patent: 2000094373 (2000-04-01), None
patent: WO 99/05580 (1999-02-01), None
Nakai et al., 7 DOF Arm type haptic interface for teleoperation and virtual reality systems, 1998, IEEE, pp. 1266-1271.*
Ohashi et al., The sensor arm and the sensor glove II-Haptic devices for VR interface, 1999, IEEE, p. 785.*
IRobot Corporation, Coworder, no date, Internet.*
Greham, March of the robots, 2002, Internet, pp. 175-185.*
Star Tech, Tech firms showing off gadgets for consumers, no date, Internet.*
IRobot Corporation, Introducing the IRobot-LE, 2000, Internet, pp. 1-3.*
Corbett et al., “A Human Factors Testbed for Ground-Vehicle Telerobotics Research,” IEEE Proceedings, 1990 Southeastcon, pp. 618-620 (US).*
“Tactical Robotic Vehicle Aids in Battlefield Surveillance” 2301 NTIS Tech Notes, Dec. 1990, Springfield, VA (US) p. 1061.
