System and method for calibrating a stereo optical...

Computer graphics processing and selective visual display system – Image superposition by optical means – Operator body-mounted heads-up display


Details

U.S. classification codes: C345S157000, C348S051000, C359S630000

Type: Reexamination Certificate

Status: active

Patent number: 06753828

ABSTRACT:

TECHNICAL FIELD OF THE INVENTION
The present invention relates generally to camera calibration methods for optical see-through head-mounted display systems for augmented reality. More specifically, the present invention relates to a method for calibrating a monocular optical see-through display (i.e., a display for one eye only) and a method for calibrating a stereo optical see-through display in which the displays for both eyes are calibrated in a single procedure.
DESCRIPTION OF RELATED ART
Augmented reality (AR) is a technology in which a user's view of the real world is enhanced or augmented with additional information generated from a computer model. In a typical AR system, a view of a real scene is augmented by superimposing computer-generated graphics on the view such that the generated graphics are properly aligned with real-world objects as needed by the application. The graphics are generated from geometric models of both non-existent (virtual) objects and real objects in the environment. In order for the graphics and the video to align properly, the pose and optical properties of the real and virtual cameras must be the same. The position and orientation of the real and virtual objects in some world coordinate system must also be known. The locations of the geometric models and virtual cameras within the augmented environment may be modified by moving their real counterparts. This is accomplished by tracking the location of physical objects and using this information to update the corresponding transformations within the virtual world. This tracking capability may also be used to manipulate purely virtual objects, ones with no real counterpart, and to locate real objects in the environment. Once these capabilities have been brought together, real objects and computer-generated graphics may be blended together, thus augmenting a dynamic real scene with information stored and processed on a computer.
In order to have a working AR system, the display system must be calibrated so that the graphics are properly rendered. More specifically, in order for augmented reality to be effective, the real and computer-generated objects must be accurately positioned relative to each other, and the properties of certain devices must be accurately specified. This implies that certain measurements or calibrations need to be made when the system is started. These calibrations involve measuring the pose of various components such as trackers, pointers, cameras, etc. The calibration method in an AR system depends on the architecture of the particular system and the types of components used.
There are two primary modes of display in an AR system, which determine the type of calibration needed: (i) video see-through AR systems and (ii) optical see-through AR systems. An “optical see-through system” is defined herein as the combination of a see-through head-mounted display and a human eye. This display and eye combination will be referred to herein as a virtual camera of the AR display system.
One method for camera calibration for use with video see-through systems is described, for example, in the article by M. Tuceryan et al., entitled “Calibration requirements and procedures for a monitor-based augmented reality system,” IEEE Transactions on Visualization and Computer Graphics, 1(3):255-273, September 1995. This calibration method is based on using the correspondence between known 3-D points and the 2-D positions of their projected images, from which camera parameters are estimated. This calibration protocol is for a video see-through system in which it is assumed that there is access to the picture points (pixels) that can be selected and whose image coordinates can be obtained. This protocol can be used in a video see-through display system because the image digitized by the video camera can be accessed and used to analyze the input images.
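For readers unfamiliar with this style of calibration, the following is a minimal sketch of correspondence-based estimation using the standard direct linear transform (DLT), written in Python with NumPy; the function names and the least-squares formulation are illustrative assumptions, not details taken from the cited article.

```python
import numpy as np

def estimate_projection_matrix(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P from n >= 6 correspondences
    between 3-D points and their 2-D image projections, using the
    direct linear transform (homogeneous linear least squares)."""
    assert len(points_3d) == len(points_2d) >= 6
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    A = np.asarray(rows, dtype=float)
    # The solution is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Apply the estimated projection matrix to a 3-D point."""
    x = P @ np.append(point_3d, 1.0)
    return x[:2] / x[2]
```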
Calibration procedures for optical see-through systems present a challenge because, in contrast to video see-through systems, there is no direct access to the image data used for calibration. Indeed, with an optical see-through system, the images of the scene are formed on the retina of the human user's eye and, consequently, there is no direct access to the image pixels. Accordingly, different approaches are needed for calibrating optical see-through systems. A difficult task in calibrating an optical see-through system is devising a proper user interaction paradigm for collecting the data needed to perform the calibration. There have been previous attempts to devise such interaction methods with varying degrees of success. One method uses multiple point configurations in the world to collect the calibration data (see, e.g., the article by A. Janin et al., entitled “Calibration of head-mounted displays for augmented reality applications,” in Proc. of VRAIS '93, pages 246-255, 1993). Another interactive approach for calibrating an optical see-through AR system involves having the user interactively align a model of a 3D object, in multiple configurations, with the physical object in the display (see, e.g., Erin McGarrity and Mihran Tuceryan, “A method for calibrating see-through head-mounted displays for AR,” in 2nd International Workshop on Augmented Reality (IWAR '99), pages 75-84, San Francisco, Calif., October 1999). This approach allows the user to adjust camera parameters interactively until the user is satisfied that a 3D model of a calibration jig is aligned properly with the physical calibration jig itself.
Such interactive calibration schemes, which require the simultaneous alignment of multi-point configurations in order to perform the camera calibration, make the user interaction during the calibration process very cumbersome and prone to error. Further, the number of parameters to be estimated is large, and therefore the interaction does not provide very intuitive feedback to the user.
Accordingly, methods for calibrating optical see-through displays that are efficient and user-friendly are highly desirable.
SUMMARY OF THE INVENTION
The present invention is directed to a system and method for calibrating a stereo optical see-through HMD (head-mounted display). A preferred method performs the calibration by combining measurements from an optical see-through HMD and a six-degrees-of-freedom tracker that is fixedly attached to the HMD. Calibration is based on the alignment of a stereoscopically fused marker image, which is perceived in depth, with a single 3D reference point in a world coordinate system, from various viewpoints. The user interaction required to perform the calibration is extremely simple compared to conventional methods and does not require keeping the head static during the calibration process.
In one aspect, a method for calibrating a stereo HMD comprises the steps of: displaying a 2-dimensional marker image on each eye display of the stereo HMD for view by a user, wherein the 2-dimensional marker images are displayed with an offset so as to induce a 3-dimensional marker that is perceived by the user as being at a distance away from the user; aligning the 3-dimensional marker image with a preselected reference point; collecting calibration data associated with the alignment; and computing a model of the HMD using the collected calibration data.
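As a rough illustration of how such alignment data might be turned into a camera model, the sketch below assumes a hypothetical tracker.world_to_sensor() call returning the 4x4 world-to-sensor transform at the moment of alignment, known 2-D marker positions on each eye's display, and the estimate_projection_matrix() function from the earlier DLT sketch; none of these names come from the patent.

```python
import numpy as np

def collect_alignment_sample(tracker, reference_point_world, marker_left, marker_right):
    """Record one calibration sample once the user reports that the fused
    3-D marker appears aligned with the fixed reference point."""
    T_ws = tracker.world_to_sensor()   # hypothetical 4x4 world-to-sensor transform
    p_sensor = (T_ws @ np.append(reference_point_world, 1.0))[:3]
    # One 3-D point in the tracker-sensor frame, paired with the known
    # 2-D marker position on each eye's display.
    return p_sensor, np.asarray(marker_left), np.asarray(marker_right)

def calibrate_stereo_hmd(samples):
    """Fit one projection matrix per eye from samples gathered at several
    viewpoints, reusing estimate_projection_matrix() from the DLT sketch."""
    pts_3d = [s[0] for s in samples]
    P_left = estimate_projection_matrix(pts_3d, [s[1] for s in samples])
    P_right = estimate_projection_matrix(pts_3d, [s[2] for s in samples])
    return P_left, P_right
```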
The distance at which the 3-dimensional marker is perceived by the user is proportional to the offset between the 2-dimensional marker images for both eyes.
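For intuition only, a common textbook stereoscopic-display approximation (an assumption here, not a formula from the patent): two eyes a distance e apart viewing a virtual image plane at distance d perceive a point whose on-screen offset is p at a depth of roughly e·d/(e − p), so the marker appears farther away as the offset grows.

```python
def perceived_depth(offset, eye_separation=0.065, plane_distance=1.0):
    """Converging-rays approximation for a fixed virtual image plane.
    All parameter values are illustrative assumptions (metres)."""
    return eye_separation * plane_distance / (eye_separation - offset)

print(perceived_depth(0.0))    # 1.0  -> marker lies on the image plane
print(perceived_depth(0.03))   # ~1.86 -> larger offset, farther perceived marker
```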
The camera model is defined with respect to a coordinate system of a tracker sensor fixedly attached to the HMD. Preferably, the camera model comprises (i) a first projection matrix that defines a transformation between the tracker sensor and a first virtual camera and (ii) a second projection matrix that defines a transformation between the tracker sensor and a second virtual camera, wherein the first virtual camera comprises a combination of the left eye of the individual and a left eye display of the HMD and wherein the second virtual camera comprises a combination of the right eye of the individual and a right eye display of the HMD.
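Once the two projection matrices are known, rendering reduces to transforming a world point into the sensor frame with the current tracker reading and projecting it through each eye's matrix. The sketch below is illustrative only; the function and variable names are not taken from the patent.

```python
import numpy as np

def render_virtual_point(point_world, T_world_to_sensor, P_left, P_right):
    """Compute where a world-space point should be drawn on the left and
    right eye displays, given the calibrated 3x4 projection matrices
    (defined in the tracker-sensor frame) and the current tracker pose."""
    p_sensor = T_world_to_sensor @ np.append(point_world, 1.0)  # homogeneous 4-vector
    def to_display(P):
        x = P @ p_sensor
        return x[:2] / x[2]
    # Because the model is expressed in the sensor frame, it remains valid
    # as the head (and the rigidly attached sensor) moves.
    return to_display(P_left), to_display(P_right)
```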
