Process for anticipation and tracking of eye movement

Image analysis – Applications – Target tracking or detecting


Details

C382S108000, C348S115000, C348S169000, C434S044000

Reexamination Certificate

active

06574352

ABSTRACT:

TECHNICAL FIELD
This invention relates generally to the field of computer graphics projection and display, and more particularly to anticipating a user's eye movement in a projection-based computer graphics simulation system.
BACKGROUND ART
Head tracked area-of-interest image projection systems are used for flight simulators and military flight simulator training. The ideal simulator would have eye-limited resolution and an unlimited field of view. Because of computing power restrictions and optical imaging limitations, many systems provide either high resolution over a narrow field of view or low resolution over a wide field of view, and either choice limits training effectiveness. Recent simulator projector systems have been developed to give a greater field of view combined with a higher-resolution image at the user's focal point of interest. Such a device uses a head tracked projector and a compact target projector to form images on a dome within which the user is enclosed. A high-resolution inset provides good imagery for the foveal vision, and the background forms imagery for the peripheral vision. In such a system, only two image generator channels are needed to cover the entire field of the dome, which is very efficient. Together, these devices create a cost-effective solution for air-to-air and air-to-ground flight combat training.
The head tracked projector systems display the image a pilot would see out the window of their aircraft by placing high-resolution imagery only where the pilot is looking over a full 360 degree field of regard. To do this, the simulation device requires information about where the pilot is looking. The information is provided by a head tracker attached to the pilot's helmet and is used to tell the Image Generator (IG) where the pilot is looking so that the projector port for the image can be oriented properly. The pilot gaze information is also used by the IG and its accompanying projector hardware to control the orientation of the projector servos so that the image is projected where the pilot is looking.
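Although no implementation is given here, the data path just described (head tracker to IG to projector servos) can be sketched roughly as below. The class and method names are illustrative assumptions, not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    """One head-tracker sample: the orientation of the pilot's head, in degrees."""
    azimuth: float     # heading within the 360 degree field of regard
    elevation: float
    roll: float

def update_frame(tracker, image_generator, projector):
    """Per-frame loop: read the tracker, render imagery where the pilot is
    looking, and steer the projector servos to the same spot on the dome."""
    pose = tracker.read_pose()                 # latest measured head orientation
    image_generator.set_view_direction(pose)   # orient the IG's projection port
    projector.point_servos(pose)               # aim the high-resolution inset
```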
These activities must be carefully coordinated so that the proper IG image is available to the projection hardware at the proper location on the dome surface. When the two processes are not carefully synchronized, the image will appear to slide around on the dome surface. If the information arrives too slowly, the pilot may have already moved their head again, so the image appears to lag behind where the pilot is looking. The major criticism of head-tracked projector simulators currently known in the art is that the image generally lags behind where the pilot is looking.
There are several reasons for this lag. One problem is that the current hardware interfaces between head tracking devices and the IGs (or projectors) are too slow. The time delay between the measurement of the head position and orientation and the moment the data is available to the IG and projection hardware is too long. Furthermore, the data is noisy, so even when the head tracker is at rest, there appear to be variations in the pilot's head position and orientation.
Many efforts have been made to overcome the disadvantages of the delay that currently exists in head tracked projectors. One method of decreasing the lag is to increase the actual communication speed of the hardware. In another method, designers have reduced some of the delay by double buffering the head tracker data so that it can be accessed asynchronously using the high-speed communication hardware. These approaches reduce the lag, but current hardware solutions have not been able to reduce it sufficiently.
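A minimal sketch of the double-buffering idea follows, assuming a background thread that receives tracker samples and a renderer that reads the latest sample without blocking; it illustrates the general technique, not the patent's hardware interface.

```python
import threading

class DoubleBufferedPose:
    """Two slots for head-tracker samples: the communication thread writes
    into the back buffer while the simulation reads the front buffer."""

    def __init__(self):
        self._buffers = [None, None]
        self._front = 0
        self._lock = threading.Lock()

    def write(self, pose):
        """Called by the tracker I/O thread for each new sample."""
        back = 1 - self._front
        self._buffers[back] = pose
        with self._lock:          # only the buffer swap is synchronized
            self._front = back

    def read(self):
        """Called asynchronously by the image generator each frame."""
        with self._lock:
            return self._buffers[self._front]
```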
Another possible method of minimizing the delay in the delivery of head movement data is to try to anticipate the user's head motion, which would help stabilize the image. Of course, there is no way to truly anticipate where a pilot is going to look, but information based on the user's head motion can be used to estimate where the pilot will be looking in the future. Experiments have been performed to discover a relationship between head motion and eye motion. Unfortunately, this type of research has not yielded any recognizable patterns of motion that are useful.
Other hardware solutions to this problem have not been successful either. For example, a prototype head tracker using Helmholtz coils to measure head orientation has been used, but this approach is expensive and technically complex. Reduced field-of-rotation optical head trackers have also been experimented with, but these have a limited volume in which measurements can occur.
Extrapolating head movement using curve fitting to predict where the head is expected to be does not provide a good solution, because noise in the head tracker makes extrapolation of the data unreliable. It is very difficult, if not impossible, to use the data from the head tracker to anticipate head motion. Accordingly, it would be an advantage over the state of the art to provide a method for anticipating and tracking eye movement which produces an image that does not appear to lag behind the user's vision or slide around on the projection surface.
OBJECTS AND SUMMARY OF THE INVENTION
It is an object of the present invention to provide a method for anticipation and tracking of a user's eye movement in a head tracked projector based simulator.
It is another object of the present invention to provide a method for anticipation and tracking of eye movement that establishes a relationship between a user's measured head orientation and where the user's eyes are actually looking.
It is another object of the present invention to provide a method for anticipation and tracking of eye movement where the image is more stable relative to the user's actual view point.
It is another object of the present invention to provide a method for anticipating and tracking a user's eye movement by using a database of stored calibration values.
It is another object of the present invention to provide a method for anticipating and tracking a user's eye movement using a database of stored calibration values to correct an interpolated view point and to determine where the image generator will project an image.
It is yet another object of the present invention to provide a method for anticipating and tracking a user's eye movement using a projection screen divided into spherical triangular regions which are stored in a database with calibration values.
The presently preferred embodiment of the present invention is a method for anticipating and tracking a user's eye movement which utilizes a head tracked area-of-interest image projection system having a head tracked projector. A dome surface is provided which is logically divided into a mesh of spherical triangles, upon which the head tracked projector projects computer-generated images. A database is provided in which a calibration value is stored for each vertex of the spherical triangles; each value represents the difference between the user's measured head orientation and where the user is actually looking for that head orientation. The user's view point is then interpolated onto the dome surface from the user's head orientation to determine where on the spherical dome the user is viewing the image. Next, the mesh of spherical triangles is searched to find a selected triangle which contains the view point. This is done by traversing a linked list of the spherical triangles.
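As a rough illustration of the data structures this implies, the sketch below represents the dome mesh as spherical triangles whose vertices are unit vectors carrying per-vertex calibration values, linked into a list that is traversed to find the triangle containing the view point. All names are hypothetical; the patent does not prescribe this representation.

```python
import numpy as np

class SphericalTriangle:
    """One cell of the dome mesh. Vertices are unit vectors on the dome,
    ordered counterclockwise as seen from outside; each vertex carries a
    calibration value (the stored difference between measured head
    orientation and actual gaze). 'next' links the triangles into the
    list that is searched."""

    def __init__(self, v0, v1, v2, cal0, cal1, cal2):
        self.vertices = [np.asarray(v, dtype=float) for v in (v0, v1, v2)]
        self.calibration = [cal0, cal1, cal2]   # one correction per vertex
        self.next = None                        # linked-list pointer

    def contains(self, p):
        """True if unit vector p lies inside this spherical triangle, i.e.
        on the inner side of all three great-circle edges."""
        v0, v1, v2 = self.vertices
        for a, b in ((v0, v1), (v1, v2), (v2, v0)):
            if np.dot(np.cross(a, b), p) < 0.0:
                return False
        return True

def find_containing_triangle(head, view_point):
    """Traverse the linked list from 'head' until a triangle containing the
    interpolated view point (a unit vector on the dome) is found."""
    tri = head
    while tri is not None:
        if tri.contains(view_point):
            return tri
        tri = tri.next
    return None
```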
Once a spherical triangle containing the interpolated view point is found, the selected triangle is divided into three sub-triangles, with the view point as the common vertex of each sub-triangle. The surface area of each sub-triangle is then calculated, and the ratio of each sub-triangle's surface area to the selected triangle's surface area is found. The sub-triangles' surface area ratios are then multiplied by the calibration values associated with the vertex opposite the given sub-triangle.
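The area-ratio weighting just described amounts to spherical barycentric interpolation. The sketch below, which continues from the SphericalTriangle class in the previous example, computes the sub-triangle areas from their spherical excess and combines the per-vertex calibration values as a weighted sum; the summing step and the function names are assumptions rather than language from the patent.

```python
import numpy as np

def spherical_area(a, b, c):
    """Area (spherical excess) of the triangle with unit-vector vertices a, b, c."""
    def corner_angle(u, v, w):
        # interior angle at vertex u between the great-circle arcs u-v and u-w
        t1, t2 = np.cross(u, v), np.cross(u, w)
        cosang = np.dot(t1, t2) / (np.linalg.norm(t1) * np.linalg.norm(t2))
        return np.arccos(np.clip(cosang, -1.0, 1.0))
    return (corner_angle(a, b, c) + corner_angle(b, c, a)
            + corner_angle(c, a, b) - np.pi)

def interpolate_correction(tri, p):
    """Weight each vertex's calibration value by the area ratio of the
    sub-triangle opposite that vertex, with the view point p as the
    common vertex of the three sub-triangles."""
    v0, v1, v2 = tri.vertices
    total = spherical_area(v0, v1, v2)
    w0 = spherical_area(p, v1, v2) / total   # sub-triangle opposite vertex 0
    w1 = spherical_area(v0, p, v2) / total   # sub-triangle opposite vertex 1
    w2 = spherical_area(v0, v1, p) / total   # sub-triangle opposite vertex 2
    # Weighted combination of the stored per-vertex corrections
    # (w0 + w1 + w2 is 1 when p lies inside the triangle).
    return (w0 * tri.calibration[0]
            + w1 * tri.calibration[1]
            + w2 * tri.calibration[2])
```

The resulting correction would then be applied to the interpolated view point, consistent with the stated object of using the stored calibration values to correct the view point and determine where the image generator projects the image.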
