Non-command, visual interaction system for watchstations

Data processing: measuring, calibrating, or testing – Measurement system

Reexamination Certificate


Details

Type: Reexamination Certificate
Status: active
Patent number: 06401050
US classifications: C702S150000, C359S630000, C345S007000, C345S008000, C345S629000

ABSTRACT:

STATEMENT OF GOVERNMENT INTEREST
The invention described herein may be manufactured and used by or for the Government of the United States of America for governmental purposes without the payment of any royalties thereon or therefor.
BACKGROUND OF THE INVENTION
(1) Field of the Invention
The present invention relates generally to the field of computer operation and in particular to eye-tracking systems and visually operated devices.
(2) Description of the Prior Art
The current viewing screens on Navy watchstations use conventional man-machine interfaces (MMI) as input devices. These consist of keyboards, trackballs or mice, touch screens, and special-purpose keys and knobs (i.e., Variable Function Keys, VFKs). All of the current MMI input devices rely on haptic functions that require the operator to manipulate the system overtly. For example, to operate the viewing screen, the operator must “hook” or move a system-generated cursor to a screen position and physically input the current area of interest. The computer watchstation currently has no way to sense automatically what the operator is seeing.
With current research into the predictive nature of human decision making, non-invasive eye-tracking, and computer intelligent agents, non-command, non-haptic interpretation of operator intent as actions is now feasible. Based on this interpretation, the computer system can cue screen objects. The operator is cued to change his actions only when the system requires it or when it is necessary to the proper functioning of the system. Cueing of an object is not needed if the operator is already “aware” of it, as shown by gazing at it for sufficient time (dwell) and with sufficient frequency (scan). Using non-invasive tracking of the operator's eyes and pupil changes to interpret the operator's actions, attention, and vigilance reduces the need for overt operator interaction with the computer workstation. Current eye-tracking devices use non-invasive camera systems to accurately measure the eyes' gaze, pupil size, scan paths, and object gaze dwell time. What is needed is a watchstation able to interpret the operator's visual attention and gaze location through such non-invasive devices. This non-command method allows the tactical system to cue the operator when insufficient attention has been paid to a possible area of threat or when the system sensors detect an emergency situation.
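The dwell-and-scan notion of operator awareness described above can be sketched in code. The sample format, thresholds, and function name below are illustrative assumptions, not the patent's specification:

```python
from collections import defaultdict

# Hypothetical sketch: deciding whether the operator is "aware" of a screen
# object from eye-tracker gaze samples, using total gaze time (dwell) and
# number of separate visits (scan) as the two criteria. The thresholds and
# the per-frame sample model are assumptions for illustration.

DWELL_THRESHOLD_S = 0.5   # minimum total gaze time on an object (assumed)
SCAN_THRESHOLD = 2        # minimum number of separate gaze visits (assumed)

def awareness(samples, sample_period_s=0.02):
    """samples: sequence of object IDs (or None) hit by the gaze, one per
    eye-tracker frame. Returns {object_id: (dwell_s, visits, aware)}."""
    dwell = defaultdict(float)
    visits = defaultdict(int)
    prev = None
    for obj in samples:
        if obj is not None:
            dwell[obj] += sample_period_s
            if obj != prev:
                visits[obj] += 1  # gaze arrived at this object anew
        prev = obj
    return {o: (dwell[o], visits[o],
                dwell[o] >= DWELL_THRESHOLD_S and visits[o] >= SCAN_THRESHOLD)
            for o in dwell}
```

Under this sketch, an object glanced at once briefly would not count as seen, while one revisited several times with adequate total dwell would.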
SUMMARY OF THE INVENTION
Accordingly, it is an object of the present invention to provide a non-command, visual interaction system which has display changes and system operation based on the operator's visual attention.
Another object of the present invention is to provide a non-command, visual interaction system which monitors the operator's visual scan, gaze position, gaze dwell time, blink rate, and pupil size.
Still another object of the present invention is to provide a non-command, visual interaction system which includes a cueing algorithm to alert the operator to threats or emergencies requiring attention.
Other objects and advantages of the present invention will become more obvious hereinafter in the specification and drawings.
In accordance with the present invention, a non-command, visual interaction system is provided which comprises an operator tracking system, a visual capture system and a computer watchstation. The operator tracking system comprises an overhead infrared (IR) head tracker, which tracks a headband-mounted source worn by the operator, and head tracker hardware which connects to and receives data signals from the IR head tracker. The head tracker hardware provides operator presence and head location to the watchstation computer. The visual capture system comprises an eye-tracker camera and eye-tracker hardware. The eye-tracker hardware receives location data from the watchstation computer for gross aiming of the eye-tracker camera and uses camera imaging for fine resolution. Eye-data signals including visual scan, gaze position, gaze dwell time, blink rate and pupil size are captured by the eye-tracker camera and sent to the watchstation computer. An algorithm provides a sequence of steps to determine when a new object on the watchstation monitor has been seen by the operator and whether an operator cue is required.
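The cueing decision at the end of the summary can be sketched as a simple rule: always cue on a sensor-detected emergency, never cue an object the operator has already dwelt on, and cue a threat whose attention deadline has passed. All names, priority categories, and thresholds below are assumptions for illustration, not the patent's actual sequence of steps:

```python
# Hypothetical sketch of the cueing decision for a new object on the
# watchstation monitor. The data model (a dict with a 'priority' field) and
# the threshold values are assumed, not taken from the patent.

def needs_cue(obj, gaze_dwell_s, elapsed_s,
              dwell_required_s=0.5, deadline_s=3.0):
    """obj: dict with 'priority' in {'routine', 'threat', 'emergency'}.
    gaze_dwell_s: accumulated operator gaze time on the object.
    elapsed_s: time since the object appeared on the monitor."""
    if obj['priority'] == 'emergency':
        return True          # system sensors report an emergency: always cue
    if gaze_dwell_s >= dwell_required_s:
        return False         # operator is already aware of the object
    # cue a threat whose attention deadline passed without sufficient dwell
    return obj['priority'] == 'threat' and elapsed_s >= deadline_s
```

In a running system this check would be evaluated periodically for each tracked object, fed by the eye-data signals the watchstation computer receives.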


REFERENCES:
patent: 4798214 (1989-01-01), Haas
patent: 5341181 (1994-08-01), Godard
patent: 5583795 (1996-12-01), Smyth
patent: 5590268 (1996-12-01), Doi et al.
