Passive gaze-driven browsing

Computer graphics processing and selective visual display system – Display peripheral interface input device

Reexamination Certificate


Details

C345S007000, C345S008000, C345S157000, C345S158000, C345S172000, C345S520000, C351S210000, C351S211000, C351S212000, C351S222000

Reexamination Certificate

active

06608615

ABSTRACT:

BACKGROUND
1. Field
The present invention relates generally to human computer interfaces (HCIs) and, more specifically, to interaction between a user's eyes and a computer display.
2. Description
Robust tracking of facial features may become fundamental to future human computer interaction. Reliable techniques for detecting movement of a user's lips and eyes are examples. The requirement for real-time interaction of the user with a computer imposes severe constraints on the response time of these image processing systems, which are also known to have high computational demands. Most current research on real-time detection and tracking of facial features is model-based, i.e., it uses information about skin color or face geometry, for example. Some developed systems are used in command and control systems that are active and intrusive. Other research exploits the physical properties of eyes (such as their retro-reflectivity) to passively track the eyes using an active illumination scheme. Eye properties have been utilized in several commercial eye gaze trackers, such as those available from ISCAN Incorporated, Applied Science Laboratories (ASL), and LC Technologies, for example. However, these gaze trackers typically use only bright or dark pupil images for tracking.
Due to the retro-reflectivity and geometry of the eye, a camera sees a bright pupil image when a light source is placed very close to its optical axis. This effect is well known as the red-eye effect from flash photography. Under regular illumination (when the light source is not on the camera's optical axis), a dark pupil is seen. One technique for robust pupil detection is to use active illumination systems to generate dark and bright pupil images. Pupil candidates are detected from the thresholded difference of the dark pupil image and the bright pupil image.
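The differential lighting scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the patent's implementation: the function name, the fixed threshold value, and the centroid-based center estimate are assumptions chosen for clarity.

```python
import numpy as np

def pupil_candidates(bright, dark, thresh=50):
    """Find pupil candidate pixels from a bright/dark pupil image pair.

    Under active illumination, the pupil is much brighter in the on-axis
    (bright) frame than in the off-axis (dark) frame, so thresholding the
    difference image isolates pupil candidates. Returns a crude pupil-center
    estimate (or None) and the binary candidate mask.
    """
    # signed difference avoids uint8 wrap-around when dark > bright
    diff = bright.astype(np.int16) - dark.astype(np.int16)
    mask = diff > thresh
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None, mask
    # centroid of candidate pixels as a simple pupil-center estimate
    return (float(xs.mean()), float(ys.mean())), mask
```

A real tracker would additionally filter candidates by size and shape and verify the nearby corneal glint, as the next paragraph describes.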
Some pupil detection systems are based on this differential lighting with thresholding scheme. These systems are used to detect and track the pupil and estimate the point of gaze, which also requires the detection of corneal reflections created by the light sources. The corneal reflection from the light sources can be seen as a bright spot close to the pupils (corneal glint). The point of gaze may be virtually extended to pass through the plane of a computer display surface so that screen coordinates may be identified for a current user gaze. Once the point of gaze of a user may be estimated, new methods of human computer interaction may be developed.
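One common way to extend the point of gaze to screen coordinates, used in many pupil/corneal-reflection trackers, is to fit a mapping from the pupil-glint vector to screen position during a calibration phase. The sketch below uses a simple linear least-squares fit; the function names are illustrative, and commercial systems often use higher-order polynomial mappings.

```python
import numpy as np

def fit_gaze_mapping(pg_vectors, screen_points):
    """Fit a linear map from pupil-glint vectors to screen coordinates.

    During calibration the user fixates known on-screen targets; a
    least-squares fit then maps any new pupil-glint vector to a gaze
    point on the display. Returns a (3, 2) coefficient matrix.
    """
    v = np.asarray(pg_vectors, dtype=float)
    # augment each vector with a constant term: [vx, vy, 1]
    A = np.hstack([v, np.ones((v.shape[0], 1))])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float),
                                 rcond=None)
    return coeffs

def gaze_to_screen(coeffs, pg_vector):
    """Map one pupil-glint vector to (x, y) screen coordinates."""
    vx, vy = pg_vector
    return np.array([vx, vy, 1.0]) @ coeffs
```

With four or more calibration targets the fit is overdetermined, which helps average out per-fixation noise in the measured pupil-glint vectors.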


REFERENCES:
patent: 5220361 (1993-06-01), Lehmer et al.
patent: 5898423 (1999-04-01), Tognazzini et al.
patent: 6152563 (2000-11-01), Hutchinson et al.
patent: 6246779 (2001-06-01), Fukui et al.
patent: 6282553 (2001-08-01), Flickner et al.
patent: 6351273 (2002-02-01), Lemelson et al.
C. H. Morimoto, D. Koons, A. Amir, and M. Flickner, "Frame-Rate Pupil Detector and Gaze Tracker," pp. 1-5, IBM Almaden Research Center, San Jose, CA, USA.
