Method and system for relevance feedback through gaze...

Computer graphics processing and selective visual display system – Display driving control circuitry – Controlling the condition of display elements

Reexamination Certificate


Details

C345S215000

Reexamination Certificate

active

06577329

ABSTRACT:

BACKGROUND OF THE INVENTION
The present invention generally relates to a display, and more particularly to a display with eye-tracking for judging the relevance, to a user, of information displayed in a “ticker-like” fashion, and to a method therefor.
DESCRIPTION OF THE RELATED ART
With the explosive growth of digital cyberspace, “ticker-like” displays have recently emerged as one of the dominant user interfaces for Internet-based information systems.
Ticker displays are particularly useful for displaying highlights for a large variety of frequently updated information, such as stock quotes, sports scores, traffic reports, headline news, etc.
Typically, a ticker display uses an area of a computer screen to scroll text and/or image items continuously in a predetermined pattern (e.g., from left to right, from bottom to top, etc.). These are widely used by webcasting or Internet push systems, such as PointCast. Further, ticker displays have also been deployed by many popular websites such as MyYahoo, ESPN Sports Zone, and ABC News for similar purposes.
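The predetermined scrolling pattern described above can be illustrated with a minimal sketch (not taken from any cited system; the function name and window width are hypothetical) that slides a text item through a fixed-width ticker window one character per step:

```python
# Illustrative sketch of a text ticker: each frame is one view of the
# item as it scrolls right-to-left through a fixed-width window.
def ticker_frames(text, width):
    """Yield successive window-sized views of `text` scrolling through
    a ticker window of `width` characters."""
    # Pad both ends so the item scrolls fully in and fully out of view.
    padded = " " * width + text + " " * width
    for i in range(len(padded) - width + 1):
        yield padded[i:i + width]
```

A real ticker would redraw such frames on a timer; the same indexing logic applies regardless of scroll direction.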
However, conventional systems provide no way of knowing whether the user is viewing the ticker display, or whether the information being displayed by the ticker display has relevance to the user, and thus no way for advertisers and the like to capitalize on or change the information being displayed.
SUMMARY OF THE INVENTION
In view of the foregoing and other problems of the conventional systems and methods, it is an object of the present invention to provide a system for viewing information and data which is user-interactive.
Another object of the invention is to provide a system which integrates eye-tracking technology with ticker-like interfaces to significantly enhance the usability of the ticker interface.
Yet another object of the invention is to provide a system for generating relevance feedback which is helpful in improving the quality of Internet-based information systems.
A further object is to provide a system which determines whether the user is viewing the information being displayed, and, if so, determines a level of relevance to the user of the information being displayed by the ticker display.
In a first aspect of the present invention, a system for interactively displaying information includes a display for displaying items having different views, a tracker for tracking a user's eye movements while the user observes a first view of information on the display, and a mechanism for determining, based on an output from the tracker, whether a current view has relevance to the user.
In a second aspect of the present invention, a method for interactively displaying information includes displaying items having different views, tracking a user's eye movements while the user observes a first view of information on the display, and, based on the tracking, determining whether a current view has relevance to the user.
With the unique and unobvious aspects and features of the present invention, the system can determine whether the user is viewing the ticker display, and, if so, whether the information being displayed by the ticker display has relevance to the user. Further, the present invention provides a hassle-free way of providing relevance feedback for the information being displayed and can automatically change the amount and type of information being displayed to a particular user. Thus, the system is adaptive to the user's interests and preferences, and thereby advertisers and the like can capitalize on or change the information being displayed to the particular user.
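The relevance determination summarized above can be sketched as follows. This is an illustrative reading, not the patent's implementation: the gaze-sample structure, the dwell-time heuristic, and the relevance threshold are all assumptions introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float  # timestamp in seconds (hypothetical tracker output)
    x: int    # gaze x-coordinate on the screen
    y: int    # gaze y-coordinate on the screen

def dwell_time(samples, region):
    """Total time the gaze falls inside `region` = (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = region
    total = 0.0
    for a, b in zip(samples, samples[1:]):
        # Credit the interval to the region the gaze occupied at its start.
        if x0 <= a.x <= x1 and y0 <= a.y <= y1:
            total += b.t - a.t
    return total

def relevance(samples, region, period, threshold=0.3):
    """Fraction of the display period spent gazing at the ticker region,
    and whether it exceeds a (hypothetical) relevance threshold."""
    frac = dwell_time(samples, region) / period
    return frac, frac >= threshold
```

A system along the lines of the summary could use such a score both as relevance feedback to the information provider and as a signal for adapting the amount and type of information shown to that user.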


REFERENCES:
patent: 4950069 (1990-08-01), Hutchinson
patent: 5016282 (1991-05-01), Tomono et al.
patent: 5507291 (1996-04-01), Stirbl et al.
patent: 5572596 (1996-11-01), Wildes et al.
patent: 5649061 (1997-07-01), Smyth
patent: 5762611 (1998-06-01), Lewis et al.
patent: 5825355 (1998-10-01), Palmer et al.
patent: 5886683 (1999-03-01), Tognazzini et al.
patent: 5898423 (1999-04-01), Tognazzini et al.
patent: 5920477 (1999-07-01), Hoffberg et al.
patent: 5959621 (1999-09-01), Nawaz et al.
patent: 5987415 (1999-11-01), Breese et al.
patent: 6067565 (2000-05-01), Horvitz
patent: 6134644 (2000-10-01), Mayuzumi et al.
patent: 6182098 (2001-01-01), Selker
patent: 6185534 (2001-02-01), Breese et al.
patent: 6212502 (2001-04-01), Ball et al.
patent: 6351273 (2002-02-01), Lemelson et al.
Johnmarshall Reeve, “The Face of Interest”, Motivation and Emotion, vol. 17, No. 4, 1993, pp. 353-375.
Johnmarshall Reeve and Glen Nix, “Expressing Intrinsic Motivation Through Acts of Exploration and Facial Displays of Interest”, Motivation and Emotion, vol. 21, No. 3, 1997, pp. 237-250.
H. Rex Hartson and Deborah Hix, “Advances in Human-Computer Interaction”, vol. 4, Virginia Polytechnic Institute and State University, ABLEX Publishing Corporation, Norwood, New Jersey, pp. 151-190, May 1993.
Paul P. Maglio, Rob Barrett, Christopher S. Campbell, Ted Selker, IBM Almaden Research Center, San Jose, CA., “Suitor: An Attentive Information System”, IUI2000: The International Conference on Intelligent User Interfaces, pp. 1-8.
Erik D. Reichle, Alexander Pollatsek, Donald L. Fisher, and Keith Rayner, University of Massachusetts at Amherst, “Toward a Model of Eye Movement Control in Reading”, Psychological Review 1998; vol. 105, No. 1, pp. 125-157.
Robert J.K. Jacob, Human-Computer Interaction Lab, Naval Research Laboratory, Washington, D.C., “What You Look at is What You Get: Eye Movement-Based Interaction Techniques”, CHI '90 Proceedings, Apr. 1990, pp. 11-18.
Lisetti, et al., “An Environment to Acknowledge the Interface between Affect and Cognition”, pp. 78-86.
Black et al., “Recognizing Facial Expressions in Image Sequences using Local Parameterized Models of Image Motion”, pp. 1-35, Mar. 1995.
Lien, et al., “Automatically Recognizing Facial Expressions in the Spatio-Temporal Domain”, Oct. 19-21, 1997, Workshop on Perceptual User Interfaces, pp. 94-97, Banff, Alberta, Canada.
Lien, et al., “Subtly Different Facial Expression Recognition And Expression Intensity Estimation”, Jun. 1998, IEEE, Published in the Proceedings of CVPR'98, Santa Barbara, CA.
Lien, et al., “Automated Facial Expression Recognition Based on FACS Action Units”, Apr. 14-16, 1998, IEEE, Published in the Proceedings of FG '98, Nara, Japan.
Morimoto, et al., “Pupil Detection and Tracking Using Multiple Light Sources”.
Ebisawa, et al., “Examination of Eye-Gaze Detection Technique Using Two Light Sources and the Image Difference Method”, SICE '94, Jul. 26-28, 1994.
Ebisawa, Y., “Improved Video-Based Eye-Gaze Detection Method”, IMTC '94 May 10-12, 1994.
Ohtani, et al., “Eye-Gaze Detection Based on the Pupil Detection Technique Using Two Light Sources and the Image Difference Method”, Sep. 20-23, 1995, pp. 1623-1624.
Y. Ebisawa, “Unconstrained pupil detection technique using two light sources and the image difference method”, Visualization and Intelligent Design in Engineering, pp. 79-89, 1995.
Kumakura, S., “Apparatus for estimating the drowsiness level of a vehicle driver”, Jul. 28, 1998, pp. 1-2.
Eriksson, M., “Eye-Tracking for Detection of Driver Fatigue”, IEEE Conference on Intelligent Transportation Systems, Nov. 9-12, pp. 314-319.
Funada, et al., “On an Image Processing of Eye Blinking to Monitor Awakening Levels of Human Beings”, IEEE Conference, pp. 966-967, 1996.
Kamitani, et al., “Analysis of perplex situations in word processor work using facial image sequence”, SPIE vol. 3016, pp. 324-334, 1997.
Tomono, et al., “A TV Camera System which Extracts Feature Points for Non-Contact Eye Movement Detection”, SPIE vol. 1194 Optics, Illumination, and Image Sensing for Machine Vision IV (1989), pp. 2-12.
Pantic, et al., “Automation of Non-Verbal Communication of Facial Expressions”, pp. 86-93.
Pantic, et al., “Automated Facial Expression analysis”, pp. 194-200.
Morimoto, et al., “Recognition of Head Gestures Using Hidden Markov Models”.
