Returning a new content based on a person's reaction to at...

Data processing: database and file management or data structures – Database and file access – Query optimization


Details

Classification: C707S752000
Type: Reexamination Certificate
Status: active
Patent number: 08001108

ABSTRACT:
Embodiments provide a device, apparatus, system, computer program product, and method. A provided method includes receiving information indicative of a person's respective responses to each of at least two instances of electronically displayed content; the received information is derived from data acquired by a sensor coupled to the person and is sent by a requestor electronic device. The method also includes selecting a particular content from the at least two instances of electronically displayed content, based at least in part on the received information. The method further includes facilitating a search for a new content using a search parameter corresponding to a content attribute of the particular content, and returning an indication of the new content to the requestor electronic device.
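For readers who want a concrete picture of the claimed flow, the following is a minimal Python sketch of the workflow the abstract describes: receive sensor-derived reaction reports for at least two displayed content instances, select the instance that drew the strongest response, search on one of its attributes, and return an indication of the result. The ContentItem and ReactionReport structures, the select_preferred and search_by_attribute helpers, and the scoring scheme are hypothetical illustrations, not anything defined in the patent.

```python
from dataclasses import dataclass

# Hypothetical structures standing in for the entities named in the abstract.
@dataclass
class ContentItem:
    content_id: str
    attributes: dict          # e.g. {"topic": "cycling", "format": "video"}

@dataclass
class ReactionReport:
    content_id: str
    sensor_response: float    # sensor-derived measure of the person's response

def select_preferred(instances, reports):
    """Pick the displayed content instance the person responded to most strongly."""
    scores = {r.content_id: r.sensor_response for r in reports}
    return max(instances, key=lambda item: scores.get(item.content_id, 0.0))

def search_by_attribute(attribute_name, attribute_value):
    """Stand-in for the facilitated search; a real system would query a content index."""
    return [f"new-content-matching-{attribute_name}-{attribute_value}"]

def handle_request(instances, reports, search_attribute="topic"):
    """End-to-end flow: receive reaction reports from the requestor device, select
    the preferred content, search on one of its attributes, and return an
    indication of the new content."""
    preferred = select_preferred(instances, reports)
    value = preferred.attributes.get(search_attribute)
    results = search_by_attribute(search_attribute, value)
    return {"selected": preferred.content_id, "new_content": results}

if __name__ == "__main__":
    shown = [ContentItem("a1", {"topic": "cycling"}),
             ContentItem("b2", {"topic": "cooking"})]
    reactions = [ReactionReport("a1", 0.9), ReactionReport("b2", 0.2)]
    print(handle_request(shown, reactions))
```

In a real deployment the search stub would presumably be replaced by a query against a content index or ad inventory, and the single sensor_response number by whatever response model the system actually uses.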

REFERENCES:
patent: 6118888 (2000-09-01), Chino et al.
patent: 6401050 (2002-06-01), Cooke et al.
patent: 6651045 (2003-11-01), Macaulay
patent: 6847992 (2005-01-01), Haitsuka et al.
patent: 7100818 (2006-09-01), Swaine
patent: 7228327 (2007-06-01), Shuster
patent: 7495659 (2009-02-01), Marriott et al.
patent: 2002/0030163 (2002-03-01), Zhang
patent: 2002/0059370 (2002-05-01), Shuster
patent: 2002/0127623 (2002-09-01), Minshull et al.
patent: 2002/0139842 (2002-10-01), Swaine
patent: 2003/0088463 (2003-05-01), Kanevsky et al.
patent: 2003/0220835 (2003-11-01), Barnes, Jr.
patent: 2004/0148572 (2004-07-01), Nakanishi et al.
patent: 2004/0193488 (2004-09-01), Khoo et al.
patent: 2005/0013104 (2005-01-01), Feague et al.
patent: 2005/0108092 (2005-05-01), Campbell et al.
patent: 2005/0172319 (2005-08-01), Reichardt et al.
patent: 2005/0235338 (2005-10-01), AbiEzzi et al.
patent: 2006/0085818 (2006-04-01), Bodlaender et al.
patent: 2006/0133586 (2006-06-01), Kasai et al.
patent: 2006/0143647 (2006-06-01), Bill
patent: 2006/0179044 (2006-08-01), Rosenberg
patent: 2006/0195441 (2006-08-01), Julia et al.
patent: 2007/0061753 (2007-03-01), Ng et al.
patent: 2007/0066323 (2007-03-01), Park et al.
patent: 2007/0168413 (2007-07-01), Barletta et al.
patent: 2007/0214471 (2007-09-01), Rosenberg
patent: 2007/0220010 (2007-09-01), Ertugrul
patent: 2007/0220040 (2007-09-01), Do
patent: 2007/0265090 (2007-11-01), Barsness et al.
patent: 2007/0287415 (2007-12-01), Yamada
patent: 2007/0293731 (2007-12-01), Downs et al.
patent: 2007/0294064 (2007-12-01), Shuster
patent: 2008/0004989 (2008-01-01), Yi
patent: 2008/0052219 (2008-02-01), Sandholm et al.
patent: 2008/0065468 (2008-03-01), Berg et al.
patent: 2008/0146892 (2008-06-01), LeBoeuf et al.
patent: 2008/0275700 (2008-11-01), Bingley et al.
patent: 2008/0306913 (2008-12-01), Newman et al.
patent: 2008/0313033 (2008-12-01), Guo et al.
patent: 2009/0018911 (2009-01-01), An Chang et al.
patent: 2009/0030978 (2009-01-01), Johnson et al.
patent: 2009/0076887 (2009-03-01), Spivack et al.
patent: 2009/0089678 (2009-04-01), Sacco et al.
patent: 2009/0132368 (2009-05-01), Cotter et al.
patent: 2009/0138565 (2009-05-01), Shiff et al.
patent: 2009/0216744 (2009-08-01), Shriwas et al.
patent: 2010/0122178 (2010-05-01), Konig et al.
patent: 2010/0250513 (2010-09-01), Guha
patent: WO 2007/088536 (2007-08-01), None
U.S. Appl. No. 11/998,779, Jung et al.
U.S. Appl. No. 11/998,826, Jung et al.
U.S. Appl. No. 11/998,820, Jung et al.
U.S. Appl. No. 12/001,759, Jung et al.
U.S. Appl. No. 12/006,792, Jung et al.
U.S. Appl. No. 12/006,793, Jung et al.
U.S. Appl. No. 12/011,031, Jung et al.
“Eye Gaze Tracking”; ISL eye gaze tracking; pp. 1-3; printed on Sep. 19, 2007; located at http://www.is.cs.cmu.edu/mie/eyegaze.html.
“Eye tracking”; Wikipedia.com; bearing dates of Dec. 2006 and Sep. 13, 2007; pp. 1-5; Wikimedia Foundation, Inc.; USA; printed on Sep. 19, 2007; located at http://en.wikipedia.org/wiki/Eye_tracking.
“Happy, sad, angry or astonished?”; Physorg.com; Jul. 3, 2007; pp. 1-2; Physorg.com; printed on Sep. 19, 2007; located at http://www.physorg.com/news102696772.html.
Kim, Kyung-Nam; Ramakrishna, R.S.; “Vision-Based Eye-Gaze Tracking for Human Computer Interface”; IEEE; 1999; pp. 324-329; IEEE.
Mao, Xiaoyang, et al.; “Gaze-Directed Flow Visualization”; Proc. of SPIE-IS&T Electronic Imaging; bearing a date of 2004; pp. 141-150; vol. 5295; SPIE and IS&T.
“MyTobii 2.3 means new power to communicate”; tobii.com; bearing a date of 2006; p. 1; Tobii Technology AB; printed on Sep. 19, 2007; located at http://www.tobii.com/default.asp?sid=1220.
MyTobii User Manual, Version 2.3; tobii.com; bearing dates of Apr. 2007 and 2006; pp. 1-86; Tobii Technology AB.
Ohshima, Toshikazu et al.; “Gaze-directed Adaptive Rendering for Interacting with Virtual Space” (abstract only); Proceedings of the 1996 Virtual Reality Annual International Symposium (VRAIS 96); 1996; p. 103 (pp. 1-3 provided); ISBN:0-8186-7295-1; IEEE Computer Society; Washington, DC; USA; printed on Oct. 2, 2007; located at http://portal.acm.org/citation.cfm?id=836033&coll=Portal&dl=GUIDE&CFID=1534099&CFTOKEN=93189133.
