Method for creating and using affective information in a...

Classification: Image analysis – Image transformation or preprocessing – Image storage or retrieval

Details

U.S. Classifications: C348S333050, C715S838000

Type: Reexamination Certificate

Status: active

Patent number: 7933474

ABSTRACT:
An image file for storing a still digital image together with metadata related to that image. The image file includes digital image data representing the still image and metadata that categorizes it as an important digital image, where the categorization uses a range of levels that includes at least three different integer values.
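For illustration, the claimed structure amounts to an image container that pairs still-image data with an integer-valued importance tag. The Python sketch below shows one way to model it; the names ImportanceLevel and AffectiveImageFile are hypothetical, not the patent's terminology, and an actual file would more likely carry the level inside embedded metadata such as an Exif tag (the Exif 2.1 standard is among the references below).

from dataclasses import dataclass
from enum import IntEnum


class ImportanceLevel(IntEnum):
    # The claim requires a range of levels containing at least three
    # distinct integer values; three is the minimum shown here.
    LOW = 0
    MEDIUM = 1
    HIGH = 2


@dataclass
class AffectiveImageFile:
    image_data: bytes            # digital image data representing the still image
    importance: ImportanceLevel  # metadata categorizing the image's importance

    def metadata(self) -> dict:
        # The metadata block that would travel with the image file,
        # e.g., as an Exif-style tag or an application segment.
        return {"importance": int(self.importance)}


# Example: tag an image (JPEG byte stream abbreviated) as highly important.
photo = AffectiveImageFile(image_data=b"\xff\xd8\xff\xe0", importance=ImportanceLevel.HIGH)
print(photo.metadata())  # prints {'importance': 2}

An IntEnum is used so the stored level is a plain integer, which matches the claim's requirement of at least three distinct integer values while keeping the code readable.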

REFERENCES:
patent: 5019975 (1991-05-01), Mukai
patent: 5550646 (1996-08-01), Hassan et al.
patent: 5666215 (1997-09-01), Fredlund et al.
patent: 5760917 (1998-06-01), Sheridan
patent: 5802220 (1998-09-01), Black et al.
patent: 5832464 (1998-11-01), Houvener et al.
patent: 6004061 (1999-12-01), Manico et al.
patent: 6154772 (2000-11-01), Dunn et al.
patent: 6182133 (2001-01-01), Horvitz
patent: 6241668 (2001-06-01), Herzog
patent: 6629104 (2003-09-01), Parulski et al.
patent: 6721952 (2004-04-01), Guedalia et al.
patent: 2002/0013161 (2002-01-01), Schaeffer et al.
patent: 2002/0097894 (2002-07-01), Staas et al.
patent: 2005/0149572 (2005-07-01), Kanai et al.
patent: 2007/0150389 (2007-06-01), Aamodt et al.
patent: 10143680 (1998-05-01), None
Dana Kirsch, "The Sentic Mouse: Developing a Tool for Measuring Emotional Valence," May 1997, XP-002437625.
R. Cowie et al., "Emotion Recognition in Human-Computer Interaction," IEEE Signal Processing Magazine, vol. 18, no. 1, Jan. 2001, pp. 32-80, XP-011089882.
"Digital Still Camera Image File Format Standard (Exchangeable image file format for digital still cameras: Exif)," Version 2.1, Jun. 12, 1998, XP-002224029.
R. W. Picard, "Affective Computing," Online, 1995, XP-002351025.
R. W. Picard et al., "Modeling User Subjectivity in Image Libraries," Proceedings International Conference on Image Processing, vol. 2, 1996, pp. 777-780, XP-002351026.
R. W. Picard et al., "Affective Wearables," IEEE Comput. Soc., Oct. 13, 1997, pp. 90-97, XP-010251547.
R. W. Picard et al., "Affective Intelligence—The Missing Link?," BT Technology Journal, vol. 14, no. 4, Oct. 1997, pp. 150-160, XP-000722041.
R. W. Picard, "Building HAL: Computers That Sense, Recognize, and Respond to Human Emotion," Proceedings of the SPIE—The International Society for Optical Engineering, vol. 4299, 2001, pp. 518-523, XP-002351027.
R. W. Picard et al., "Toward Machine Emotional Intelligence: Analysis of Affective Physiological State," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 10, Oct. 2001, pp. 1175-1191, XP-002268746.
Rob Roy, DVD from Hollywood Video Rentals, ISBN 0-7928-3366-X, Metro-Goldwyn-Mayer (MGM), © 1995.
Dillon et al., "Aroused and Immersed: The Psychophysiology of Presence," Goldsmiths College, University of London, 2000.
Peter J. Lang et al., "Looking at Pictures: Affective, Facial, Visceral, and Behavioral Reactions," Psychophysiology, vol. 30 (1993), pp. 261-273.
Allan Kuchinsky et al., "FotoFile: A Consumer Multimedia Organization and Retrieval System."
Irfan Essa et al., "Facial Expression Recognition Using a Dynamic Model and Motion Energy," MIT Media Laboratory Perceptual Computing Section Technical Report No. 307, pp. 1-8.
