Methods and systems for video content browsing

Data processing: presentation processing of document – Operator interface – On screen video or audio system interface

Details

Type: Reexamination Certificate

Status: Active

Patent number: 07552387

ABSTRACT:
Methods and systems for browsing video content are described. Video content is accessed. Metadata is generated from a content analysis performed on the video content. A portion of the video content, chosen based on the generated metadata, is presented for display.
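
The abstract outlines a three-step pipeline: access video content, run a content analysis that yields metadata, then choose a portion of the video to display from that metadata. The Python sketch below illustrates one plausible reading of that pipeline; the abstract specifies no API, so every name here (Segment, analyze_content, select_for_display) and the score-threshold selection rule are hypothetical, not taken from the patent.

from dataclasses import dataclass

@dataclass
class Segment:
    start_s: float  # segment start time, in seconds
    end_s: float    # segment end time, in seconds
    score: float    # relevance score assigned by the content analysis

def analyze_content(video_path: str) -> list[Segment]:
    # Placeholder content analysis: a real implementation might detect
    # shot boundaries, key frames, or audio events and score each segment.
    return [Segment(0.0, 10.0, 0.9), Segment(10.0, 25.0, 0.4)]

def select_for_display(metadata: list[Segment], threshold: float = 0.5) -> list[Segment]:
    # Choose the portion of the video to present, based on the metadata.
    return [s for s in metadata if s.score >= threshold]

if __name__ == "__main__":
    metadata = analyze_content("example.mp4")   # 1. access content, generate metadata
    portion = select_for_display(metadata)      # 2. select a portion from the metadata
    for seg in portion:                         # 3. present for display (stdout here)
        print(f"show {seg.start_s:.1f}s-{seg.end_s:.1f}s (score {seg.score:.2f})")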

REFERENCES:
patent: 5353391 (1994-10-01), Cohen et al.
patent: 2002/0069218 (2002-06-01), Sull et al.
patent: 2003/0033147 (2003-02-01), McCartney et al.
patent: 2003/0137522 (2003-07-01), Kaasila et al.
patent: 2003/0195882 (2003-10-01), Lee et al.
patent: 2004/0015490 (2004-01-01), Snyder et al.
patent: 2004/0125124 (2004-07-01), Kim et al.
patent: 0915471 (1999-05-01), None
patent: 1109111 (2001-06-01), None
patent: 1132835 (2001-09-01), None
patent: WO98/52356 (1998-11-01), None
patent: WO01/41451 (2001-06-01), None
Aiken et al., Microsoft Computer Dictionary, Fifth Edition, Microsoft Press (2002), p. 336.
Zhong, Di et al., "Clustering Methods for Video Browsing and Annotation," Proceedings of SPIE, vol. 2670, Feb. 1, 1996, pp. 239-246.

