Methods and interfaces for event timeline and logs of video...

Data processing: presentation processing of document, operator interface processing, and screen saver display processing – Operator interface – On screen video or audio system interface


Details

US Classifications: C715S719000, C348S143000

Type: Reexamination Certificate

Status: active

Patent Number: 07996771

ABSTRACT:
Techniques for generating timelines and event logs from one or more fixed-position cameras based on the identification of activity in the video are presented. Various embodiments of the invention include an assessment of the importance of the activity, the creation of a timeline identifying events of interest, and interaction techniques for seeing more details of an event or alternate views of the video. In one embodiment, motion detection is used to determine activity in one or more synchronized video streams. In another embodiment, events are determined based on periods of activity and assigned importance assessments based on the activity, important locations in the video streams, and events from other sensors. In different embodiments, the interface consists of a timeline, event log, and map.
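
As a rough illustration of the pipeline the abstract describes, the sketch below (Python with OpenCV, assumed available) estimates per-frame activity by simple frame differencing, groups contiguous active frames into events, and scores each event's importance by its accumulated motion. This is a minimal reading of the abstract, not the claimed implementation: the function names, thresholds, and the placeholder file camera1.mp4 are assumptions, and it omits the patent's use of important locations, other sensors, synchronized multi-camera streams, and the timeline/event-log/map interface.

```python
import cv2
import numpy as np

# Illustrative thresholds; the patent does not specify values.
MOTION_THRESHOLD = 25    # per-pixel intensity change counted as motion
ACTIVITY_RATIO = 0.01    # fraction of changed pixels for a frame to be "active"
MIN_GAP_FRAMES = 30      # shorter inactive gaps are merged into the same event


def per_frame_activity(video_path):
    """Yield (frame_index, activity_ratio) using simple frame differencing."""
    cap = cv2.VideoCapture(video_path)
    prev_gray = None
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev_gray is not None:
            diff = cv2.absdiff(gray, prev_gray)
            changed = np.count_nonzero(diff > MOTION_THRESHOLD)
            yield idx, changed / diff.size
        prev_gray = gray
        idx += 1
    cap.release()


def segment_events(activity):
    """Group contiguous active frames into (start, end, importance) events."""
    events = []
    start = last_active = None
    score = 0.0
    for idx, ratio in activity:
        if ratio >= ACTIVITY_RATIO:
            if start is None:
                start, score = idx, 0.0
            last_active = idx
            score += ratio  # importance grows with accumulated motion
        elif start is not None and idx - last_active > MIN_GAP_FRAMES:
            events.append((start, last_active, score))
            start = None
    if start is not None:
        events.append((start, last_active, score))
    return events


if __name__ == "__main__":
    # "camera1.mp4" stands in for one fixed-position camera's recording.
    events = segment_events(per_frame_activity("camera1.mp4"))
    # A timeline or event log could list events in order of importance.
    for start, end, score in sorted(events, key=lambda e: e[2], reverse=True):
        print(f"frames {start}-{end}: importance {score:.2f}")
```

In such a setup, the resulting (start, end, importance) tuples are what a timeline or event log would render, with the importance score deciding which events are highlighted or listed first.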

REFERENCES:
patent: 5136655 (1992-08-01), Bronson
patent: 5655058 (1997-08-01), Balasubramanian et al.
patent: 5680558 (1997-10-01), Hatanaka et al.
patent: 5708767 (1998-01-01), Yeo et al.
patent: 6366296 (2002-04-01), Boreczky et al.
patent: 6535639 (2003-03-01), Uchihachi et al.
patent: 6570608 (2003-05-01), Tserng
patent: 6807361 (2004-10-01), Girgensohn et al.
patent: 7143083 (2006-11-01), Carlbom et al.
patent: 7221366 (2007-05-01), Uyttendaele
patent: 2003/0025599 (2003-02-01), Monroe
patent: 2003/0044045 (2003-03-01), Schoepflin
patent: 2003/0090505 (2003-05-01), McGee et al.
patent: 2003/0161396 (2003-08-01), Foote et al.
patent: 2003/0189588 (2003-10-01), Girgensohn et al.
patent: 2003/0197731 (2003-10-01), Chiu et al.
patent: 2003/0234803 (2003-12-01), Toyama et al.
patent: 2004/0119819 (2004-06-01), Aggarwal et al.
patent: 2004/0240542 (2004-12-01), Yeredor et al.
patent: 2005/0122397 (2005-06-01), Henson et al.
patent: 2005/0132414 (2005-06-01), Bentley et al.
patent: 2005/0163346 (2005-07-01), van den Bergen et al.
Demirdjian et al., “Activity Maps for Location-Aware Computing,” In Proceedings of IEEE Workshop on Applications of Computer Vision (WACV2002), Orlando, Florida (Dec. 2002).
Larson et al., “An exploratory look at supermarket shopping paths,” The Wharton School, The University of Pennsylvania, Philadelphia, http://www.searchlores.org/realicra/PT_1006.pdf (Dec. 9, 2005).
Pingali et al., “Multimedia Retrieval through Spatio-Temporal Activity Maps,” Proceedings of the ACM Multimedia, pp. 129-136, Ottawa (Sep. 2001).
Porikli, F., “Multi-Camera Surveillance: Object-Based Summarization Approach,” Mitsubishi Electric Research Laboratories, Inc., https://www.merl.com/reports/docs/TR2003-145.pdf (Mar. 2004).
Santini, S., “Analysis of traffic flow in urban areas using web cameras,” Fifth IEEE Workshop on Applications of Computer Vision (WACV 2000), Palm Springs, CA (Dec. 2000).
Stauffer and Grimson, “Learning Patterns of Activity Using Real-Time Tracking,” IEEE Trans. Pattern Anal. Mach. Intell. 22(8): 747-757 (2000), http://people.csail.mit.edu/welg/papers/learning2000.pdf.
Xiang and Gong, “Activity based video content trajectory representation and segmentation,” In Proc. British Machine Vision Conference (BMVC), pp. 177-186, Kingston, U.K. (Sep. 2004), http://www.dcs.qmul.ac.uk/~txiang/xiang_gong_bmvc04_segment_camera_ready.pdf.
“Tag Team: Tracking the Patterns of Supermarket Shoppers,” Knowledge@Wharton, The Wharton School of the University of Pennsylvania (Jun. 1, 2005). http://knowledge.wharton.upenn.edu/articlepdf/1208.pdf?CFID=36967030&CFTOKEN=4134849&jsessionid=9a3085e52c58255c797c.
Topic 6, “Analyzing In-Store Shopping Patterns,” Map Analysis, http://www.innovativegis.com/basis/MapAnalysis/Topic6/Topic6.pdf (accessed Sep. 11, 2007).
Kumar V., et al., “Metadata Visualization for Digital Libraries: Interactive Timeline Editing and Review,” Proceedings of the 3rd ACM Conference on Digital Libraries, pp. 126-133 (1998).
Yeung, M., et al., “Video Visualization for Compact Presentation and Fast Browsing of Pictorial Content,” IEEE Trans. Circuits and Sys. for Video Tech., vol. 7, no. 5, pp. 771-785 (Oct. 1997).
Boreczky, J., et al., “An Interactive Comic Book Presentation for Exploring Video,” FX Palo Alto Laboratory, Inc. (1999).
Cheung, S., et al., “Robust Techniques for Background Subtraction in Urban Traffic Video,” Center for Applied Scientific Computing (2004).
Girgensohn A., et al., “A Semi-Automatic Approach to Home Video Editing.” In Proceedings of UIST '00, ACM Press, pp. 81-89 (2000).
Plaisant, C., et al., “Lifelines: Visualizing Personal Histories,” University of Maryland, http://www.cs.umd.edu/projects/hcil (at least as early as Jun. 14, 2005).
Wildemuth, B., “How Fast is Too Fast? Evaluating Fast Forward Surrogates for Digital Video,” Interaction Design Laboratory, Proceedings of the 3rd ACM/IEEE-CS Joint Conference on Digital Libraries, University of North Carolina at Chapel Hill, pp. 221-230 (2003).
Zivkovic, Z., “Improved Adaptive Gaussian Mixture Model for Background Subtraction,” Intelligent and Autonomous Systems Group, University of Amsterdam, The Netherlands (2004).

