Method and system for indexing and searching objects of...

Television – Camera – system and detail – Combined image signal generator and general image signal...

Reexamination Certificate


Details

Classification: C348S231300
Type: Reexamination Certificate
Status: active
Patent Number: 07898576

ABSTRACT:
A seed search of a subset of analytical data corresponding to video objects displayable in a plurality of video frames is carried out to identify the video objects that most closely match a selected video object. Complete searches of the analytical data may then be carried out to identify the video objects that most closely match each video object identified during the seed search. The video objects identified the greatest number of times during the complete searches may be displayed by a graphical user interface (GUI). In this way, the GUI may display the video objects in an order based on how closely each video object matches the selected video object and/or a video object identified during the seed search, which may be an order different from an order based on the time when each video object was captured.
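The two-phase procedure the abstract describes can be sketched in a few lines. This is a simplified illustration, not the patented implementation: the object representation (a feature vector per video object), the Euclidean distance metric, and the function and field names (`find_closest`, `rank_video_objects`, `"features"`, `"id"`) are all assumptions made for the example.

```python
from collections import Counter

def find_closest(query, objects, k):
    """Return the k objects whose feature vectors are nearest the query's.

    Euclidean distance over feature vectors is a stand-in for whatever
    matching the analytical data actually supports.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return sorted((o for o in objects if o is not query),
                  key=lambda o: dist(o["features"], query["features"]))[:k]

def rank_video_objects(selected, all_objects, seed_subset, k=3):
    """Two-phase search sketched from the abstract.

    1. Seed search: search only a subset of the data for the objects
       closest to the selected video object.
    2. Complete searches: for each seed match, search all of the data.

    Objects are then ranked by how many times the complete searches
    identified them, rather than by capture time.
    """
    seeds = find_closest(selected, seed_subset, k)
    counts = Counter()
    for seed in seeds:
        for match in find_closest(seed, all_objects, k):
            counts[match["id"]] += 1
    return [obj_id for obj_id, _ in counts.most_common()]
```

The seed search keeps the expensive matching confined to a small subset, while the occurrence count from the complete searches gives the GUI its relevance ordering.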

REFERENCES:
patent: 5568192 (1996-10-01), Hannah
patent: 6954544 (2005-10-01), Jepson et al.
patent: 7636450 (2009-12-01), Bourdev
patent: 2007/0047811 (2007-03-01), Itoh et al.
patent: 2007/0092110 (2007-04-01), Xu et al.
patent: 2007/0098303 (2007-05-01), Gallagher et al.
patent: 2008/0226127 (2008-09-01), Brodsky et al.
patent: 2009/0034791 (2009-02-01), Doretto et al.
patent: 2009/0169168 (2009-07-01), Ishikawa
patent: 2009/0175502 (2009-07-01), Bushell et al.
patent: 0805405 (1997-11-01), None
patent: 1184810 (2002-03-01), None
patent: WO-03067884 (2003-08-01), None
patent: WO-2006097680 (2006-09-01), None
patent: WO-2006097681 (2006-09-01), None
Porikli F. et al.: “Covariance Tracking using Model Update Based on Lie Algebra” Computer Vision and Pattern Recognition, 2006 IEEE Computer Society Conference On New York, NY, USA Jun. 17-22, 2006, Piscataway, NJ, USA IEEE, vol. 1, Jun. 17, 2006, pp. 728-735. ISBN: 978-0-7695-2597-6. Entire Document.
International Search Report for International Application No. PCT/US2008/055245, mailed Feb. 17, 2009.
U.S. Appl. No. 11/680,347, filed Feb. 28, 2007.
RFC Services, Visual Hindsight IP Video Surveillance, downloaded from the World Wide Web at http://www.visualhindsight.com/press_kit.htm on May 14, 2007.
Video Analytics, Video Analytics Terminology, downloaded from the World Wide Web at http://www.videoanalytics.org/pages/terminology.htm on May 4, 2007.
“U.S. Appl. No. 11/680,347, Non-Final Office Action mailed May 24, 2010”, 8 pgs.
“U.S. Appl. No. 11/680,347, Response filed Jul. 21, 2010 to Non Final Office Action mailed May 24, 2010”, 12 pgs.
“U.S. Appl. No. 11/680,347, Response filed Jul. 21, 2010 to Non-Final Office Action w/ Restriction Requirement mailed May 24, 2010”, 13 pgs.
“International Application No. PCT/US2008/055245, Written Opinion mailed Feb. 17, 2009”, 9 pgs.
“U.S. Appl. No. 11/680,347, Final Office Action mailed Aug. 13, 2010”, 7 pgs.
