System and process for detecting, tracking and counting...

Image analysis – Applications – Target tracking or detecting


Details

Type: Reexamination Certificate

Status: active

Patent number: 07965866
ABSTRACT:
A method of identifying, tracking, and counting human objects of interest based upon at least one pair of stereo image frames taken by at least one image capturing device, comprising the steps of:
- obtaining said stereo image frames and converting each said stereo image frame to a rectified image frame using calibration data obtained for said at least one image capturing device;
- generating a disparity map based upon a pair of said rectified image frames;
- generating a depth map based upon said disparity map and said calibration data;
- identifying the presence or absence of said objects of interest from said depth map and comparing each of said objects of interest to existing tracks comprising previously identified objects of interest;
- for each said presence of an object of interest, adding said object of interest to one of said existing tracks if said object of interest matches said one existing track, or creating a new track comprising said object of interest if said object of interest does not match any of said existing tracks;
- updating each said existing track; and
- maintaining a count of said objects of interest in a given time period based upon said existing tracks created or modified during said given time period.
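The first three steps of the abstract describe a conventional stereo front end: rectify each frame with calibration data, compute a disparity map from the rectified pair, then convert disparity to depth. The sketch below shows one standard way to realize those steps with OpenCV; the calibration inputs (K1, D1, K2, D2, R, T) and the block-matching parameters are illustrative assumptions, not values taken from the patent.

```python
# A minimal sketch (OpenCV/Python) of the rectify -> disparity -> depth
# pipeline described in the abstract. The calibration inputs (K1, D1,
# K2, D2, R, T) and the SGBM parameters are illustrative assumptions,
# not values from the patent.
import cv2
import numpy as np

def depth_from_stereo(left, right, K1, D1, K2, D2, R, T):
    h, w = left.shape[:2]
    # Rectification transforms derived from the calibration data.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, (w, h), R, T)
    m1x, m1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, (w, h), cv2.CV_32FC1)
    m2x, m2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, (w, h), cv2.CV_32FC1)
    left_r = cv2.remap(left, m1x, m1y, cv2.INTER_LINEAR)
    right_r = cv2.remap(right, m2x, m2y, cv2.INTER_LINEAR)
    # Disparity map from the rectified pair (semi-global block matching;
    # numDisparities must be a multiple of 16).
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disparity = sgbm.compute(left_r, right_r).astype(np.float32) / 16.0
    # Depth map: reproject disparity to 3-D; the Z channel is the depth.
    depth = cv2.reprojectImageTo3D(disparity, Q)[:, :, 2]
    return depth
```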
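Downstream, each detection taken from the depth map is compared against existing tracks: a match extends a track, a miss opens a new one, and the count for a period is the number of tracks created or modified during it. The abstract does not specify the matching rule, so the sketch below substitutes a simple greedy nearest-neighbour gate as a stand-in; the names (Track, update_tracks, max_dist) are assumptions for illustration only.

```python
# An illustrative sketch of the track-matching and counting step; the
# nearest-neighbour gate is a stand-in for the patent's unspecified rule.
from dataclasses import dataclass, field

@dataclass(eq=False)  # eq=False: compare tracks by identity, not contents
class Track:
    positions: list = field(default_factory=list)  # (x, y) centroid history

def update_tracks(tracks, detections, max_dist=50.0):
    """Match each detection to the nearest existing track within max_dist
    pixels; unmatched detections open new tracks. Returns the tracks
    created or modified, which is what the period count is based on."""
    touched = []
    for det in detections:
        best, best_d = None, max_dist
        for trk in tracks:
            last = trk.positions[-1]
            d = ((last[0] - det[0]) ** 2 + (last[1] - det[1]) ** 2) ** 0.5
            if d < best_d:
                best, best_d = trk, d
        if best is not None:
            best.positions.append(det)   # matched: extend the existing track
        else:
            best = Track([det])          # no match: create a new track
            tracks.append(best)
        if best not in touched:
            touched.append(best)
    return touched

# Maintaining the count: over a given time period, the count of objects
# of interest is the number of distinct tracks created or modified.
```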

REFERENCES:
patent: 4916621 (1990-04-01), Bean et al.
patent: 5973732 (1999-10-01), Guthrie
patent: 6445810 (2002-09-01), Darrell et al.
patent: 6674877 (2004-01-01), Jojic et al.
patent: 6697104 (2004-02-01), Yakobi et al.
patent: 6771818 (2004-08-01), Krumm et al.
patent: 6952496 (2005-10-01), Krumm
patent: 7003136 (2006-02-01), Harville
patent: 7092566 (2006-08-01), Krumm
patent: 7161482 (2007-01-01), Rider et al.
patent: 7176441 (2007-02-01), Sumitomo et al.
patent: 7227893 (2007-06-01), Srinivasa et al.
patent: 7400744 (2008-07-01), Nichani et al.
patent: 2005/0249382 (2005-11-01), Schwab et al.
patent: 2006/0088191 (2006-04-01), Zhang et al.
patent: 2006/0210117 (2006-09-01), Chang et al.
patent: 2008/0285802 (2008-11-01), Bramblet et al.
patent: 2009/0195371 (2009-08-01), Camus
patent: WO 2009/004479 (2009-01-01), None
Republication of WO 2009/004479 A3 with International Search Report, Jan. 8, 2009.
Ismail Haritaoglu et al., W4: Who? When? Where? What? A Real Time System for Detecting and Tracking People, 3rd International Conference on Face and Gesture Recognition, Apr. 14-16, 1998, Nara, Japan; pp. 1-6.
Michael Isard et al., Contour Tracking by Stochastic Propagation of Conditional Density, In Proc. European Conf. on Computer Vision, 1996, pp. 343-356, Cambridge, UK.
Paolo Remagnino et al., Correlation Techniques in Adaptive Template Matching With Uncalibrated Cameras, LIFIA-INRIA Rhône-Alpes, Nov. 2, 1994.
Christopher Eveland et al., Background Modeling for Segmentation of Video-Rate Stereo Sequences, Jun. 23-25, 1998.
Christopher Richard Wren et al., Pfinder: Real-Time Tracking of the Human Body, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, No. 7, Jul. 1997; pp. 780-785.
T. Darrell et al., Integrated Person Tracking Using Stereo, Color, and Pattern Detection, pp. 1-8, Jun. 23-25, 1998.
International Search Report dated Feb. 12, 2009.
