Method and apparatus for tracking moving objects in a...

Image analysis – Applications – Target tracking or detecting

Reexamination Certificate


Details

Classification: C382S293000
Document type: Reexamination Certificate
Status: active
Patent number: 06826292

ABSTRACT:

BACKGROUND OF THE INVENTION
The present invention concerns a system and method for tracking moving objects in a sequence of video images and in particular, a system that represents the moving objects in terms of layers and uses the layered representation to track the objects.
Many methods have been proposed to accurately track moving objects in a sequence of two-dimensional images. Most of these methods can track moving objects only when the motion conforms to predefined conditions. For example, change-based trackers ignore any information concerning the appearance of the object in the image and thus have difficulty dealing with moving objects that overlap or come close to overlapping in the sequence of images. Template-based image tracking systems, such as that disclosed in the article by G. Hager et al. entitled “Real-time tracking of image regions with changes in geometry and illumination,” Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 403-410, 1996, typically update only motion. The templates used by these systems can drift off or become attached to other objects of similar appearance. Some template trackers, such as that disclosed in the article by M. J. Black et al. entitled “Tracking and recognizing rigid and non-rigid facial motions using local parametric models of image motion,” Proceedings of the Fifth International Conference on Computer Vision, ICCV '95, pp. 374-381, 1995, use parametric motion (e.g., affine or similarity transformations) to update both the motion and the shape of the template. Because, however, there is no explicit updating of template ownership, drift may still occur. A multiple-hypothesis tracking method, disclosed, for example, in an article by I. J. Cox et al. entitled “An efficient implementation of Reid's multiple hypothesis tracking algorithm and its evaluation for the purpose of visual tracking,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 2, pp. 138-150, February 1996, solves some of these problems, but only when the image sequence is processed off-line in a batch mode. In addition, the computational complexity of these algorithms limits their state representations to contain only motion information.
SUMMARY OF THE INVENTION
The present invention is embodied in a system that tracks one or more moving objects in a sequence of video images. The tracking system employs a dynamic layer representation to represent the objects that are being tracked. This tracking system incrementally estimates the layers in the sequence of video images.
According to one aspect of the invention, the system concurrently estimates three components of the dynamic layer representation—layer segmentation, motion, and appearance—over time in a maximum a posteriori (MAP) framework. In order to enforce a global shape constraint and to maintain the layer segmentation over time, the subject invention imposes a prior constraint on parametric segmentation. In addition, the system uses a generalized Expectation-Maximization (EM) algorithm to compute an optimal solution.
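The alternating estimation described above can be illustrated with a toy generalized-EM loop. This is a minimal one-dimensional sketch, not the patented algorithm: the Gaussian appearance likelihood, uniform layer prior, and blending M-step are our assumptions for illustration only.

```python
import numpy as np

def em_layer_estimate(frame, appearances, noise_var=0.04, n_iters=5):
    """Illustrative generalized-EM pass: alternate between soft layer
    ownership (E-step) and appearance updates (M-step).
    `frame` is a 1-D pixel array; `appearances` is (n_layers, n_pixels)."""
    appearances = appearances.copy()
    for _ in range(n_iters):
        # E-step: posterior ownership of each pixel by each layer,
        # assuming a Gaussian appearance likelihood and a uniform prior.
        resid = frame[None, :] - appearances            # (n_layers, n_pixels)
        log_lik = -0.5 * resid ** 2 / noise_var
        log_lik -= log_lik.max(axis=0, keepdims=True)   # numerical stability
        ownership = np.exp(log_lik)
        ownership /= ownership.sum(axis=0, keepdims=True)
        # M-step: re-estimate each layer's appearance as the
        # ownership-weighted blend of observation and current template.
        appearances = ownership * frame[None, :] + (1 - ownership) * appearances
    return ownership, appearances
```

Pixels a layer clearly explains receive ownership near 1, so that layer's appearance is refreshed from the image there, while pixels belonging to other layers leave it untouched.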
According to one aspect of the invention, the system uses an object state that consists of representations of motion, appearance and ownership masks. With an object state represented as a layer, maximum a posteriori (MAP) estimation in a temporally incremental mode is applied to update the object state for tracking.
According to another aspect of the invention, the system applies a constant appearance model across multiple images in the video stream.
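The combination of a layered object state and a constant appearance model can be sketched as follows. The class layout, the 1-D translation motion model, and the ownership-weighted SSD criterion are our illustrative assumptions, not the patent's exact formulation:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class LayerState:
    """Illustrative per-object layer state: motion, appearance, ownership."""
    position: int            # 1-D translation, for simplicity
    appearance: np.ndarray   # template pixels, held constant across frames
    ownership: np.ndarray    # soft mask over the template support

def update_motion(state, frame, search=3):
    """Motion update under a constant-appearance model: choose the
    translation minimizing the ownership-weighted squared residual,
    i.e., a simple maximum-likelihood criterion."""
    n = state.appearance.size
    best_pos, best_cost = state.position, np.inf
    for d in range(-search, search + 1):
        p = state.position + d
        if p < 0 or p + n > frame.size:
            continue
        resid = frame[p:p + n] - state.appearance
        cost = np.sum(state.ownership * resid ** 2)
        if cost < best_cost:
            best_pos, best_cost = p, cost
    return LayerState(best_pos, state.appearance, state.ownership)
```

Because the appearance is not re-estimated from each new frame, the template cannot gradually drift onto a nearby object of similar appearance.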
According to another aspect of the invention, the system employs a parametric representation of the layer ownership.
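A parametric ownership mask can be illustrated as a smooth function of a few shape parameters, for example a soft elliptical support, instead of a free-form per-pixel mask. The sigmoid form and the parameter names below are our assumptions for illustration:

```python
import numpy as np

def parametric_ownership(xs, ys, cx, cy, sx, sy, sharpness=4.0):
    """Illustrative parametric layer-ownership mask: ownership is a smooth
    function of a few shape parameters (center cx, cy and radii sx, sy of
    an ellipse). `sharpness` controls how quickly ownership falls off at
    the boundary: ownership is near 1 inside the ellipse, near 0 outside."""
    d2 = ((xs - cx) / sx) ** 2 + ((ys - cy) / sy) ** 2
    return 1.0 / (1.0 + np.exp(sharpness * (d2 - 1.0)))
```

Constraining the mask to a handful of parameters enforces the global shape constraint mentioned above: the segmentation cannot fragment arbitrarily from frame to frame.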


REFERENCES:
patent: 5103305 (1992-04-01), Watanabe
patent: 5168530 (1992-12-01), Peregrim et al.
patent: 5323470 (1994-06-01), Kara et al.
patent: 5436672 (1995-07-01), Medioni et al.
patent: 5481669 (1996-01-01), Poulton et al.
patent: 5502804 (1996-03-01), Butterfield et al.
patent: 5511153 (1996-04-01), Azarbayejani et al.
patent: 5557684 (1996-09-01), Wang et al.
patent: 5563988 (1996-10-01), Maes et al.
patent: 5627905 (1997-05-01), Sebok et al.
patent: 5629988 (1997-05-01), Burt et al.
patent: 5657402 (1997-08-01), Bender et al.
patent: 5686960 (1997-11-01), Sussman et al.
patent: 5764803 (1998-06-01), Jacquin et al.
patent: 5768447 (1998-06-01), Irani et al.
patent: 5802220 (1998-09-01), Black et al.
patent: 6035067 (2000-03-01), Ponticos
patent: 6049619 (2000-04-01), Anandan et al.
patent: 6205260 (2001-03-01), Crinon et al.
Black, Michael J. and Yacoob, Yaser, “Tracking and Recognizing Rigid and Non-Rigid Facial Motions using Local Parametric Models of Image Motion”, Proc. Fifth Int. Conf. on Computer Vision, Boston, pp. 374-381, Jun. 1995.
Cox, Ingemar J. and Hingorani, Sunita L., “An Efficient Implementation of Reid's Multiple Hypothesis Tracking Algorithm and Its Evaluation for the Purpose of Visual Tracking”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, No. 2, pp. 138-50, Feb. 1996.
Hager, Gregory D. and Belhumeur, Peter N., “Real-Time Tracking of Image Regions with Changes in Geometry and Illumination”, Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition, pp. 403-410, 1996.
