Approach for resolving occlusions, splits and merges in...

Image analysis – Image segmentation

Reexamination Certificate


Details

- Classification: C382S195000
- Type: Reexamination Certificate
- Status: active
- Application Number: 08086036

ABSTRACT:
A solution for resolving an occlusion in a video image. It provides an environment in which portions of a video image where occlusions have occurred can be identified and analyzed to determine the type of occlusion. Regions of the video image may also be analyzed to determine which object in the occlusion each region belongs to. These determinations may use factors such as pre-determined attributes of an object, for example its color or texture, and/or a temporal association of the object, among others.
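The abstract does not disclose a specific algorithm, but the idea of assigning a region to one of the occluding objects based on a pre-determined appearance attribute can be sketched as follows. This is a minimal illustration, not the patented method: it assumes each tracked object keeps a reference color histogram as its appearance model, and assigns a region to the object whose histogram it matches best (histogram intersection). The function names and the choice of 8 bins per channel are illustrative.

```python
import numpy as np

def color_histogram(pixels, bins=8):
    """Quantize RGB pixels (N x 3 array, values 0-255) into a normalized color histogram."""
    # Map each channel value to a bin index, then combine the three
    # indices into a single bin id per pixel.
    idx = (pixels // (256 // bins)).astype(int)
    flat = idx[:, 0] * bins * bins + idx[:, 1] * bins + idx[:, 2]
    hist = np.bincount(flat, minlength=bins ** 3).astype(float)
    return hist / hist.sum()

def assign_region(region_pixels, object_models):
    """Assign a region to the object whose appearance model it matches best.

    object_models: dict mapping object id -> reference color histogram.
    Similarity is measured by histogram intersection (sum of bin-wise minima).
    """
    h = color_histogram(region_pixels)
    scores = {oid: np.minimum(h, ref).sum() for oid, ref in object_models.items()}
    return max(scores, key=scores.get)
```

In practice such a color cue would be combined with others mentioned in the abstract, e.g. texture or the temporal association of the object across frames.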

REFERENCES:
patent: 6542621 (2003-04-01), Brill et al.
patent: 6865289 (2005-03-01), Berestov
patent: 6950123 (2005-09-01), Martins
patent: 7778445 (2010-08-01), Au
patent: 2004/0252763 (2004-12-01), Mertens
patent: 2007/0002058 (2007-01-01), Wittebrood
Senior, Andrew et al. “Appearance Models for Occlusion Handling”. http://www.research.ibm.com/peoplevision/PETS2001.pdf, 2001.
Senior, Andrew. “Tracking with Probabilistic Appearance Models”. http://www.research.ibm.com/peoplevision/PETS2002.pdf, 2002.


