Segmentation unit for and method of determining a second...

Image analysis – Applications – Motion or velocity measuring

Reexamination Certificate


Details

Classifications: C382S103000, C382S106000, C382S236000
Type: Reexamination Certificate
Status: active
Patent number: 07120277

ABSTRACT:
A method of determining segments in a series of images based on previous segmentation results and on motion estimation. A second segment (108) of a second image (106) is determined based on a first segment (102) of a first image (100), with the first segment (102) and the second segment (108) corresponding to one object (104). The method comprises a calculation step that compares pixel values of the first image (100) and the second image (106) to calculate a motion model defining the transformation of the first segment (102) into the second segment (108). One example of such a motion model allows translation only; a translation can be described with a single motion vector (110).
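The translation-only motion model in the abstract can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the function names, the sum-of-absolute-differences cost, and the exhaustive search window are assumptions chosen to show how a single motion vector can both be estimated from pixel comparisons and used to propagate a segment from the first image to the second.

```python
import numpy as np

def estimate_translation(first, second, segment_mask, search_range=4):
    """Estimate one translational motion vector (dy, dx) for the pixels
    inside `segment_mask` by minimizing the sum of absolute differences
    (SAD) between the first and second images over a small search window."""
    best_vec, best_sad = (0, 0), np.inf
    ys, xs = np.nonzero(segment_mask)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            ty, tx = ys + dy, xs + dx
            # Skip candidate vectors that move the segment off the image.
            if (ty.min() < 0 or tx.min() < 0 or
                    ty.max() >= second.shape[0] or tx.max() >= second.shape[1]):
                continue
            sad = np.abs(second[ty, tx] - first[ys, xs]).sum()
            if sad < best_sad:
                best_sad, best_vec = sad, (dy, dx)
    return best_vec

def propagate_segment(segment_mask, motion_vec):
    """Derive the second segment by translating the first segment's mask
    by the estimated motion vector (assumes the result stays in-bounds)."""
    dy, dx = motion_vec
    second_mask = np.zeros_like(segment_mask)
    ys, xs = np.nonzero(segment_mask)
    second_mask[ys + dy, xs + dx] = True
    return second_mask
```

For example, a 4x4 object at rows 4-7, columns 4-7 of the first image that appears at rows 6-9, columns 5-8 of the second yields the motion vector (2, 1), and translating the first segment's mask by that vector reproduces the object's position in the second image.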

REFERENCES:
patent: 4862260 (1989-08-01), Harradine et al.
patent: 5410358 (1995-04-01), Shackleton et al.
patent: 5646691 (1997-07-01), Yokoyama
patent: 5991428 (1999-11-01), Taniguchi
patent: 6525765 (2003-02-01), Brett et al.
patent: 6560372 (2003-05-01), Kadono
patent: 6809758 (2004-10-01), Jones
patent: 2003/0072482 (2003-04-01), Brand


Profile ID: LFUS-PAI-O-3713055
