Method and apparatus for merging related image segments

Image analysis – Pattern recognition – Classification

Details

Type: Reexamination Certificate
Status: active
U.S. Classification: C382S164000, C382S173000
Application number: 10692466

ABSTRACT:
One embodiment of the invention relates to a method of merging segments to form supersegments in an image. The image consists of a plurality of segments that are constituent portions of the image. At least one candidate segment is identified, along with at least one neighboring segment for each candidate segment. An error statistic is computed for each pair consisting of a candidate segment and a corresponding neighboring segment. For a given candidate segment, the neighboring segment that yields the smallest error statistic is determined, and a determination is made as to whether that smallest error statistic is small enough to merit merging the corresponding pair of segments. If so, the pair is merged to create one supersegment: a new segment containing all pixels formerly contained in either of the two segments that were merged.
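As a rough illustration of the merge loop described in the abstract, the Python sketch below is one possible reading, not the patented method itself. The segment representation (lists of pixel intensities), the error statistic (absolute difference of segment means), the threshold value, and the names merge_segments, pair_error, and error_threshold are all assumptions introduced for this example; the abstract does not specify them.

```python
from itertools import count

def mean(values):
    return sum(values) / len(values)

def pair_error(pixels_a, pixels_b):
    # Illustrative error statistic: absolute difference of mean intensities.
    # The abstract does not fix a particular statistic; this choice is an assumption.
    return abs(mean(pixels_a) - mean(pixels_b))

def merge_segments(segments, neighbors, error_threshold=10.0):
    """Greedy merge pass over candidate segments (illustrative sketch).

    segments: dict mapping integer segment id -> list of pixel intensities
    neighbors: dict mapping segment id -> set of adjacent segment ids
    error_threshold: a pair is merged only if its smallest error statistic
        falls below this value (the threshold itself is an assumption)
    """
    segments = {sid: list(px) for sid, px in segments.items()}
    neighbors = {sid: set(nb) for sid, nb in neighbors.items()}
    new_ids = count(max(segments) + 1)

    changed = True
    while changed:
        changed = False
        for candidate in list(segments):
            if candidate not in segments:
                continue  # already absorbed into a supersegment this pass
            nbrs = [n for n in neighbors.get(candidate, ()) if n in segments]
            if not nbrs:
                continue
            # Find the neighboring segment with the smallest error statistic.
            errors = {n: pair_error(segments[candidate], segments[n]) for n in nbrs}
            best = min(errors, key=errors.get)
            if errors[best] >= error_threshold:
                continue  # smallest error is not small enough to merit merging
            # Merge the pair: the supersegment holds all pixels formerly
            # contained in either of the two merged segments.
            sid = next(new_ids)
            segments[sid] = segments.pop(candidate) + segments.pop(best)
            merged_nbrs = (neighbors.pop(candidate, set()) |
                           neighbors.pop(best, set())) - {candidate, best}
            neighbors[sid] = merged_nbrs
            for n in merged_nbrs:
                neighbors.setdefault(n, set()).difference_update({candidate, best})
                neighbors[n].add(sid)
            changed = True
    return segments

if __name__ == "__main__":
    # Toy example: segments 0 and 1 have similar means and merge into one
    # supersegment; segment 2 is too different and stays separate.
    segs = {0: [10, 12, 11], 1: [11, 13], 2: [200, 210]}
    adj = {0: {1, 2}, 1: {0}, 2: {0}}
    print(merge_segments(segs, adj, error_threshold=5.0))
```

In this sketch the pass repeats until no candidate has a neighbor whose error falls below the threshold; the particular error statistic and stopping rule are choices the abstract leaves open.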

REFERENCES:
U.S. Patent 6,600,786 (2003-07-01), Prakash et al.
U.S. Patent 6,819,782 (2004-11-01), Imagawa et al.
WO 00/77735 (2000-12-01)

