Method and apparatus for approximating a contour image of an object

Image analysis – Image compression or coding – Contour or chain coding

Patent


Details

U.S. Cl. 382/199; Int. Cl. G06K 9/00


active

058287906

ABSTRACT:
A method for approximating a contour of an object expressed in a digital video signal divides the contour into a multiplicity of primary contour segments and approximates each primary contour segment by a primary line segment, thereby calculating a set of errors between the primary contour segment and the primary line segment for each primary contour segment. The method then codes and decodes the set of errors, thereby generating a set of reconstructed errors and a reconstructed contour segment, which are used to determine a reconstruction error. Thereafter, the method generates one or more secondary contour segments on the primary contour segment, with each secondary contour segment approximated by a secondary line segment. Subsequently, the method finds an approximation error and approximates each primary contour segment by using either the reconstructed contour segment or the secondary line segments, based on a comparison of the reconstruction error and the approximation error.
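The split-and-approximate step in the abstract — approximate a contour segment by the line joining its endpoints, measure the error, and subdivide where the error is too large — can be illustrated with a recursive sketch in the style of the Ramer-Douglas-Peucker algorithm. This is not the patented method itself (the coding/decoding of errors and the reconstruction-error comparison are not modeled); all names are illustrative assumptions.

```python
# Illustrative sketch only: error-driven polygonal approximation of a contour.
# A "primary" segment is the line joining points[i] and points[j]; if the worst
# perpendicular error exceeds max_error, the contour segment is split at the
# point of maximum error into "secondary" segments, which are refined in turn.

def point_line_error(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
    return abs(dy * (px - ax) - dx * (py - ay)) / (dx * dx + dy * dy) ** 0.5

def approximate_contour(points, max_error):
    """Return indices of the vertices of a polygonal approximation of `points`."""
    def refine(i, j):
        # Find the contour point farthest from the line segment points[i]-points[j].
        worst, worst_err = i, 0.0
        for k in range(i + 1, j):
            err = point_line_error(points[k], points[i], points[j])
            if err > worst_err:
                worst, worst_err = k, err
        if worst_err <= max_error:
            return [i, j]                   # line segment approximates well enough
        left = refine(i, worst)             # split into secondary contour segments
        return left[:-1] + refine(worst, j)
    return refine(0, len(points) - 1)
```

For example, approximating the arc `[(0,0), (1,2), (2,3), (3,2), (4,0)]` with `max_error=0.5` keeps only the endpoints and the apex, while a tighter tolerance retains more vertices.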

REFERENCES:
U.S. Patent 5,635,986 (1997-06-01), Kim
U.S. Patent 5,737,449 (1998-04-01), Lee


