Method for efficient coding of shape descriptor parameters

Image analysis – Image compression or coding – Shape, icon, or feature-based compression

Reexamination Certificate


Details

US classification: C382S251000
Type: Reexamination Certificate
Status: active
Application number: 10203478

ABSTRACT:
A method of representing an object appearing in a still or video image by processing signals corresponding to the image. The method comprises deriving a plurality of sets of co-ordinate values representing the shape of the object and quantising the co-ordinate values to derive a coded representation of the shape. A first co-ordinate value is quantised over a first quantisation range, and a smaller co-ordinate value is quantised over a correspondingly smaller range.
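The variable-range quantisation the abstract describes can be sketched as follows. This is a minimal illustration, not the patent's actual method: the 6-bit code size, the descending ordering of the co-ordinate values, and the helper names (`encode_peaks`, `decode_peaks`) are all assumptions made for the example. The key idea shown is that each successive (smaller) value is quantised over a range bounded by the reconstructed previous value, so fewer quantisation levels are wasted on impossible values.

```python
def quantise(value, range_max, bits=6):
    """Map value in [0, range_max] to an integer code of `bits` bits."""
    levels = (1 << bits) - 1
    return round(value / range_max * levels) if range_max > 0 else 0

def dequantise(code, range_max, bits=6):
    """Inverse of quantise: reconstruct an approximate value from a code."""
    levels = (1 << bits) - 1
    return code / levels * range_max if range_max > 0 else 0.0

def encode_peaks(values, initial_range=1.0, bits=6):
    """Quantise a descending sequence of co-ordinate values.

    The first value is quantised over `initial_range`; each later
    (smaller) value is quantised over the reconstructed previous
    value, so the quantisation range shrinks with the data.
    """
    codes = []
    range_max = initial_range
    for v in values:
        c = quantise(v, range_max, bits)
        codes.append(c)
        # Track the *reconstructed* value so encoder and decoder
        # use identical ranges at every step.
        range_max = dequantise(c, range_max, bits)
    return codes

def decode_peaks(codes, initial_range=1.0, bits=6):
    """Reconstruct approximate values from codes produced by encode_peaks."""
    values = []
    range_max = initial_range
    for c in codes:
        v = dequantise(c, range_max, bits)
        values.append(v)
        range_max = v  # next code was quantised over this value
    return values

# Example usage with hypothetical normalised co-ordinate values:
codes = encode_peaks([0.9, 0.5, 0.2])
decoded = decode_peaks(codes)
```

Note the design choice of updating `range_max` from the dequantised value rather than the original one: the decoder only sees the codes, so the encoder must mirror the decoder's reconstruction to keep the two in sync.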

REFERENCES:
patent: 5740281 (1998-04-01), Hirai
patent: 0 562 672 (1993-09-01), None
patent: 2 351 826 (2001-01-01), None
F. Mokhtarian et al., "Robust and Efficient Shape Indexing Through Curvature Scale Space," Proc. British Machine Vision Conference, pp. 53-62, Edinburgh, UK, 1996.
F. Mokhtarian et al., "Indexing an Image Database by Shape Content using Curvature Scale Space," Proc. IEE Colloquium on Intelligent Databases, London, 1996.
S. Abbasi et al., "Curvature Scale Space Image in Shape Similarity Retrieval," Multimedia Systems, vol. 7, no. 6, pp. 467-475, 1999.

