Method of processing image information based on object model

Image analysis – Pattern recognition – Feature extraction

Patent


Details

Classification: 382/154, 345/420, G06K 9/00, G06K 9/48, G06T 17/10

Type: Patent

Status: active

Patent number: 058870830

ABSTRACT:
A method of processing image information is disclosed for recognizing the position and attitude of an object at high speed and with improved recognition accuracy. The method comprises the steps of: entering a stereoscopic image of an object; extracting the edges of the object and dividing them into segments based on local features of the image; adding apex information to the segments to produce recognition data; verifying the recognition data against an object model based on local geometric features to detect corresponding candidates; finely adjusting each corresponding candidate based on the entire geometric features; and detecting the position and attitude of the object on the basis of the recognition, including the initial verification and the fine adjustment.
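The fine-adjustment step described above amounts to refining a candidate match so that the object's overall geometry lines up with the features extracted from the image. The patent text does not include code, so the sketch below is only an illustrative stand-in under assumed inputs: it uses a standard SVD-based least-squares rigid alignment (the Kabsch method) between corresponding 3-D model and scene points, and the function name refine_pose and the toy data are hypothetical.

```python
import numpy as np


def refine_pose(model_pts, scene_pts):
    """Least-squares rigid alignment (rotation R, translation t) mapping
    model_pts onto scene_pts; a stand-in for refining a candidate
    correspondence using the object's overall geometry."""
    model_c = model_pts.mean(axis=0)
    scene_c = scene_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - model_c).T @ (scene_pts - scene_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = scene_c - R @ model_c
    residual = np.linalg.norm(scene_pts - (model_pts @ R.T + t), axis=1).mean()
    return R, t, residual


if __name__ == "__main__":
    # Toy usage: a synthetic "model" rotated and shifted into a "scene".
    rng = np.random.default_rng(0)
    model = rng.random((30, 3))
    angle = np.deg2rad(25.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    scene = model @ R_true.T + np.array([0.5, -0.2, 1.0])
    R, t, err = refine_pose(model, scene)
    print("estimated rotation:\n", R)
    print("estimated translation:", t, "mean residual:", err)
```

In the patented method the candidate correspondences would come from the local-feature verification stage; here they are simulated with a known rotation and translation so the refinement can be checked against ground truth.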

REFERENCES:
"Qualitative and quantitive matching of solid models and images of 3D objs," Masahiko Koizumi et al., 9th International Conf. on Pattern Recognition, IEEE comput. Soc. Press, vol. pp. 681-684, 1988.


