Pattern evaluation method, pattern evaluation system and...

Image analysis – Pattern recognition – Feature extraction

Reexamination Certificate


Details

C250S559220, C250S559360, C382S144000, C382S199000, C382S203000


active

06985626

ABSTRACT:
A pattern evaluation method includes processing image data of at least one pattern serving as an object to be evaluated and detecting coordinates of edge points of the pattern in an image of the image data; making pairs of edge points from the edge points of the pattern; setting an arbitrary axis; calculating, for each pair, the distance between its edge points and the angle between the straight line connecting them and the axis; preparing a distance/angle distribution map that represents the distribution of the distances and angles over the pairs of edge points; extracting a characteristic point of the distance/angle distribution map; and analyzing the pattern on the basis of the extracted characteristic point.
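The steps in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: the function names, the binning parameters, and the use of a simple histogram peak as the "characteristic point" are all assumptions made for the sketch; edge detection itself is taken as already done.

```python
import numpy as np

def distance_angle_map(edges, axis_angle=0.0, d_bins=32, a_bins=36, d_max=None):
    """Build a distance/angle distribution map from detected edge points.

    edges      : (N, 2) array-like of (x, y) edge-point coordinates.
    axis_angle : orientation (radians) of the arbitrary reference axis.
    Returns the 2D histogram and its distance/angle bin edges.
    """
    pts = np.asarray(edges, dtype=float)
    # All unordered pairs of edge points.
    i, j = np.triu_indices(len(pts), k=1)
    d = pts[j] - pts[i]
    dist = np.hypot(d[:, 0], d[:, 1])
    # Angle between the connecting line and the reference axis,
    # folded into [0, pi) since the line is undirected.
    ang = (np.arctan2(d[:, 1], d[:, 0]) - axis_angle) % np.pi
    if d_max is None:
        d_max = dist.max()
    hist, d_edges, a_edges = np.histogram2d(
        dist, ang, bins=(d_bins, a_bins), range=((0, d_max), (0, np.pi)))
    return hist, d_edges, a_edges

def characteristic_point(hist, d_edges, a_edges):
    """Extract a characteristic point: here, the strongest histogram peak."""
    r, c = np.unravel_index(np.argmax(hist), hist.shape)
    return (0.5 * (d_edges[r] + d_edges[r + 1]),   # characteristic distance
            0.5 * (a_edges[c] + a_edges[c + 1]))   # characteristic angle

# Example: edge points on two vertical lines; the dominant peak comes from
# the many unit-spaced vertical pairs (distance ~1, angle ~pi/2 to the x-axis).
points = [(x, y) for x in (0, 10) for y in range(10)]
h, de, ae = distance_angle_map(points)
d_char, a_char = characteristic_point(h, de, ae)
```

A repeating pattern (e.g. a line grating) concentrates many edge-point pairs at the same distance and angle, so its pitch and orientation appear as a peak in the map; analyzing the pattern then reduces to locating and interpreting such peaks.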

REFERENCES:
patent: 5872863 (1999-02-01), Tsuboi et al.
patent: 6529258 (2003-03-01), Watanabe et al.
patent: 6549648 (2003-04-01), Rinn
patent: 08-194734 (1996-07-01), None
patent: 11-201919 (1999-07-01), None

