Method and system for correcting image position based upon prede

Image analysis – Pattern recognition – Feature extraction

Patent


Details

Classes: 382/202, 382/286, G06K 9/46

Type: Patent

Status: active

Number: 060185937

ABSTRACT:
An optical character recognition method and system adjusts positional, rotational and scaling differences between an input form image and a standard form image, based upon selected corresponding portions of the two images, prior to optically recognizing characters in predetermined areas of the input form. The corresponding portions include a predetermined mark such as a cross-line portion and/or predetermined characters, and a selection process eliminates unlikely pairs of these corresponding portions based upon a distribution-histogram analysis.
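The abstract outlines two technical steps: rejecting unlikely pairs of corresponding marks via a distribution-histogram analysis, and computing the positional, rotational and scaling adjustment from the surviving pairs. The sketch below is only an illustration of that general idea, not the patented procedure: it assumes candidate point pairs have already been extracted, filters them with a histogram of displacement vectors, and fits a least-squares similarity transform (a common technique, not named in the patent). All function names, parameters and the synthetic data are hypothetical.

import numpy as np

def filter_pairs_by_histogram(std_pts, in_pts, bin_size=16.0):
    # Displacement of each candidate pair (input point minus standard point).
    disp = in_pts - std_pts
    # Quantise displacements into coarse 2-D bins and keep only the pairs that
    # fall in the most populated bin; isolated displacements are treated as
    # false correspondences.
    bins = np.floor(disp / bin_size).astype(int)
    keys, counts = np.unique(bins, axis=0, return_counts=True)
    dominant = keys[np.argmax(counts)]
    keep = np.all(bins == dominant, axis=1)
    return std_pts[keep], in_pts[keep]

def estimate_similarity(std_pts, in_pts):
    # Least-squares similarity transform (Umeyama-style) such that
    # scale * R @ input_point + t  ~=  standard_point.
    mu_s, mu_i = std_pts.mean(axis=0), in_pts.mean(axis=0)
    cs, ci = std_pts - mu_s, in_pts - mu_i
    cov = cs.T @ ci / len(std_pts)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(2)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[1, 1] = -1.0  # avoid a reflection
    R = U @ S @ Vt
    scale = np.trace(np.diag(D) @ S) / ci.var(axis=0).sum()
    t = mu_s - scale * R @ mu_i
    return scale, R, t

if __name__ == "__main__":
    # Synthetic check: the "input form" is the standard form rotated by
    # 0.5 degrees, scaled by 1.005 and shifted, plus one bad correspondence.
    rng = np.random.default_rng(0)
    std = rng.uniform(0, 500, size=(20, 2))
    theta = np.deg2rad(0.5)
    R_true = np.array([[np.cos(theta), -np.sin(theta)],
                       [np.sin(theta),  np.cos(theta)]])
    inp = 1.005 * std @ R_true.T + np.array([12.0, -7.0])
    inp[0] += 150.0  # simulate one unlikely (false) pair
    std_kept, inp_kept = filter_pairs_by_histogram(std, inp)
    scale, R, t = estimate_similarity(std_kept, inp_kept)
    print("pairs kept:", len(std_kept), "of", len(std))
    print("estimated scale:", scale)        # roughly 1/1.005
    print("estimated translation:", t)

The histogram step works because genuine correspondences on a nearly aligned form share almost the same displacement and pile up in one bin, while false pairs scatter across other bins and are discarded before the transform is estimated.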

REFERENCES:
patent: 4777651 (1988-10-01), McCann et al.
patent: 4849679 (1989-07-01), Taft et al.
patent: 4910787 (1990-03-01), Umeda et al.
patent: 5065438 (1991-11-01), Hirose et al.
patent: 5075895 (1991-12-01), Bessho
patent: 5233670 (1993-08-01), Dufour et al.
patent: 5561721 (1996-10-01), Mutz
patent: 5572603 (1996-11-01), Koike
patent: 5621811 (1997-04-01), Roder et al.
patent: 5703963 (1997-12-01), Kojima et al.
patent: 5774584 (1998-06-01), Matsumoto et al.


Profile ID: LFUS-PAI-O-2321603
