Comparison inequality function based method for accelerated OCR

Image analysis – Pattern recognition – Template matching

Details

US Class: 382/209; Intl. Class: G06K 9/68
Type: Patent
Status: active
Number: 055598986

ABSTRACT:
A general method of accelerating the classification of an unclassified input symbol S against a library of pre-classified templates employs a comparison inequality function. A correlation calculation is initiated between the input symbol S and templates retrieved from the library, and templates with a low degree of correlation are excluded before the calculation completes, which accelerates the correlation. Sorting by template intensity permits the template foreground pixels to be correlated first, which also accelerates the correlation calculation. A second sort by discrimination power permits critical conflict pixels to be correlated first, further accelerating the calculation.
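
The abstract describes two accelerations: an inequality test that abandons a template once its running score can no longer beat the best score seen so far, and pixel orderings (by template intensity, then by discrimination power) that make losing templates fail that test as early as possible. The Python sketch below is illustrative only: it assumes a squared-error mismatch in place of the patent's correlation measure and uses pixelwise variance across the library as a stand-in for discrimination power; all function and variable names are invented for the example.

    import numpy as np

    def pixel_order(template, discrimination):
        # Visit high-intensity (foreground) template pixels first; break ties by
        # discrimination power so conflict-prone pixels are examined as early as
        # possible. np.lexsort treats its LAST key as the primary sort key.
        t = template.ravel()
        d = discrimination.ravel()
        return np.lexsort((-d, -t))

    def accelerated_classify(symbol, templates, discrimination):
        # Return the index of the best-matching template. A running squared-error
        # mismatch is accumulated pixel by pixel; as soon as it exceeds the best
        # completed score, this template cannot win and is abandoned early.
        s = symbol.ravel().astype(float)
        best_idx, best_mismatch = None, float("inf")
        for idx, tmpl in enumerate(templates):
            t = tmpl.ravel().astype(float)
            mismatch = 0.0
            rejected = False
            for p in pixel_order(tmpl, discrimination):
                mismatch += (s[p] - t[p]) ** 2
                if mismatch >= best_mismatch:  # comparison inequality: early exit
                    rejected = True
                    break
            if not rejected:
                best_idx, best_mismatch = idx, mismatch
        return best_idx

    # Example: classify a noisy '1' against tiny 3x3 templates of '1' and '7'.
    one = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], dtype=float)
    seven = np.array([[1, 1, 1], [0, 0, 1], [0, 0, 1]], dtype=float)
    library = [one, seven]
    # Stand-in for "discrimination power": pixelwise variance across the library,
    # largest where the templates disagree most.
    discrimination = np.var(np.stack(library), axis=0)
    noisy_one = one + 0.1 * np.random.default_rng(0).standard_normal(one.shape)
    print(accelerated_classify(noisy_one, library, discrimination))  # prints 0

Because foreground and high-conflict pixels are visited first, a poorly matching template accumulates mismatch quickly, so most of its pixels are never examined at all.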

REFERENCES:
patent: 3152318 (1964-10-01), Swift, Jr.
patent: 4467437 (1984-08-01), Tsuruta et al.
patent: 4672678 (1987-06-01), Koezuka et al.
patent: 4975974 (1990-12-01), Nishimijima
patent: 5067166 (1991-11-01), Ito
patent: 5077805 (1991-12-01), Tan
patent: 5177793 (1993-01-01), Murai et al.
patent: 5197107 (1993-03-01), Katsuyama et al.

Profile ID: LFUS-PAI-O-1946744
