Character recognition method and apparatus that re-inputs image

Image analysis – Image transformation or preprocessing – Changing the image coordinates

Patent


Details

382/318, G06K 9/03


active

057153361

ABSTRACT:
There is disclosed a character recognition method, and an apparatus therefor, capable of improving the accuracy of character recognition employing character normalization and direction index counting. The original image information is read with a resolving power matching the character size, so that the result of direction index counting is not distorted by the character size change in the normalization.
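The abstract describes counting direction indices on a character image that has been normalized to a fixed size, with the original image re-read at a resolving power matched to the character size so the counts are not distorted by rescaling. A minimal sketch of those two steps is below; it is not the patented implementation, and the helper names (`renormalize`, `direction_indices`, `NORM_SIZE`) and the nearest-neighbor resampling are illustrative assumptions — in the apparatus the resampling step would correspond to physically re-inputting the image at the matched resolution.

```python
import numpy as np

NORM_SIZE = 16  # hypothetical normalized grid size


def char_bbox(img):
    """Bounding box (y0, y1, x0, x1) of the nonzero (character) pixels."""
    ys, xs = np.nonzero(img)
    return ys.min(), ys.max(), xs.min(), xs.max()


def renormalize(img, size=NORM_SIZE):
    """Re-sample the character region onto a fixed-size grid.

    Stands in for re-inputting the image at a resolving power matched
    to the character size: sampling positions are spaced according to
    the measured character extent, not a fixed scanner resolution.
    """
    y0, y1, x0, x1 = char_bbox(img)
    crop = img[y0:y1 + 1, x0:x1 + 1]
    h, w = crop.shape
    rows = (np.arange(size) * h) // size  # nearest-neighbor sample rows
    cols = (np.arange(size) * w) // size  # nearest-neighbor sample cols
    return crop[np.ix_(rows, cols)]


def direction_indices(img):
    """Count black-pixel pairs along four directions (h, v, two diagonals)."""
    counts = {"h": 0, "v": 0, "d1": 0, "d2": 0}
    H, W = img.shape
    for y in range(H):
        for x in range(W):
            if not img[y, x]:
                continue
            if x + 1 < W and img[y, x + 1]:
                counts["h"] += 1
            if y + 1 < H and img[y + 1, x]:
                counts["v"] += 1
            if y + 1 < H and x + 1 < W and img[y + 1, x + 1]:
                counts["d1"] += 1
            if y + 1 < H and x - 1 >= 0 and img[y + 1, x - 1]:
                counts["d2"] += 1
    return counts
```

Because the sampling grid in `renormalize` is derived from the character's own bounding box, every character reaches the counting stage at the same effective resolution, which is the distortion the abstract says the re-input step avoids.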

REFERENCES:
patent: 3223973 (1965-12-01), Chatten
patent: 3273124 (1966-09-01), Greanias
patent: 3921136 (1975-11-01), Bar-Lev
patent: 4045772 (1977-08-01), Bouton et al.
patent: 4516265 (1985-05-01), Kizu et al.
patent: 4769851 (1988-09-01), Nishijima et al.
patent: 5197107 (1993-03-01), Katsuyama et al.
"Bildvorbereitung Fur Die Automatische Zeichenerkennung", J. Schurmann Wissenschaftliche Berichte Aeg Telefunken, vol.47, No.3/4, 1974, pp.90-99.
"Optimizing The Digital Learning Network For . . . " S.N. Abbas, et al., 7th European Conference on Electronics, IEEE, Apr. 1986, Paris, FR pp. 505-513.


