Method and apparatus for generating information on recognized characters

Image analysis – Image transformation or preprocessing – Changing the image coordinates

Patent


Details

382/177, 382/181, G06K 9/32

active

055090923

ABSTRACT:
Characters are recognized by a conventional OCR apparatus and converted into outline-font form. The system includes: a recognition device that optically reads printed characters and recognizes them to obtain recognized-character information consisting of text code information and character layout information; an outline font table that retains outline font data for the characters; and a character-box enlarging function that, referring to the outline font table, enlarges the enclosing rectangles of the recognized characters obtained by the recognition device by the ratio of the outline-font character box to the black-pixel component drawn in that box, and then modifies the recognized-character information by using the enlarged enclosing rectangles as the new character boxes for the outline font.
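The enlargement step in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Box` type, field names, and coordinate convention (top-left origin, width/height) are assumptions. The idea is that the OCR bounding box captures only the glyph's ink, so it is scaled up by the ratio of the font's character box (em box) to the glyph's ink extent, and offset so the ink lands where the glyph's ink sits inside the em box.

```python
from dataclasses import dataclass


@dataclass
class Box:
    """Axis-aligned rectangle: origin (x, y) plus width and height."""
    x: float
    y: float
    w: float
    h: float


def enlarge_char_box(ocr_box: Box, em_box: Box, ink_box: Box) -> Box:
    """Enlarge an OCR enclosing rectangle into a new outline-font character box.

    ocr_box -- enclosing rectangle of the recognized character's black pixels
    em_box  -- the outline font's character box, in font units
    ink_box -- extent of the glyph's black-pixel component inside em_box
    """
    # Page units per font unit, derived from the ink extents.
    px_per_unit_x = ocr_box.w / ink_box.w
    px_per_unit_y = ocr_box.h / ink_box.h

    # Scale the full character box by the em-box / ink-box ratio.
    new_w = em_box.w * px_per_unit_x
    new_h = em_box.h * px_per_unit_y

    # Shift so the scanned ink sits at the same relative position
    # that the glyph's ink occupies inside the em box.
    new_x = ocr_box.x - (ink_box.x - em_box.x) * px_per_unit_x
    new_y = ocr_box.y - (ink_box.y - em_box.y) * px_per_unit_y
    return Box(new_x, new_y, new_w, new_h)
```

For example, with a 1000-unit em box whose glyph ink spans (100, 200, 800, 600), an 80×60-pixel OCR box at (50, 50) becomes a 100×100-pixel character box at (40, 30).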

REFERENCES:
patent: 4745561 (1988-05-01), Hirosawa et al.
patent: 5093868 (1992-03-01), Tanaka et al.
patent: 5123062 (1992-06-01), Sangu
patent: 5142613 (1992-08-01), Morikawa et al.


Profile ID: LFUS-PAI-O-331711
