OCR method and apparatus using image equivalents

Image analysis – Pattern recognition – Classification

Patent


Details

382/218, 382/229; G06K 9/68, G06K 9/62, G06K 9/72


active

057647992

ABSTRACT:
An OCR system 300 stores signals representative of reference characters and scans a document 302 to generate a bit-mapped digitized image of the document. After characters and words are recognized and candidate characters are identified, the initial results are post-processed by comparing clusters of identical images to the candidates. Where the candidates of all equivalent images in a cluster are the same, the candidates are output as representative of the image on the document. Where the candidates differ, a majority vote among identical candidates determines the recognized candidates. Other post-processing operations include verification and re-recognition.
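The cluster-voting step described in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name `resolve_cluster` and the representation of a cluster as a list of per-occurrence candidate characters are assumptions made for the example.

```python
from collections import Counter

def resolve_cluster(candidates):
    """Resolve the recognized character for a cluster of equivalent images.

    `candidates` holds the initially recognized character for each
    occurrence of the same image on the document. If every occurrence
    was recognized identically, that shared candidate is output;
    otherwise a majority vote among the candidates decides.
    """
    counts = Counter(candidates)
    # Unanimous cluster: all equivalent images got the same candidate.
    if len(counts) == 1:
        return candidates[0]
    # Disagreement: the majority of identical candidates determines
    # the recognized character for every image in the cluster.
    return counts.most_common(1)[0][0]
```

For example, a cluster recognized as `['e', 'e', 'c']` would resolve to `'e'`, correcting the single misrecognized occurrence.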

REFERENCES:
patent: 4355302 (1982-10-01), Aldefeld et al.
patent: 4610025 (1986-09-01), Blum et al.
patent: 4799271 (1989-01-01), Nagasawa et al.
patent: 5278918 (1994-01-01), Bernzott et al.
patent: 5278920 (1994-01-01), Bernzott et al.
patent: 5325444 (1994-06-01), Cass et al.
patent: 5327342 (1994-07-01), Roy
patent: 5410611 (1995-04-01), Huttenlocher et al.
patent: 5519786 (1996-05-01), Courtney et al.
