Method and apparatus for implementing a weighted voting scheme f

Image analysis – Learning systems – Trainable classifiers or pattern recognizers

Patent


Details

Classifications: 382/304, 382/310, G06K 9/64

Type: Patent

Status: active

ID: 055197868

ABSTRACT:
A method and apparatus for implementing a weighted voting scheme for reading and accurately recognizing characters in a scanned image. A plurality of optical character recognition (OCR) processors scan the image and read the same image characters. Each OCR processor outputs a reported character corresponding to each character read. For a particular character read, the characters reported by the OCR processors are grouped into a set of character candidates. For each character candidate, a weight is generated in accordance with a confusion matrix, which stores the probabilities that a particular OCR processor identifies characters accurately. The weights are then compared to determine which character candidate to output.
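The voting scheme described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the patented implementation: the `vote` function, the dict-of-dicts confusion-matrix layout, and the toy probabilities are all assumptions made for the example.

```python
# Hypothetical sketch of the weighted voting scheme from the abstract.
# Each OCR processor has a confusion matrix whose entry
# confusion[reported][candidate] estimates the probability that the
# actual character is `candidate` when the processor reports `reported`.
# The layout and toy numbers below are illustrative assumptions.

def vote(reported_chars, confusion_matrices):
    """Combine characters reported by several OCR processors.

    reported_chars     -- one reported character per OCR processor
    confusion_matrices -- one dict-of-dicts per processor:
                          confusion[reported][candidate] -> probability
    Returns the candidate character with the highest total weight.
    """
    # The set of character candidates is the set of reported characters.
    candidates = set(reported_chars)
    weights = {c: 0.0 for c in candidates}
    for reported, confusion in zip(reported_chars, confusion_matrices):
        for candidate in candidates:
            # Weight each candidate by this processor's estimated
            # probability that `reported` actually means `candidate`.
            weights[candidate] += confusion.get(reported, {}).get(candidate, 0.0)
    # Compare the accumulated weights and output the winning candidate.
    return max(weights, key=weights.get)

# Toy example: two OCR processors read the same glyph; one reports 'O',
# the other '0'. The second processor is more reliable on digits, so its
# vote dominates: 0.3 + 0.9 for '0' beats 0.7 + 0.1 for 'O'.
ocr_a = {'O': {'O': 0.7, '0': 0.3}}   # OCR A often confuses O and 0
ocr_b = {'0': {'0': 0.9, 'O': 0.1}}   # OCR B is reliable on digits
print(vote(['O', '0'], [ocr_a, ocr_b]))  # -> '0'
```

The key design point, per the abstract, is that votes are not counted equally: each processor's vote is weighted by its historically measured accuracy on the character in question, as recorded in its confusion matrix.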

REFERENCES:
patent: 4032887 (1977-06-01), Roberts
patent: 4760604 (1988-07-01), Cooper et al.
patent: 4876735 (1989-10-01), Martin et al.
patent: 4958375 (1990-09-01), Reilly et al.
patent: 5054093 (1991-10-01), Cooper et al.
patent: 5257323 (1993-10-01), Melen et al.
patent: 5337371 (1994-08-01), Sato et al.

Profile ID: LFUS-PAI-O-2045701
