Character segmentation method and apparatus

Image analysis – Image segmentation – Segmenting individual characters or words

Reexamination Certificate


Details

Classification: C382S174000
Type: Reexamination Certificate
Status: active
Application number: 11004738

ABSTRACT:
An electronic device (1100) and a method (100) for character segmentation include an image analyzer (1110) that generates individual character images. The image analyzer binarizes (115) a gray-scale image (200) of a horizontal row of characters using a general threshold method to generate a first image (300). The image analyzer also binarizes (120) the gray-scale image using an edge detection method to generate a second image (405). The image analyzer determines (125) a character row region (425) of the second image by horizontal projection analysis, isolates (130) the corresponding character row region of the first image, and uses that region to generate (135) individual character images. The electronic device may include an image capture device (1105) and a character recognition program (1115).
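The pipeline the abstract describes — threshold binarization, edge-based binarization, horizontal projection to locate the character row, then vertical projection to split characters — can be sketched as follows. This is a minimal illustration with assumed threshold values and NumPy-only stand-ins (a simple gradient magnitude in place of the patent's unspecified edge detection method), not the patented implementation:

```python
import numpy as np

def binarize_threshold(gray, thresh=128):
    """General-threshold binarization (the 'first image').
    Assumes dark characters on a light background."""
    return (gray < thresh).astype(np.uint8)

def binarize_edges(gray, edge_thresh=30):
    """Edge-based binarization (the 'second image'): mark pixels whose
    intensity-gradient magnitude exceeds an assumed threshold."""
    gy, gx = np.gradient(gray.astype(float))
    return ((np.abs(gx) + np.abs(gy)) > edge_thresh).astype(np.uint8)

def character_row_region(edge_img, min_count=1):
    """Horizontal projection analysis: sum foreground pixels per row;
    rows at or above min_count are taken as the character row region."""
    proj = edge_img.sum(axis=1)
    rows = np.where(proj >= min_count)[0]
    return rows.min(), rows.max() + 1  # [top, bottom)

def segment_characters(gray):
    first = binarize_threshold(gray)          # first image (threshold)
    second = binarize_edges(gray)             # second image (edges)
    top, bottom = character_row_region(second)
    row = first[top:bottom, :]                # isolate row in first image
    col_proj = row.sum(axis=0)                # vertical projection
    # Split at empty columns (gaps between characters).
    chars, start = [], None
    for x, v in enumerate(col_proj):
        if v > 0 and start is None:
            start = x
        elif v == 0 and start is not None:
            chars.append(row[:, start:x])
            start = None
    if start is not None:
        chars.append(row[:, start:])
    return chars
```

Using the edge image only to locate the row, but the threshold image to cut out the characters, mirrors the two-binarization division of labor described in the abstract.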

REFERENCES:
patent: 5384864 (1995-01-01), Spitz
patent: 6327384 (2001-12-01), Hirao et al.
patent: 6449391 (2002-09-01), Ku
patent: 6473517 (2002-10-01), Tyan et al.
patent: 6909805 (2005-06-01), Ma et al.
