Apparatus and method for processing images, recording...

Image analysis – Pattern recognition – Feature extraction

Reexamination Certificate


Details

Classification: C382S299000, C382S266000

Type: Reexamination Certificate

Status: active

Application number: 10779629

ABSTRACT:
An image input section outputs the size of an image to a division-number setting section and outputs the image itself to an edge enhancing section, which enhances the edges of the image. An edge extracting section extracts the edges, an edge evaluating section checks whether each pixel belongs to an edge, and an edge counting section outputs the frequencies of the edges. A DFT section applies a Fourier transform to the edge frequencies and outputs the results as power spectra, and a peak extracting section outputs the spatial frequencies at the peaks of the power spectra. The division-number setting section determines the number of divisions from these spatial frequencies, an image dividing section divides the image into the determined number of blocks, and a colored-image output section assigns a particular pixel value to all pixels in each block.

REFERENCES:
patent: 6798542 (2004-09-01), Kimura et al.
patent: JP 11-142117 (1999-05-01), None
patent: JP 2001-082954 (2001-03-01), None
patent: JP 2004-022106 (2004-01-01), None


Profile ID: LFUS-PAI-O-3939884
