Apparatus and method for categorizing image and related...

Image analysis – Image enhancement or restoration – Edge or contour enhancement


Details

US Classification: C382S275000, C382S300000

Type: Reexamination Certificate

Status: active

Patent Number: 07933467

ABSTRACT:
The present invention provides an apparatus and a method for de-interlacing. The apparatus includes an edge detection module, a statistics module, and an interpolation circuit. The edge detection module performs an edge detection operation on a plurality of pixels of an image to generate edge information corresponding to the image. The statistics module performs a detection-window-based statistics operation on the edge information to generate statistics information corresponding to the image. The interpolation circuit interpolates the image according to the statistics information to generate an intra-field interpolation signal corresponding to the image.
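
The abstract describes a three-stage pipeline: per-pixel edge detection, window-based statistics over the edge information, and statistics-guided intra-field interpolation. The following is a minimal Python/NumPy sketch of what such a pipeline could look like, assuming a directional-difference edge detector, a majority vote over a horizontal detection window as the statistics operation, and edge-directed line averaging as the interpolator. All function names, the window width, and the choice of three candidate edge directions are illustrative assumptions, not details taken from the patent.

import numpy as np

def edge_detection(field: np.ndarray) -> np.ndarray:
    # Per-pixel edge-direction estimate between adjacent field lines.
    # Returns -1 (edge leaning left), 0 (vertical), or +1 (edge leaning
    # right), picked by the smallest directional difference. Boundaries
    # wrap via np.roll in this sketch.
    up = field[:-1].astype(np.int32)
    down = field[1:].astype(np.int32)
    d_left = np.abs(np.roll(up, 1, axis=1) - np.roll(down, -1, axis=1))
    d_vert = np.abs(up - down)
    d_right = np.abs(np.roll(up, -1, axis=1) - np.roll(down, 1, axis=1))
    return np.argmin(np.stack([d_left, d_vert, d_right]), axis=0) - 1

def window_statistics(edges: np.ndarray, win: int = 5) -> np.ndarray:
    # Detection-window-based statistics (an assumed form): majority vote
    # of edge directions inside a horizontal window of width `win`.
    pad = win // 2
    padded = np.pad(edges, ((0, 0), (pad, pad)), mode="edge")
    out = np.empty_like(edges)
    for x in range(edges.shape[1]):
        window = padded[:, x:x + win]
        counts = np.stack([(window == d).sum(axis=1) for d in (-1, 0, 1)])
        out[:, x] = np.argmax(counts, axis=0) - 1
    return out

def intra_field_interpolate(field: np.ndarray, stats: np.ndarray) -> np.ndarray:
    # Fill the missing lines by averaging along the statistically chosen
    # edge direction, yielding a spatial-only (intra-field) frame.
    h, w = field.shape
    frame = np.zeros((2 * h - 1, w), dtype=field.dtype)
    frame[0::2] = field  # existing field lines
    up = field[:-1].astype(np.int32)
    down = field[1:].astype(np.int32)
    interp = np.empty_like(up)
    for d in (-1, 0, 1):
        mask = stats == d
        avg = (np.roll(up, -d, axis=1) + np.roll(down, d, axis=1)) // 2
        interp[mask] = avg[mask]
    frame[1::2] = interp.astype(field.dtype)
    return frame

# Usage on a toy 4x8 field:
field = np.arange(32, dtype=np.uint8).reshape(4, 8) * 7
edges = edge_detection(field)
stats = window_statistics(edges, win=5)
print(intra_field_interpolate(field, stats))

The majority vote over a window is one plausible reading of the "statistics operation": it suppresses isolated, noisy edge-direction estimates before they steer the interpolator, which is the usual motivation for a statistics stage in edge-directed de-interlacing.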

REFERENCES:
patent: US 5,029,108 (Jul. 1991), Lung
patent: US 6,614,484 (Sep. 2003), Lim et al.
patent: US 6,799,168 (Sep. 2004), He
patent: US 2003/0193486 (Oct. 2003), Estrop
patent: US 2004/0075764 (Apr. 2004), Law
patent: US 2007/0070244 (Mar. 2007), Wyman et al.
patent: 2000-224551 (Aug. 2000)
patent: 2005-208613 (Aug. 2005)


Profile ID: LFUS-PAI-O-2642414
