De-interlacing algorithm responsive to edge pattern

Television – Format conversion – Line doublers type


Details

Type: Reexamination Certificate
Status: active
Application number: 10898526

ABSTRACT:
There is provided a de-interlacing method including: receiving first and third image lines; selecting an upper vector of N pixels from the first image line and a lower vector of N pixels from the third image line; obtaining a weighted value on the basis of the relation between the pixels within the upper vector and the relation between the pixels within the lower vector; selecting an edge direction using the upper vector, the lower vector, and the weighted value; and interpolating a pixel of a second image line, located between the first and third image lines, on the basis of the selected edge direction.
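
The abstract describes an edge-adaptive interpolation loop: match short pixel vectors from the lines above and below across candidate edge directions, weight the match by the pixel relations inside each vector, and average along the winning direction. Below is a minimal Python sketch of such a loop. The sum-of-absolute-differences match and the gradient-activity weight are stand-in assumptions, not the weighting defined in the claims, and the names interpolate_line, n, and max_shift are illustrative rather than taken from the patent.

    import numpy as np

    def interpolate_line(upper, lower, n=5, max_shift=2):
        """Interpolate the missing second line between two known lines.

        upper, lower : 1-D arrays, the first and third image lines
        n            : pixels per matching vector (assumed odd)
        max_shift    : largest candidate edge offset, in pixels
        """
        width = len(upper)
        half = n // 2
        pad = half + max_shift
        up = np.pad(np.asarray(upper, dtype=float), pad, mode="edge")
        lo = np.pad(np.asarray(lower, dtype=float), pad, mode="edge")
        out = np.empty(width)

        for x in range(width):
            c = x + pad                       # position of pixel x after padding
            best_cost = np.inf
            best_val = 0.5 * (up[c] + lo[c])  # fall back to plain line average
            for d in range(-max_shift, max_shift + 1):
                # The upper vector shifted by +d and the lower vector shifted
                # by -d straddle a candidate edge of slope d through the
                # missing pixel.
                uv = up[c + d - half : c + d + half + 1]
                lv = lo[c - d - half : c - d + half + 1]
                # Stand-in weight: directions that cut across busy
                # (high-gradient) vectors are penalized, favoring coherent
                # edge patterns.
                w = 1.0 + np.abs(np.diff(uv)).sum() + np.abs(np.diff(lv)).sum()
                cost = w * np.abs(uv - lv).sum()
                if cost < best_cost:
                    best_cost = cost
                    best_val = 0.5 * (up[c + d] + lo[c - d])
            out[x] = best_val
        return out

    # Example: a diagonal edge is continued rather than stair-stepped.
    first = np.array([0, 0, 0, 255, 255, 255, 255, 255], dtype=float)
    third = np.array([0, 0, 0, 0, 0, 255, 255, 255], dtype=float)
    print(interpolate_line(first, third))

On flat regions every candidate direction matches equally well and the result degenerates to plain line averaging; on the diagonal edge in the example, a shifted vector pair wins and the interpolated line places the edge between the positions it occupies in the lines above and below.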

REFERENCES:
patent: 5682205 (1997-10-01), Sezan et al.
patent: 5786862 (1998-07-01), Kim et al.
patent: 6606126 (2003-08-01), Lim et al.
patent: 6909752 (2005-06-01), Zhou
patent: 6999128 (2006-02-01), Kasahara et al.
patent: 7023487 (2006-04-01), Adams
patent: 7035481 (2006-04-01), Kim et al.
patent: 2002/0047930 (2002-04-01), Zhou
patent: 2004/0135926 (2004-07-01), Song et al.
patent: 0 687 104 (1995-12-01), None


Profile ID: LFUS-PAI-O-3723960
