Apparatus and method for motion-vector-aided interpolation...

Image analysis – Image compression or coding – Interframe coding

Reexamination Certificate


Details

Classification codes: C382S300000, C348S400100, C348S421100, C375S240160

Type: Reexamination Certificate

Status: active

Patent number: 07620254

ABSTRACT:
A method for motion-vector-aided interpolation of a pixel of an intermediate image lying between two input images. Using a first motion vector, a first pixel is selected from a first field and a second pixel from a second field; using a second motion vector, a third pixel is selected from the first field and a fourth pixel from the second field. An interval is then determined, specified either by the video information values of the first and second pixels or by those of the third and fourth pixels, and the video information values are mixed such that the video information value of the pixel to be interpolated lies within this interval.
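The clamping step described in the abstract can be sketched as follows. This is a minimal illustration, not the patented method: the function name, the equal mixing weights, and the choice of the first pixel pair as the bounding interval are all assumptions made for the example.

```python
# Illustrative sketch of interval-bounded mixing of motion-compensated
# samples. Weights and the choice of bounding pair are assumptions.

def interpolate_pixel(p1, p2, p3, p4, weights=(0.25, 0.25, 0.25, 0.25)):
    """Mix four motion-compensated pixel values, then clamp the result
    to the interval spanned by one candidate pair so the interpolated
    value cannot over- or undershoot that pair."""
    # Linear mix of the four samples selected via the two motion vectors.
    mixed = sum(w * p for w, p in zip(weights, (p1, p2, p3, p4)))
    # Interval from the first motion vector's pixel pair (p1, p2);
    # per the abstract, either pair may define the interval.
    lo, hi = min(p1, p2), max(p1, p2)
    # Clamp the mixed value into [lo, hi].
    return max(lo, min(hi, mixed))
```

For example, if the second motion vector points at an outlier sample, the mix is pulled outside the first pair's interval and the clamp limits the interpolated value to the interval boundary.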

REFERENCES:
patent: 4383272 (1983-05-01), Netravali et al.
patent: 5005077 (1991-04-01), Samad et al.
patent: 5386248 (1995-01-01), Haan et al.
patent: 5446497 (1995-08-01), Keating et al.
patent: 5526053 (1996-06-01), Dorricott et al.
patent: 6385245 (2002-05-01), De Haan et al.
patent: 7068325 (2006-06-01), Gengintani et al.
patent: 2005/0036066 (2005-02-01), Hahn et al.
patent: 2006/0133508 (2006-06-01), Sekiguchi et al.
patent: 2007/0153900 (2007-07-01), Koto et al.
patent: 2008/0084930 (2008-04-01), Sekiguchi et al.
patent: 2009/0135914 (2009-05-01), Sato et al.
patent: 693 15 626 (1993-05-01), None
ISO/IEC 13818-2, ITU-T Recommendation H.262, 1995.
de Haan, "Signalverarbeitungstechniken zur Verbesserung der Bilddarstellung" [Signal Processing Techniques for Improving Image Reproduction], 2002.
de Haan et al., "Graceful Degradation in Motion-Compensated Field-Rate Conversion", Proceedings of the International Workshop on HDTV, pp. 249-256, 1993.
Flierl et al., "Rate-Constrained Multi-Hypothesis Motion-Compensated Prediction for Video Coding", IEEE, vol. 3, pp. 150-153, 2000.
Franzen et al., "Nichtlineare Polyphaseninterpolation von Zwischenbildern" [Nonlinear Polyphase Interpolation of Intermediate Images].
Franzen et al., "Intermediate Image Interpolation using Polyphase Weighted Median Filters", Proc. SPIE, vol. 4304, pp. 306-317, 2001.
Ojo et al., "Robust Motion-Compensated Video Upconversion", IEEE Transactions on Consumer Electronics, vol. 43, no. 4, pp. 1045-1056, 1997.
Pelagotti et al., "High Quality Video on MultiMedia PCs", IEEE, vol. 2, pp. 872-876, 1999.
Piron, "A Temporal Mode Selection in the MPEG-2 Encoder Scheme."
Kawaguchi et al., "Frame Rate Up-Conversion Considering Multiple Motion", Proceedings of the International Conference on Image Processing, Santa Barbara, CA, USA, Oct. 26-29, 1997, pp. 727-730.
Blume, H., "Nonlinear Vector Error Tolerant Interpolation of Intermediate Video Images by Weighted Medians: Deterministic Properties", Signal Processing: Image Communication, vol. 14, no. 10, pp. 851-868, Aug. 1999.
Choi, B.-T. et al., "New Frame Rate Up-Conversion Using Bi-Directional Motion Estimation", IEEE Transactions on Consumer Electronics, vol. 46, no. 3, pp. 603-609, Aug. 2000.
De Haan et al., "Robust Motion-Compensated Video Upconversion".
