Video coding/decoding system and video coder and video decoder u

Pulse or digital communications – Bandwidth reduction or expansion – Television or motion video signal

Patent


Details

348/416, 348/699, H04N 7/36

Patent

active

061342718

ABSTRACT:
A method and apparatus for video coding and decoding. The method and apparatus calculate the motion vectors of the vertices of a patch in an image being encoded, and output both the vertex motion-vector information and information specifying that the horizontal and vertical components of the motion vector for each pixel in the patch are integer multiples of 1/d of the distance between adjacent pixels, where d is an integer not less than 2.
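The abstract, together with the cited triangle-based motion-compensation literature, suggests deriving each pixel's motion vector by interpolating the motion vectors of the patch vertices and snapping the result to a 1/d sub-pixel grid. The sketch below illustrates one plausible reading of this scheme for a triangular patch, using barycentric interpolation; the function names and the choice of barycentric weights are illustrative assumptions, not the patent's actual claimed implementation.

```python
def barycentric(p, a, b, c):
    """Barycentric weights of point p with respect to triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    denom = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / denom
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / denom
    return wa, wb, 1.0 - wa - wb


def pixel_motion_vector(p, verts, vert_mvs, d=4):
    """Interpolate a per-pixel motion vector from three vertex motion
    vectors, then quantize each component to an integer multiple of 1/d
    of the inter-pixel distance (d >= 2, as in the abstract).

    verts:    ((ax, ay), (bx, by), (cx, cy)) -- patch vertex positions
    vert_mvs: ((vx, vy), ...)                -- motion vector per vertex
    """
    wa, wb, wc = barycentric(p, *verts)
    mvx = wa * vert_mvs[0][0] + wb * vert_mvs[1][0] + wc * vert_mvs[2][0]
    mvy = wa * vert_mvs[0][1] + wb * vert_mvs[1][1] + wc * vert_mvs[2][1]
    # Snap to the nearest multiple of 1/d (sub-pixel accuracy grid).
    return round(mvx * d) / d, round(mvy * d) / d
```

For example, a pixel halfway along the edge between two vertices with motion vectors (1, 0) and (0, 1) receives the interpolated vector (0.5, 0.5), which already lies on the half-pixel grid when d = 2.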

REFERENCES:
patent: 5016102 (1991-05-01), Avis
patent: 5025482 (1991-06-01), Murakami et al.
patent: 5295201 (1994-03-01), Yokohama
"A Basic Study on Motion Compensation with Triangles", Nakaya et al, Technical Report of IEICE, IE90-106, 1990, pp. 9-16.
"Motion Compensation for Video Compression Using Control Grid Interpolation", Sullivan et al, Proc. ICASSP 1991, M9.1, pp. 2713-2716.
"General Approach to Block-Matching Motion Estimation", Seferidis et al, Optical Engineering, Jul. 1993, vol. 32, No. 7, pp. 1464-1474.


Profile ID: LFUS-PAI-O-476691
