Method and system for bidirectional motion compensation for comp

Television – Image signal processing circuitry specific to television – Motion vector generation

Patent


Details

Classification: 348/402, 348/407, 348/413, 348/416; H04N 7/36, H04N 7/50

Type: Patent

Status: active

Patent number: 056991282

ABSTRACT:
A motion compensation processor reads a small region at the center portion of a forward reference region and a small region at the center portion of a backward reference region preliminarily assigned thereto and determines a coding type. Another motion compensation processor reads a small region at the left side of the forward reference region and a small region at the right side of the backward reference region and determines a coding type. A further motion compensation processor reads a small region at the right side of the forward reference region and a small region at the left side of the backward reference region and determines a coding type. This parallel processing makes it possible to obtain a prediction block having high similarity to the current block more efficiently than the prior art and facilitates realization of a bidirectional prediction system.
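The abstract describes each processor comparing its assigned forward and backward small regions against the current block and choosing a coding type. A minimal sketch of that per-processor decision, assuming a sum-of-absolute-differences (SAD) matching cost and hypothetical names (`sad`, `choose_coding_type` do not appear in the patent):

```python
def sad(a, b):
    """Sum of absolute differences between two equally sized 2-D blocks."""
    return sum(abs(x - y) for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def choose_coding_type(current, fwd_region, bwd_region):
    """Pick the cheapest prediction for one processor's assigned regions:
    forward-only, backward-only, or bidirectional (pixel-wise average)."""
    avg = [[(f + b) // 2 for f, b in zip(rf, rb)]
           for rf, rb in zip(fwd_region, bwd_region)]
    costs = {
        "forward": sad(current, fwd_region),
        "backward": sad(current, bwd_region),
        "bidirectional": sad(current, avg),
    }
    mode = min(costs, key=costs.get)
    return mode, costs[mode]
```

In the scheme described above, three such processors would run this decision in parallel on their respective small regions (center/center, left/right, right/left), and the candidate with the lowest overall cost would supply the prediction block.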

REFERENCES:
patent: 5012336 (1991-04-01), Gillard
patent: 5132792 (1992-07-01), Yonemitsu
patent: 5227878 (1993-07-01), Puri


Profile ID: LFUS-PAI-O-211698
