Method for motion estimation using a low-bit edge image

Pulse or digital communications – Bandwidth reduction or expansion – Television or motion video signal

Reexamination Certificate


Details

Classification: C382S268000
Type: Reexamination Certificate
Status: active
Patent number: 07151798

ABSTRACT:
The invention provides a motion estimation method that operates on a low-bit-resolution edge image instead of the luminance image. Matching edge images yields difference blocks with small AC coefficients: the edge image, produced by filtering, improves encoding quality, while the low bit resolution reduces computation cost. The invention further provides a motion estimation method using a low-bit-resolution oriented edge image. Using oriented edges in motion estimation produces flatter image blocks, which is what a texture-compression unit (such as the DCT) demands; as a result, encoding efficiency is improved and the computational load is reduced by the low bit resolution.
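The general idea behind the abstract can be sketched as follows. This is a minimal illustration, not the patented algorithm: the gradient filter, the 1-bit quantization threshold, and the full-search matching are all assumed placeholder choices. It shows why a low-bit edge representation cuts the matching cost: on binary images, the sum of absolute differences reduces to a popcount of an XOR.

```python
import numpy as np

def edge_image(lum, threshold=16):
    """Quantize a gradient-magnitude edge map of an 8-bit luminance
    frame down to 1 bit. The first-difference filters and the
    threshold are illustrative choices, not the patent's."""
    lum = lum.astype(np.int32)
    gx = np.zeros_like(lum)
    gy = np.zeros_like(lum)
    gx[:, 1:] = lum[:, 1:] - lum[:, :-1]   # horizontal first difference
    gy[1:, :] = lum[1:, :] - lum[:-1, :]   # vertical first difference
    mag = np.abs(gx) + np.abs(gy)
    return (mag > threshold).astype(np.uint8)  # 1-bit edge image

def motion_search(ref_edges, cur_edges, block_xy, block=8, radius=4):
    """Full-search block matching on 1-bit edge images.
    On binary data the SAD cost is just the number of differing
    pixels, i.e. a popcount of the XOR of the two blocks."""
    y0, x0 = block_xy
    cur = cur_edges[y0:y0 + block, x0:x0 + block]
    best, best_cost = (0, 0), None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = y0 + dy, x0 + dx
            if (y < 0 or x < 0
                    or y + block > ref_edges.shape[0]
                    or x + block > ref_edges.shape[1]):
                continue  # candidate block falls outside the frame
            cand = ref_edges[y:y + block, x:x + block]
            cost = int(np.count_nonzero(cur ^ cand))  # XOR + popcount SAD
            if best_cost is None or cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best, best_cost
```

Because matching happens on the edge images, a block that tracks the same edge pattern in the reference frame produces a near-zero difference block, which is what keeps the AC coefficients small for the downstream transform stage.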

REFERENCES:
patent: 5512956 (1996-04-01), Yan
patent: 6370279 (2002-04-01), Paik
patent: 6961466 (2005-11-01), Imagawa et al.
patent: 2003/0142750 (2003-07-01), Oguz et al.
patent: 2004/0066964 (2004-04-01), Neubauer et al.


Profile ID: LFUS-PAI-O-3681361
