Method of chroma-keying for a digital video compression system

Image analysis – Color image processing – Compression of color images

Patent


Details

U.S. Classes: 348/587, 348/592; International Class: H04N 9/75

Type: Patent

Status: active

Patent number: 060849827

ABSTRACT:
A method of performing chroma-key coding comprising the steps of: defining color regions and counting the pixels that fall within those regions; classifying macroblocks according to the pixel counts so obtained; assigning chroma complexity weights, based on the macroblock classification, for use in computing the quantization step size; computing the quantization step size; performing quantization; and performing variable-length coding.
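The steps in the abstract can be sketched as follows. This is a minimal illustration only, ending at the quantization-step computation: the key color, tolerance, classification thresholds, and weight values are all hypothetical assumptions (the abstract does not specify them), and the quantization and variable-length-coding stages are omitted.

```python
# Illustrative sketch of the abstract's steps; thresholds and weights are
# assumptions, not values from the patent.
import numpy as np

KEY_U, KEY_V = 110, 240  # assumed key color in YUV chroma (blue-screen-like)
TOLERANCE = 20           # assumed half-width of the key color region

def count_key_pixels(u_block, v_block):
    """Step 1: define the color region and count the pixels inside it."""
    in_region = (np.abs(u_block - KEY_U) <= TOLERANCE) & \
                (np.abs(v_block - KEY_V) <= TOLERANCE)
    return int(in_region.sum())

def classify_macroblock(key_pixels, total_pixels):
    """Step 2: classify the macroblock by its fraction of key-colored pixels."""
    frac = key_pixels / total_pixels
    if frac > 0.9:
        return "key"         # almost entirely keyed background
    if frac > 0.1:
        return "boundary"    # mixed foreground/background edge
    return "foreground"

# Step 3: assumed chroma complexity weights per class; boundary blocks get a
# finer (smaller) quantizer step to preserve the keying edge, pure-key blocks
# a coarser one, since they carry little visible detail after keying.
CHROMA_WEIGHTS = {"key": 1.5, "boundary": 0.5, "foreground": 1.0}

def quantization_step(base_step, mb_class):
    """Step 4: scale the base quantizer step by the class weight."""
    return base_step * CHROMA_WEIGHTS[mb_class]

# Example: the 8x8 chroma samples of a macroblock that is pure key color.
u = np.full((8, 8), KEY_U)
v = np.full((8, 8), KEY_V)
n = count_key_pixels(u, v)                # all 64 chroma samples are in-region
mb_class = classify_macroblock(n, u.size)
step = quantization_step(4.0, mb_class)   # coarser step for a pure-key block
```

A real encoder would then quantize the macroblock's transform coefficients with this step size and pass the result to the variable-length coder, as the abstract's final two steps describe.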

REFERENCES:
patent: 5621466 (1997-04-01), Miyane et al.
patent: 5835158 (1998-11-01), Lowe
Horne et al., "Study of the Characteristics of the MPEG2 4:2:2 Profile--Application of MPEG2 in Studio Environment", IEEE Transactions on Circuits and Systems for Video Technology, vol. 6, No. 3, pp. 251-272, Jun. 1996.
Devereux, V.G., "Television Animation Store: Digital Chroma-Key and Mixer Units", BBC Research and Development Report, 1984.
Misawa, Z. et al., "A Proposed Computer-Controlled Digital HDTV Chroma-Key System", SMPTE Journal, Mar. 1995.

