Shared logic for decoding and deinterlacing of compressed video

Pulse or digital communications – Bandwidth reduction or expansion – Television or motion video signal


Details

Class: C375S240030
Type: Reexamination Certificate
Status: active
Patent number: 07965770

ABSTRACT:
One embodiment includes a method that includes receiving a compressed video stream, decoding a number of blocks of the compressed video stream to output a number of blocks of decoded video data, and deinterlacing those blocks to output deinterlaced video data. The decoding is based on at least one motion compensation vector. The deinterlacing of a given block is also based on that motion compensation vector if the prediction error energy for the vector for that block is less than a threshold.
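
The gating step the abstract describes, reusing the decoder's motion compensation vector for deinterlacing only when its prediction error energy falls below a threshold, can be sketched in a few lines. The following Python sketch is illustrative only: the energy measure (sum of squared residual samples), the threshold value, the line-averaging fallback, and every function name are assumptions, since the abstract specifies nothing beyond the comparison itself.

```python
"""Illustrative sketch of threshold-gated motion-vector reuse.

All names, the energy measure, and the threshold value are assumptions;
the abstract only requires comparing a prediction error energy against
a threshold before reusing a decoder motion vector for deinterlacing.
"""
import numpy as np

ERROR_ENERGY_THRESHOLD = 100.0  # assumed value; the abstract leaves it unspecified


def prediction_error_energy(residual: np.ndarray) -> float:
    """Sum of squared residual samples, one common block energy measure."""
    return float(np.sum(residual.astype(np.float64) ** 2))


def spatial_deinterlace(field: np.ndarray) -> np.ndarray:
    """Intra-field fallback: synthesize missing lines by averaging neighbors."""
    out = np.repeat(field, 2, axis=0).astype(np.float64)  # duplicate each line
    out[1:-1:2] = (out[0:-2:2] + out[2::2]) / 2.0         # replace duplicates with averages
    return out


def mc_deinterlace(field: np.ndarray, reference: np.ndarray,
                   mv: tuple[int, int]) -> np.ndarray:
    """Motion-compensated fill: missing lines come from a shifted reference frame."""
    dy, dx = mv
    out = np.repeat(field, 2, axis=0).astype(np.float64)
    shifted = np.roll(np.roll(reference, dy, axis=0), dx, axis=1)
    out[1::2] = shifted[1::2]  # take missing lines from the compensated reference
    return out


def deinterlace_block(field, reference, mv, residual):
    """Reuse the decoder's motion vector only when the residual energy is low."""
    if prediction_error_energy(residual) < ERROR_ENERGY_THRESHOLD:
        return mc_deinterlace(field, reference, mv)
    return spatial_deinterlace(field)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    field = rng.integers(0, 256, (4, 8))      # one 4-line field of an 8-line frame
    reference = rng.integers(0, 256, (8, 8))  # previously decoded full frame
    residual = rng.normal(0.0, 1.0, (8, 8))   # low energy -> the vector is trusted
    print(deinterlace_block(field, reference, (1, 0), residual).shape)  # (8, 8)
```

The apparent payoff of this arrangement, reflected in the "shared logic" of the title, is that the deinterlacer can skip its own motion estimation for blocks where the decoder's vector already tracks the motion well, falling back to intra-field interpolation only when the residual energy suggests the vector is unreliable.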

REFERENCES:
patent: 6269484 (2001-07-01), Simsic et al.
patent: 6968007 (2005-11-01), Barrau
patent: 7203234 (2007-04-01), Zeng
patent: 2001/0002205 (2001-05-01), Beattie
patent: 2005/0175099 (2005-08-01), Sarkijarvi et al.
Johnson et al., Frequency Scalable Video Coding Using MDCT, IEEE, pp. V-477–V-480, 1994.


