Progressive video detection with aggregated block SADS

Television – Format conversion – Line doublers type

Reexamination Certificate

Details

U.S. Classification: C348S701000, C348S558000, C348S448000, C348S452000, C348S459000

Type: Reexamination Certificate

Status: active

Patent Number: 07612828

ABSTRACT:
A method for detecting progressive material in a video sequence is disclosed. The method generally includes the steps of (A) calculating a plurality of block statistics for each of a plurality of blocks in a current field of the video sequence, (B) calculating a plurality of field statistics by summing the block statistics over all of the blocks in the current field, (C) calculating a noise level for the current field based on a subset of the block statistics from each of the blocks and (D) generating a mode flag for the current field based on both (i) the field statistics and (ii) the noise level, wherein the mode flag identifies if the current field is part of a 2:2 pull-down pattern.
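
The four steps above form a simple statistic-aggregation pipeline: per-block SADs (sums of absolute differences), a field-level sum, a noise-floor estimate, and a threshold decision. The Python sketch below illustrates that flow under stated assumptions: 8x8 blocks, SADs taken against the previous field, a noise estimate from the quietest quarter of blocks, and an illustrative gain threshold. None of the names, parameters, or thresholds come from the patent itself.

```python
import numpy as np

BLOCK = 8  # assumed block size; the abstract does not specify one


def block_sads(cur_field, prev_field, block=BLOCK):
    """Step (A): one SAD statistic per block of the current field."""
    h, w = cur_field.shape
    diff = np.abs(cur_field.astype(np.int32) - prev_field.astype(np.int32))
    diff = diff[: h - h % block, : w - w % block]  # drop partial edge blocks
    return diff.reshape(h // block, block, w // block, block).sum(axis=(1, 3))


def field_statistic(sads):
    """Step (B): sum the block statistics over all blocks in the field."""
    return int(sads.sum())


def noise_level(sads, fraction=0.25):
    """Step (C): estimate noise from a subset of the block statistics.
    Assumption: the mean SAD of the quietest 25% of blocks approximates
    the noise floor, since those blocks carry little or no motion."""
    flat = np.sort(sads.ravel())
    keep = max(1, int(fraction * flat.size))
    return float(flat[:keep].mean())


def mode_flag(field_stat, noise, num_blocks, gain=4.0):
    """Step (D): flag the field as progressive (part of a 2:2 pattern)
    when its aggregate motion stays near the noise floor. The gain
    factor is an illustrative threshold, not the patent's value."""
    return field_stat < gain * noise * num_blocks


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cur = rng.integers(0, 256, (240, 352), dtype=np.uint8)
    # A field pair drawn from the same progressive frame differs only by noise.
    prev = np.clip(cur.astype(np.int32) + rng.integers(-2, 3, cur.shape), 0, 255)
    sads = block_sads(cur, prev)
    print(mode_flag(field_statistic(sads), noise_level(sads), sads.size))  # True
```

In a 2:2 pull-down sequence the field statistic alternates between a noise-only value (fields drawn from the same frame) and a motion value (fields straddling a frame boundary), so a production detector would apply this per-field computation over a window of fields and look for that alternation before committing to a mode flag.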

REFERENCES:
patent: 7468756 (2008-12-01), Wyman
patent: 2007/0002169 (2007-01-01), Munsil et al.
Yunwei Jia et al., U.S. Appl. No. 11/272,300, filed Nov. 10, 2005.
Yunwei Jia et al., U.S. Appl. No. 11/314,631, filed Dec. 20, 2005.
Yunwei Jia et al., U.S. Appl. No. 11/343,119, filed Jan. 30, 2006.
