Method and apparatus for detecting scene-cuts in a block-based v

Television – Bandwidth reduction system – Data rate reduction

Patent


Details

U.S. Class: 348/700; International Class: H04N 7/34

Status: active

Patent Number: 057241007

ABSTRACT:
The scene-cut detector compares predicted macroblocks from an anchor image to input macroblocks from an input image on a macroblock-by-macroblock basis, generating a residual macroblock that represents the difference between each predicted macroblock and the corresponding input macroblock. After each comparison, a variance is computed for the residual macroblock and for the input macroblock. The residual variance is then compared to the input macroblock variance; whenever the variance of the residual macroblock exceeds the variance of the input macroblock, a counter is incremented. The scene-cut detector repeats this process until every macroblock in the predicted image has been compared to its corresponding input macroblock. If the count ever exceeds a threshold level while an input image is being processed, the scene-cut detector sets a scene-cut indicator flag.
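The variance-comparison loop described in the abstract can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes grayscale frames stored as NumPy arrays, and the function name `detect_scene_cut` along with the `mb_size` and `count_threshold` parameters are hypothetical choices made for the example.

```python
import numpy as np

def detect_scene_cut(input_frame, predicted_frame, mb_size=16, count_threshold=100):
    """Flag a scene cut when, for enough macroblocks, the residual
    (input - predicted) has higher variance than the input macroblock.

    input_frame, predicted_frame: 2-D grayscale arrays of equal shape.
    mb_size: macroblock edge length in pixels (16 is typical).
    count_threshold: number of "residual worse than input" macroblocks
    that must be exceeded before the scene-cut flag is set.
    """
    h, w = input_frame.shape
    count = 0
    for y in range(0, h, mb_size):
        for x in range(0, w, mb_size):
            inp = input_frame[y:y + mb_size, x:x + mb_size].astype(np.float64)
            pred = predicted_frame[y:y + mb_size, x:x + mb_size].astype(np.float64)
            residual = inp - pred  # prediction error for this macroblock
            # Increment the counter when prediction fails to reduce variance.
            if residual.var() > inp.var():
                count += 1
    # Scene-cut indicator flag: count exceeded the threshold for this image.
    return count > count_threshold
```

When the predicted frame closely matches the input, residual variances stay small and the counter rarely increments; when the content changes abruptly, prediction from the anchor image fails for most macroblocks and the flag is set.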

REFERENCES:
patent: 5377051 (1994-12-01), Lane et al.
patent: 5404174 (1995-04-01), Sugahara
patent: 5459517 (1995-10-01), Kunitake et al.
patent: 5493345 (1996-02-01), Ishikawa et al.
patent: 5532746 (1996-07-01), Chang


Profile ID: LFUS-PAI-O-2253155
