Video coding and decoding method using weighted prediction...

Pulse or digital communications – Bandwidth reduction or expansion


Details

Type: Reexamination Certificate
Status: active
Patent number: 08005137

ABSTRACT:
A video coding and decoding method using weighted prediction, and an apparatus for the same, are provided. The video coding method includes generating a predicted image for a present block; generating a weighted prediction factor, which is a scaling factor of the predicted image chosen to minimize the difference between the present block and the predicted image; generating a weighted prediction image by multiplying the predicted image by the weighted prediction factor; and coding a residual signal generated by subtracting the weighted prediction image from the present block.
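The steps listed in the abstract amount to fitting a single gain to the predicted block and coding whatever the scaled prediction fails to explain. The sketch below, in Python with NumPy, shows one common way such a factor can be obtained (a least-squares fit of the gain); it is only an illustration of the idea and not the claimed algorithm, and the names weighted_prediction_residual, block, and prediction are placeholders introduced here.

```python
import numpy as np

def weighted_prediction_residual(block: np.ndarray, prediction: np.ndarray):
    """Illustrative sketch: choose a single scaling factor w that minimizes
    ||block - w * prediction||^2, form the weighted prediction image, and
    return the residual that would then be coded."""
    block = block.astype(np.float64)
    prediction = prediction.astype(np.float64)

    denom = np.sum(prediction * prediction)
    # Least-squares weighted prediction factor; fall back to 1.0 if the
    # prediction is all zeros.
    w = np.sum(block * prediction) / denom if denom > 0 else 1.0

    weighted_pred = w * prediction      # weighted prediction image
    residual = block - weighted_pred    # residual signal to be coded
    return w, residual

# Example: a 4x4 block whose prediction differs only by a constant gain
pred = np.arange(16, dtype=np.float64).reshape(4, 4)
cur = 0.8 * pred
w, res = weighted_prediction_residual(cur, pred)
print(w)                  # ~0.8
print(np.abs(res).sum())  # ~0, the residual nearly vanishes
```

A decoder following the same convention would reconstruct the block by scaling its own prediction by the signaled factor and adding the decoded residual.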

REFERENCES:
patent: 6081551 (2000-06-01), Etoh
patent: 6141379 (2000-10-01), Sugiyama
patent: 6381275 (2002-04-01), Fukuhara et al.
patent: 6788740 (2004-09-01), van der Schaar et al.
patent: 2002/0090138 (2002-07-01), Hamanaka
patent: 2003/0156638 (2003-08-01), Van Der Schaar
patent: 2004/0086043 (2004-05-01), Ito et al.
patent: 2004/0246373 (2004-12-01), Kadono et al.
patent: 2005/0195896 (2005-09-01), Huang et al.
patent: 2005/0207496 (2005-09-01), Komiya et al.
patent: 2005/0220192 (2005-10-01), Huang et al.
patent: 2005/0259736 (2005-11-01), Payson
patent: 9-84025 (1997-03-01), None
patent: WO 2004-047977 (2004-05-01), None
patent: 10-2004-0047977 (2004-06-01), None
patent: 10-2004-0095399 (2004-11-01), None
patent: 2162280 (1994-10-01), None
patent: WO 03-036981 (2003-05-01), None
patent: WO 03/075518 (2003-09-01), None
patent: WO 2004/008642 (2004-01-01), None
Schwarz et al. "Joint Scalable Video Model JSVM-2", ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 Q.6, JVT-O202, 15th Meeting, 2005, pp. 1-31.
Schwarz et al. "SNR-scalable extension of H.264/AVC", ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 Q.6, JVT-035, 10th Meeting, 2003, pp. 3113-3116.
Richardson “Video Coding H.264 and MPEG-4—New Generation Standards.” Technosfera, Moscow, 2003, pp. 230-235.
Office Action dated Nov. 24, 2010, issued in Japanese Application No. 2008-502915.
Haruhisa Kato, "A Study on Fast Weighting Factor Determination for H.264/MPEG-4 AVC Weighted Prediction", Information Technology Letters, vol. 3, Aug. 20, 2004, pp. 229-232.
Mikio Takagi, “Handbook of Image Analysis [Revised Edition],” University of Tokyo Press, Sep. 10, 2004, pp. 1460-1468 and 1493-1498.
Communication dated May 9, 2011 issued by the European Patent Office in counterpart European Patent Application No. 06716510.0.
Shen, Y. et al. "Adaptive Weighted Prediction in Video Coding", IEEE International Conference on Multimedia and Expo (ICME), Jun. 27, 2004, pp. 427-430.
Kato, H. et al. "Weighting Factor Determination Algorithm for H.264/MPEG-4 AVC Weighted Prediction", IEEE 6th Workshop on Multimedia Signal Processing, Sep. 29, 2004, pp. 27-30.
Flierl, M. et al. "Rate-Constrained Multihypothesis Prediction for Motion-Compensated Video Compression", IEEE Transactions on Circuits and Systems for Video Technology, vol. 12, No. 11, Nov. 1, 2002, pp. 957-969.
Tubaro, S. et al. "A two layers video coding scheme for ATM networks", Signal Processing: Image Communication, Elsevier Science Publishers B.V., Jun. 1, 1991, pp. 129-141.
