Packet loss concealment for sub-band predictive coding based...

Data processing: speech signal processing, linguistics, language translation, and audio compression/decompression – Speech signal processing – For storage or transmission

Reexamination Certificate


Details

704/230, 704/229, 704/200.1, 704/204, 704/212, 455/70, 455/45, 455/205, 455/307, 341/143


active

08078458

ABSTRACT:
A technique is described for concealing the effect of a lost frame in a series of frames representing an encoded audio signal in a sub-band predictive coding system. In accordance with the technique, a first synthesized sub-band audio signal is synthesized by performing waveform extrapolation based on a stored first sub-band decoded audio signal. A second synthesized sub-band audio signal is likewise synthesized by performing waveform extrapolation based on a stored second sub-band decoded audio signal. The first and second synthesized sub-band audio signals are then combined to generate a synthesized full-band output audio signal corresponding to the lost frame.
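The abstract only sketches the signal flow, so a toy illustration may help. The following is a minimal Python sketch of this general class of technique, not the patented method itself: it assumes a two-band codec in the style of ITU-T G.722, uses simple periodic (pitch-cycle) repetition as the waveform extrapolator, and substitutes a 2-tap Haar-like filter bank for the real QMF synthesis stage. All function names and parameters (estimate_period, extrapolate, conceal_lost_frame, qmf_synthesis, min_lag, frame_len, and so on) are invented for the example.

```python
import numpy as np

def estimate_period(history, min_lag=32, max_lag=160):
    """Estimate a pitch period from the stored decoded signal by
    maximizing normalized autocorrelation over candidate lags."""
    best_lag, best_score = min_lag, -np.inf
    for lag in range(min_lag, min(max_lag, len(history) // 2) + 1):
        a = history[-lag:]
        b = history[-2 * lag:-lag]
        denom = np.sqrt(np.dot(a, a) * np.dot(b, b)) + 1e-12
        score = np.dot(a, b) / denom
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

def extrapolate(history, n_samples):
    """Waveform extrapolation: periodically repeat the last pitch cycle
    of the stored decoded sub-band signal to cover the lost frame."""
    lag = estimate_period(history)
    cycle = history[-lag:]
    reps = -(-n_samples // lag)  # ceiling division
    return np.tile(cycle, reps)[:n_samples]

def qmf_synthesis(low, high):
    """Toy 2-tap (Haar-like) QMF synthesis: combine the two sub-band
    signals into a full-band signal at twice the sampling rate. A real
    codec such as G.722 uses a longer (24-tap) QMF filter bank."""
    out = np.empty(2 * len(low))
    out[0::2] = (low + high) / 2.0
    out[1::2] = (low - high) / 2.0
    return out

def conceal_lost_frame(low_history, high_history, frame_len):
    """Synthesize one lost frame: extrapolate each sub-band waveform
    from its stored decoded history, then combine into full-band output."""
    low_synth = extrapolate(low_history, frame_len)
    high_synth = extrapolate(high_history, frame_len)
    return qmf_synthesis(low_synth, high_synth)

if __name__ == "__main__":
    # Stored decoded sub-band signals from the frames preceding the loss:
    # a 100 Hz tone in the low band, low-level noise in the high band.
    n_hist = 320  # e.g. 40 ms of 8 kHz sub-band samples
    t = np.arange(n_hist)
    low_hist = np.sin(2 * np.pi * 100 * t / 8000)
    high_hist = 0.05 * np.random.default_rng(0).standard_normal(n_hist)
    out = conceal_lost_frame(low_hist, high_hist, frame_len=80)
    print(out.shape)  # (160,) full-band samples for one 10 ms frame
```

A production concealment scheme would additionally re-initialize the predictive decoder's internal states after the loss and cross-fade the extrapolated waveform into the first good frame to avoid audible discontinuities; those steps are omitted here for brevity.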

REFERENCES:
patent: 4935963 (1990-06-01), Jain
patent: 6351730 (2002-02-01), Chen
patent: 6408267 (2002-06-01), Proust
patent: 6549587 (2003-04-01), Li
patent: 6665637 (2003-12-01), Bruhn
patent: 7031926 (2006-04-01), Makinen et al.
patent: 7047187 (2006-05-01), Cheng et al.
patent: 7047190 (2006-05-01), Kapilow
patent: 7177804 (2007-02-01), Wang et al.
patent: 7233893 (2007-06-01), Sung et al.
patent: 7272554 (2007-09-01), Serizawa et al.
patent: 7467072 (2008-12-01), Adam
patent: 7467082 (2008-12-01), Sung et al.
patent: 7502734 (2009-03-01), Jelinek
patent: 7619995 (2009-11-01), El-Hennawey et al.
patent: 7693710 (2010-04-01), Jelinek et al.
patent: 7707034 (2010-04-01), Sun et al.
patent: 7805293 (2010-09-01), Takada et al.
patent: 2002/0080779 (2002-06-01), Leblanc
patent: 2002/0102942 (2002-08-01), Taori et al.
patent: 2002/0123887 (2002-09-01), Unno
patent: 2003/0074197 (2003-04-01), Chen
patent: 2003/0200083 (2003-10-01), Serizawa et al.
patent: 2004/0078194 (2004-04-01), Liljeryd et al.
patent: 2005/0015242 (2005-01-01), Gracie et al.
patent: 2005/0154584 (2005-07-01), Jelinek et al.
patent: 2006/0045138 (2006-03-01), Black et al.
patent: 2007/0147518 (2007-06-01), Bessette
patent: 2007/0150262 (2007-06-01), Mori et al.
patent: 2007/0174047 (2007-07-01), Anderson et al.
patent: 2007/0213976 (2007-09-01), Sung et al.
patent: 2007/0225971 (2007-09-01), Bessette
patent: 2008/0027711 (2008-01-01), Rajendran et al.
patent: 2008/0027715 (2008-01-01), Rajendran et al.
patent: 2008/0033585 (2008-02-01), Zopf
patent: 2008/0046233 (2008-02-01), Chen et al.
patent: 2008/0046236 (2008-02-01), Thyssen et al.
patent: 2008/0046237 (2008-02-01), Zopf et al.
patent: 2008/0046248 (2008-02-01), Chen et al.
patent: 2008/0046249 (2008-02-01), Thyssen et al.
patent: 2008/0046252 (2008-02-01), Zopf et al.
patent: 2008/0092019 (2008-04-01), Lakaniemi et al.
patent: 2008/0126086 (2008-05-01), Vos et al.
patent: 2009/0232228 (2009-09-01), Thyssen
patent: 2009/0240492 (2009-09-01), Zopf et al.
patent: 2009/0299755 (2009-12-01), Ragot et al.
patent: 2009/0319264 (2009-12-01), Yoshida et al.
patent: 2010/0121646 (2010-05-01), Ragot et al.
patent: 2010/0228541 (2010-09-01), Oshikiri
patent: 1096477 (2001-05-01), None
patent: 1288916 (2003-03-01), None
patent: 1684267 (2006-07-01), None
patent: 2008022176 (2008-02-01), None
patent: 2008022181 (2008-02-01), None
patent: 2008022184 (2008-02-01), None
patent: 2008022200 (2008-02-01), None
patent: 2008022207 (2008-02-01), None
Chibani, Mohamed et al., “Resynchronization of the Adaptive Codebook in a Constrained CELP CODEC After a Frame Erasure”, Acoustics, Speech and Signal Processing, ICASSP 2006 Proceedings, IEEE International Conference, Toulouse, (May 14-19, 2006), pp. 1-13.
Thyssen, Jes et al., “A Candidate for the ITU-T G.722 Packet Loss Concealment Standard”, ICASSP, (Apr. 15, 2007), pp. 549-552.
Serizawa, Masahiro et al., “A Packet Loss Concealment Method Using Pitch Waveform Repetition and Internal State Update on the Decoded Speech for the Sub-band ADPCM Wideband Speech CODEC”, Speech Coding, IEEE Workshop Proceedings, (Oct. 6-9, 2002), pp. 68-70.
Shetty, Niranjan et al., “Improving the Robustness of the G.722 Wideband Speech Codec to Packet Losses for Voice over WLANs”, Acoustics, Speech and Signal Processing, ICASSP Proceedings, IEEE International Conference, Toulouse, (May 14-19, 2006), pp. V365-V368.
Thyssen, Jes et al., “Detailed Description of the Broadcom G.722 PLC Candidate Algorithm”, ITU-T Study Group 16, (Nov. 2006), 37 pages.
ITU-T Recommendation G.728 Annex I, “Coding of Speech at 16 kbit/s Using Low-Delay Code Excited Linear Prediction: Frame or Packet Loss Concealment for the LD-CELP Decoder”, (May 1999), 20 pages.
Goodman, David J. et al., “Waveform Substitution Techniques for Recovering Missing Speech Segments in Packet Voice Communications”, IEEE Transactions on Acoustics, Speech and Signal Processing, vol. 34, no. 6, (Dec. 1986), pp. 1440-1448.
