Code conversion method, apparatus, program, and storage medium

Data processing: speech signal processing – linguistics – language – speech signal processing for storage or transmission

Reexamination Certificate


Details

C704S221000, C704S223000, C370S466000


active

07630884

ABSTRACT:
The object of this invention is to convert code obtained by encoding speech with one particular system into code that can be decoded by another system, with high speech quality and a low computational load, when transmitting speech signals between different systems. The invention comprises an adaptive codebook (ACB) delay search range control circuit (1250 in FIG. 7) for calculating a search range control value from a first adaptive codebook delay that is stored and held and a second adaptive codebook delay that is stored and held, and an adaptive codebook encoding circuit (1220 in FIG. 7) for calculating the autocorrelation between the speech signal and the excitation signal for each delay within the range stipulated by said search range control value, selecting the delay that maximizes the autocorrelation as the second adaptive codebook delay, and supplying the code corresponding to said second adaptive codebook delay as the code of the adaptive codebook delay in said second code string.
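The restricted adaptive-codebook search described above can be sketched in code. The following is a minimal illustration, not the patented implementation: the function name, the normalized-correlation criterion, and the delay bounds (20–143 samples and a 40-sample subframe, typical of CS-ACELP-style coders such as G.729) are all assumptions for the sketch. It searches only delays within a window around a center delay (standing in for the range stipulated by the search range control value) and returns the delay whose past-excitation segment best matches the target signal.

```python
import numpy as np

def acb_delay_search(target, excitation, center_delay, search_half_range,
                     min_delay=20, max_delay=143, frame_len=40):
    """Search adaptive-codebook delays near center_delay (illustrative sketch).

    target      -- current subframe of the speech/target signal (frame_len samples)
    excitation  -- buffer of past excitation samples
    Returns the delay in [center_delay - half_range, center_delay + half_range]
    (clipped to the legal delay range) that maximizes the normalized
    correlation between target and the delayed excitation segment.
    """
    lo = max(min_delay, center_delay - search_half_range)
    hi = min(max_delay, center_delay + search_half_range)
    best_delay, best_score = lo, -np.inf
    for delay in range(lo, hi + 1):
        # Segment of past excitation starting `delay` samples back.
        start = len(excitation) - delay
        seg = excitation[start:start + frame_len]
        if len(seg) < frame_len:            # short lags: repeat the segment
            seg = np.resize(seg, frame_len)
        energy = np.dot(seg, seg)
        if energy == 0.0:
            continue
        # Normalized correlation: (target . seg)^2 / (seg . seg)
        score = np.dot(target, seg) ** 2 / energy
        if score > best_score:
            best_score, best_delay = score, delay
    return best_delay
```

Restricting the loop to `[lo, hi]` is what yields the reduced computational load: instead of testing every legal delay, only the few candidates near the delay mapped from the first code string are evaluated.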

REFERENCES:
patent: 5664055 (1997-09-01), Kroon
patent: 5699485 (1997-12-01), Shoham
patent: 5704003 (1997-12-01), Kleijn et al.
patent: 5778334 (1998-07-01), Ozawa et al.
patent: 5873058 (1999-02-01), Yajima et al.
patent: 5884252 (1999-03-01), Ozawa
patent: 5995923 (1999-11-01), Mermelstein et al.
patent: 6584441 (2003-06-01), Ojala et al.
patent: 6829579 (2004-12-01), Jabri et al.
patent: 7016831 (2006-03-01), Suzuki et al.
patent: 7092875 (2006-08-01), Tsuchinaga et al.
patent: 7142559 (2006-11-01), Choi et al.
patent: 7184953 (2007-02-01), Jabri et al.
patent: 7263481 (2007-08-01), Jabri et al.
patent: 7315814 (2008-01-01), Vainio et al.
patent: 7318024 (2008-01-01), Murashima
patent: 7433815 (2008-10-01), Jabri et al.
patent: 2002/0196762 (2002-12-01), Choi et al.
patent: 0 820 052 (1998-01-01), None
patent: 61-180299 (1986-08-01), None
patent: 01-152894 (1989-06-01), None
patent: 04-048239 (1992-08-01), None
patent: 08-046646 (1996-02-01), None
patent: 08-146997 (1996-06-01), None
patent: 08-185199 (1996-07-01), None
patent: 08-328597 (1996-12-01), None
patent: 09-321783 (1997-12-01), None
patent: 11-041286 (1999-02-01), None
patent: 2000-332749 (2000-11-01), None
patent: WO 99/65017 (1999-12-01), None
patent: 2001-0022714 (2001-03-01), None
patent: WO 00/48170 (2000-08-01), None
Code-Excited Linear Prediction (CELP): High-Quality Speech at Very Low Bit Rates, by Manfred Schroeder et al., 1985 IEEE, pp. 937-940.
Improving Capability of Speech Coders in Clean and Frame Erasured Channel Environments, by Hong-Goo Kang et al., 2000 IEEE, pp. 78-80.
ITU-T, Coding of Speech at 8 kbit/s Using Conjugate-Structure Algebraic-Code-Excited Linear-Prediction (CS-ACELP), ITU-Recommendation G.729, Mar. 1996, pp. 10-14.
Kyung Tae Kim, et al., “An Efficient Transcoding Algorithm for G.723.1 and EVRC Speech Coders,” 2001 IEEE 54th Vehicular Technology Conference, Proceedings, vol. 1 of 4, Conf. 54, XP010562224.
