Apparatus and method for coding excitation parameters in a very

Multiplex communications – Channel assignment techniques – Only active channels transmitted

Details

Classifications: 704/203, 704/207, 704/208, H04J 3/17
Type: Patent
Status: active
Number: 056663501

ABSTRACT:
An apparatus codes excitation parameters for very low bit rate voice messaging using a method that processes a voice message to generate speech parameters. The speech parameters are separated (316) to produce a first group of energy parameters and a second group of pitch and voicing parameters. Subsequently, the first group of energy parameters is encoded and compressed using a non-uniform root-mean-square scalar process (318) to create a first plurality of encoded data. Additionally, the second group of pitch and voicing parameters is compressed, encoded, and combined into a single parameter using a three-slope vector encoding process (320) that creates a second plurality of encoded data. Finally, the first and second pluralities of encoded data are multiplexed (322) to create a multiplexed signal for transmission, the multiplexed signal representing the voice message.
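
The abstract describes a four-step pipeline: separate the analyzed speech parameters into an energy group and a pitch/voicing group (316), scalar-quantize the energies against a non-uniform RMS codebook (318), combine pitch and voicing into a single coded parameter (320), and multiplex the two encoded streams (322). The Python sketch below is only meant to make those steps concrete. The frame layout (rms_energy, pitch, voiced fields), the 16-level logarithmic energy codebook, and the placeholder pitch/voicing combiner are illustrative assumptions; the abstract does not disclose the actual codebooks, bit allocations, or the three-slope rule.

import math

# Hypothetical non-uniform (logarithmically spaced) RMS energy codebook:
# 16 levels, i.e. 4 bits per frame for the energy parameter.
ENERGY_CODEBOOK = [2.0 ** (k / 2.0) for k in range(16)]


def separate_parameters(frames):
    """Split per-frame speech parameters into an energy group and a
    pitch/voicing group (step 316)."""
    energies = [f["rms_energy"] for f in frames]
    pitch_voicing = [(f["pitch"], f["voiced"]) for f in frames]
    return energies, pitch_voicing


def encode_energy(energies):
    """Non-uniform RMS scalar quantization (step 318): map each frame
    energy to the index of the nearest codebook level."""
    return [min(range(len(ENERGY_CODEBOOK)),
                key=lambda i: abs(ENERGY_CODEBOOK[i] - e))
            for e in energies]


def encode_pitch_voicing(pitch_voicing):
    """Stand-in for the three-slope vector encoding (step 320): pitch and
    voicing are combined into one coded parameter per frame.  An unvoiced
    frame maps to index 0 and a voiced frame to a quantized log-pitch
    index; the actual three-slope mapping is not given in the abstract."""
    codes = []
    for pitch, voiced in pitch_voicing:
        if not voiced:
            codes.append(0)
        else:
            idx = round(63 * math.log(pitch / 50.0) / math.log(400.0 / 50.0))
            codes.append(1 + max(0, min(62, idx)))
    return codes


def multiplex(energy_codes, pitch_codes):
    """Interleave the two encoded streams frame by frame (step 322) to
    form the multiplexed signal representing the voice message."""
    return [c for pair in zip(energy_codes, pitch_codes) for c in pair]


# Usage with three synthetic frames standing in for analyzed speech parameters.
frames = [
    {"rms_energy": 3.1, "pitch": 120.0, "voiced": True},
    {"rms_energy": 0.4, "pitch": 0.0, "voiced": False},
    {"rms_energy": 9.7, "pitch": 210.0, "voiced": True},
]
energies, pv = separate_parameters(frames)
stream = multiplex(encode_energy(energies), encode_pitch_voicing(pv))
print(stream)  # [3, 28, 0, 0, 7, 44] with the placeholder codebooks above

Frame-wise interleaving is just one plausible reading of the multiplexing step; the disclosed system could equally pack the two encoded streams as separate fields of the transmitted message.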

REFERENCES:
patent: 4076958 (1978-02-01), Fulgum
patent: 5271089 (1993-12-01), Ozawa
patent: 5473727 (1995-12-01), Nishiguchi et al.
