Decoding input symbols to input/output hidden markoff models

Data processing: speech signal processing, linguistics, language – Recognition


Details

US Classification: 704/255
Type: Reexamination Certificate
Status: active
Patent number: 06226613

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of Invention
The invention relates to a method and apparatus for computation of products of a vector and a sequence of matrices.
2. Description of Related Art
Vector-matrix products occur often in applications such as speech recognition, signature verification, and error correction code decoding. Techniques such as the forward-backward algorithm (FBA) are commonly used in these applications. However, equipment that performs these algorithms requires a large amount of memory to store all the matrices and intermediate matrix products the algorithms need.
For example, when Hidden Markov Models (HMMs) are applied to describe a communication channel, products of sequences of probability density matrices are needed to estimate the a posteriori probabilities of the transmitted symbols given the received symbols. The FBA requires that the forward vectors, produced by recursively multiplying the first vector by the sequence of matrices in the forward part of the algorithm, be stored in memory, and the decoding process can start only after a long sequence of symbols has been received. This is intolerable in many applications (a telephone application, for example) that impose strict constraints on the message delivery delay. Thus, new technology is needed to improve the vector-matrix product calculation so that a decoder can estimate the product without waiting for the whole sequence to be received. Such technology lets a designer trade product estimation accuracy for smaller delays in information delivery.
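For concreteness, the following is a minimal sketch of the standard forward-backward recursion described above, with the channel folded into one probability matrix per received symbol. The names (pi for the first vector, Ms for the matrix sequence, eta for the second vector) are illustrative, not taken from the patent. Note that every forward vector is archived, and nothing can be output until the whole sequence has arrived:

import numpy as np

def forward_backward(pi, Ms, eta):
    """Standard FBA over a sequence of probability matrices.

    pi  : first (row) vector, shape (n,)
    Ms  : list of T matrices, each of shape (n, n)
    eta : second (column) vector, shape (n,)

    Returns, for every time step t, the elementwise product of the
    forward vector alpha_t = pi M_1 ... M_t and the backward vector
    beta_t = M_{t+1} ... M_T eta, which is proportional to the
    a posteriori state probabilities at time t.
    """
    alphas = []                  # ALL forward vectors are archived here
    alpha = pi
    for M in Ms:                 # forward pass
        alpha = alpha @ M
        alphas.append(alpha)

    results = []
    beta = eta                   # backward pass: cannot begin until the
    for t in range(len(Ms) - 1, -1, -1):   # whole sequence is in hand
        results.append(alphas[t] * beta)
        beta = Ms[t] @ beta
    return results[::-1]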
SUMMARY OF THE INVENTION
The invention provides a method and apparatus that perform a fixed-lag computation process. Instead of storing all the forward vectors in memory, a sequence of partial matrix products is computed and stored in memory. A first vector is multiplied by the sequence of matrices recursively to generate the forward vector, as in the standard FBA, but without saving the results in memory, because the algorithm does not have a backward part. Then the sequence of partial matrix products is recursively updated by multiplying each of them by the next matrix in the sequence. The backward vector is estimated as the product of the updated partial product and the second vector. Finally, the forward and backward vectors are multiplied to obtain the desired result, as in the sketch below.
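This is a minimal sketch of that fixed-lag recursion, under the same illustrative naming as above; the lag length and the use of eta to truncate the backward product are assumptions made for the example, not details taken from the patent. Each pending time step holds one partial matrix product; every stored product is right-multiplied by each newly received matrix, and a step is decoded once its product spans `lag` matrices:

import numpy as np
from collections import deque

def fixed_lag_posteriors(pi, eta, matrix_stream, lag):
    """Fixed-lag, forward-only estimation (a sketch, not the patented
    procedure verbatim).

    For each pending time step t, one partial matrix product
    P_t = M_{t+1} ... M_k is stored; every newly received matrix
    right-multiplies all stored products.  After `lag` updates, the
    backward vector is approximated as beta_t ~ P_t @ eta and the
    posterior weights alpha_t * beta_t are emitted, so decoding trails
    the input by only `lag` symbols.
    """
    n = len(pi)
    alpha = pi
    pending = deque()            # at most lag + 1 [alpha_t, P_t] pairs:
    for M in matrix_stream:      # a fixed window, never all T steps
        for entry in pending:
            entry[1] = entry[1] @ M          # update each partial product
        alpha = alpha @ M                    # forward recursion
        pending.append([alpha, np.eye(n)])   # fresh (empty) product
        if len(pending) > lag:
            a_t, P_t = pending.popleft()
            yield a_t * (P_t @ eta)          # approximate posterior at t

Compared with the FBA sketch above, memory grows with the lag rather than with the sequence length, and each symbol is decoded `lag` symbols after it is received, which is the accuracy-for-delay trade described earlier.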
Thus, by estimating the backward vectors recursively in the forward direction, the fixed-lag process of the invention replaces the FBA with a forward-only algorithm, which eliminates the need to save the forward vectors. The price paid is saving and updating the partial matrix products in memory. Accordingly, memory requirements and decoding delay are reduced when the fixed-lag process is used to decode communication channel messages, for example.
Further memory reduction may be achieved if matrix inversion is applied. In this case, only one partial matrix product must be saved in memory. Thus, additional memory savings may be achieved by expending computational resources to invert matrices, as sketched below.
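Below is a sketch of the inversion-based variant under the same illustrative naming. It assumes every matrix in the stream is invertible, and it retains the raw matrices of the current window only so the oldest one can be divided back out (in a decoder they could instead be regenerated from the buffered received symbols). The single stored partial product P slides forward as P <- inv(M_old) @ P @ M_new:

import numpy as np
from collections import deque

def fixed_lag_posteriors_inv(pi, eta, matrix_stream, lag):
    """Inversion-based fixed-lag variant (sketch): a single sliding
    partial product P = M_{t+1} ... M_{t+lag} is maintained, so only
    one partial product is ever stored.  Assumes the matrices are
    invertible.
    """
    n = len(pi)
    P = np.eye(n)                # the one stored partial product
    window = deque()             # raw matrices of the current window,
    alpha = pi                   # kept only to divide the oldest back out
    for M in matrix_stream:
        P = P @ M                # multiply the new matrix in
        window.append(M)
        if len(window) > lag:
            M_old = window.popleft()
            P = np.linalg.inv(M_old) @ P   # divide the oldest matrix out
            alpha = alpha @ M_old          # advance the forward vector
            yield alpha * (P @ eta)        # approximate posterior at t

Compared with the previous sketch, the set of `lag` stored partial products collapses to a single matrix, at the price of one matrix inversion per received symbol.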

