High-order entropy error functions for neural classifiers

Data processing: speech signal processing – linguistics – language – Speech signal processing – For storage or transmission

Details

U.S. classifications: C704S219000, C704S232000
Type: Reexamination Certificate
Status: active
Application number: 10332651

ABSTRACT:
An automatic speech recognition system comprising a speech decoder that resolves phone- and word-level information and a vector generator that generates the information vectors on which a neural network classifier (ANN) bases a confidence measure. An error signal is designed that is not subject to false saturation or over-specialization. The error signal is integrated into an error function that is back-propagated through the ANN.
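
The following Python/NumPy sketch illustrates the general idea of back-propagating a higher-order error signal through a small confidence classifier. It is only an illustration under stated assumptions: the toy data, the network shape, and the cubic error signal in high_order_delta are placeholders chosen for demonstration, not the error function actually claimed by the patent. The sketch contrasts the conventional squared-error delta, whose sigmoid-derivative factor can falsely saturate when an output is confidently wrong, with a higher-order alternative whose gradient keeps its sign and does not vanish in that regime.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy "information vectors" and correct/incorrect targets standing in for
# decoder-derived features (illustrative placeholders only).
X = rng.normal(size=(200, 8))
t = (X[:, :2].sum(axis=1) > 0).astype(float).reshape(-1, 1)

# One hidden layer; the sigmoid output is a confidence score in (0, 1).
W1 = rng.normal(scale=0.5, size=(8, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)

def squared_error_delta(y, t):
    # Conventional squared-error delta: the y*(1-y) factor vanishes when the
    # sigmoid output saturates near 0 or 1, even if the output is wrong
    # ("false saturation").
    return (y - t) * y * (1.0 - y)

def high_order_delta(y, t, k=3):
    # Hypothetical higher-order error signal (an odd power of the raw error):
    # it keeps the sign of the error and stays non-zero for confident mistakes.
    return (y - t) ** k

lr, steps = 1.0, 2000
for _ in range(steps):
    h = sigmoid(X @ W1 + b1)                  # hidden activations
    y = sigmoid(h @ W2 + b2)                  # output confidence
    d_out = high_order_delta(y, t)            # output-layer error signal
    d_hid = (d_out @ W2.T) * h * (1.0 - h)    # back-propagate through hidden layer
    W2 -= lr * (h.T @ d_out) / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_hid) / len(X); b1 -= lr * d_hid.mean(axis=0)

print("training accuracy:", float(((y > 0.5) == t.astype(bool)).mean()))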

REFERENCES:
patent: 5276771 (1994-01-01), Manukian et al.
patent: 5509103 (1996-04-01), Wang
patent: 5943661 (1999-08-01), Katz
patent: 6018728 (2000-01-01), Spence et al.
patent: 6456991 (2002-09-01), Srinivasa et al.
Sethi, "Entropy Nets: From Decision Trees to Neural Networks", Proceedings of the IEEE, vol. 78, No. 10, Oct. 1990, pp. 1605-1613.
Rose, "Deterministic Annealing for Clustering, Compression, Classification, Regression, and Related Optimization Problems", Proceedings of the IEEE, vol. 86, No. 11, Nov. 1998, pp. 2210-2239.

Profile ID: LFUS-PAI-O-3928135
