Neural network architecture for Gaussian components of a mixture

Details

Classifications: G06F 15/18, G06E 3/00

Type: Patent
Status: Active
Patent number: 5790758

ABSTRACT:
A neural network for classifying input vectors into an outcome class under the assumption that the classes are characterized by mixtures of component populations, each having a multivariate Gaussian likelihood distribution. The neural network comprises an input layer for receiving the components of an input vector, two hidden layers for generating a number of outcome class component values, and an output layer. The first hidden layer includes a number of first layer nodes, each connected to receive the input vector components and generate a first layer output value representing the absolute value of the sum of a function of the difference between each input vector component and a threshold value. The second hidden layer includes a plurality of second layer nodes, each connected to the first layer nodes and generating an outcome class component value representing a function related to the exponential of the negative square of a function of the sum of the first layer output values times a weighting value. The output layer includes a plurality of output nodes, each associated with an outcome class, for generating a value representing the likelihood that the input vector belongs to that outcome class.
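
For readers who want to see the data flow the abstract describes, below is a minimal NumPy sketch of the three-stage architecture. The dimensions, weights, and thresholds are illustrative assumptions drawn at random, not parameters from the patent, and the functional forms follow the abstract's wording only loosely (a sketch, not the patented implementation).

```python
import numpy as np

# Illustrative sketch of the three-layer structure in the abstract.
# All weights and thresholds below are random placeholders, NOT values
# or forms prescribed by the patent.
rng = np.random.default_rng(0)

n_in = 4       # input vector dimension
n_hidden1 = 4  # first-hidden-layer nodes
n_comp = 3     # mixture components (second hidden layer)
n_class = 2    # outcome classes (output layer)

A = rng.standard_normal((n_hidden1, n_in))            # first-layer weights
t = rng.standard_normal(n_hidden1)                    # first-layer thresholds
W = np.abs(rng.standard_normal((n_comp, n_hidden1)))  # second-layer weights
C = np.abs(rng.standard_normal((n_class, n_comp)))    # class mixing weights
C /= C.sum(axis=1, keepdims=True)                     # normalize per class

def classify(x):
    # First hidden layer: absolute value of a weighted sum of the
    # differences between input components and a threshold value.
    z = np.abs(A @ x - t)
    # Second hidden layer: exponential of the negative square of the
    # weighted sum of first-layer outputs, one node per component.
    g = np.exp(-np.square(W @ z))
    # Output layer: per-class weighted sum of component values,
    # interpreted as the class likelihoods.
    scores = C @ g
    return scores, int(np.argmax(scores))

scores, label = classify(rng.standard_normal(n_in))
print("class likelihoods:", scores, "-> class", label)
```

Read as a Gaussian mixture classifier, the first layer computes a whitened distance of the input from a component mean, the second layer maps that distance through a Gaussian-shaped kernel to a component likelihood, and the output layer mixes the component likelihoods for each class.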

REFERENCES:
patent: 5220618 (1993-06-01), Sirat et al.
patent: 5276771 (1994-01-01), Manukian et al.
patent: 5408585 (1995-04-01), Burel
patent: 5455892 (1995-10-01), Minot et al.
patent: 5479576 (1995-12-01), Watanabe et al.
patent: 5568591 (1996-10-01), Minot et al.
patent: 5572597 (1996-11-01), Chang et al.
Streit et al., Maximum likelihood training of probabilistic neural networks, IEEE Transactions on Neural Networks, vol. 5, pp. 764-783, Sep. 1994.
Streit, A neural network for optimum Neyman-Pearson classification, IJCNN International Joint Conference on Neural Networks, pp. 685-690, Jun. 21, 1990.
Goodman et al., A learning algorithm for multi-layer perceptrons with hard-limiting threshold units, 1994 IEEE Conference on Neural Networks, pp. 193-197, Jul. 2, 1994.
Lippmann, A critical overview of neural network pattern classifiers, 1991 IEEE Workshop, pp. 266-275, Oct. 1, 1991.
Musavi et al., Improving the performance of probabilistic neural networks, IJCNN International Joint Conference on Neural Networks, pp. 595-600, Jun. 11, 1992.
Farrell et al., Neural tree network/vector quantization probability estimators for speaker recognition, Neural Networks for Signal Processing IV, Proceedings of the 1994 IEEE Workshop, pp. 279-288, Sep. 8, 1994.
Tarassenko et al., Supervised and unsupervised learning in radial basis function classifiers, IEE Proceedings, pp. 210-216, Aug. 1994.
McMichael, Bayesian growing and pruning strategies for MAP-optimal estimation of Gaussian mixture models, Fourth International Conference on Artificial Neural Networks, pp. 364-368, Jun. 28, 1995.
Kosonocky et al., A continuous density neural tree network word spotting system, 1995 International Conference on Acoustics, Speech, and Signal Processing, pp. 305-308, May 12, 1995.
