Patent
1992-03-27
1993-01-26
MacDonald, Allen R.
G06F 15/18
active
051827948
ABSTRACT:
A teaching method for a recurrent neural network having hidden, output and input neurons calculates weighting errors over a limited number of propagations of the network. This process permits conventional teaching sets, such as those used with feedforward networks, to be used with recurrent networks. The teaching outputs are substituted for the computed activations of the output neurons in the forward propagation and error correction stages. Back-propagated error from the last propagation is assumed to be zero for the hidden neurons. A method of reducing drift of the network with respect to a modeled process is also described, as is a forced cycling method for eliminating the time lag between network input and output.
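The procedure described above amounts to a truncated form of back-propagation through a recurrent network in which the known teaching outputs stand in for the computed output activations. The sketch below illustrates that combination; the network sizes, the single combined weight matrix, the learning rate and every function and variable name are illustrative assumptions rather than the patented implementation.

```python
import numpy as np

# Minimal sketch (not the patented implementation) of the ideas in the
# abstract: a small fully recurrent network is unrolled for a limited number
# of propagations, the teaching outputs are fed back in place of the computed
# output activations, and the error back-propagated into the hidden neurons
# from beyond the last propagation is assumed to be zero.  All sizes, names
# and the learning rate are illustrative assumptions.

rng = np.random.default_rng(0)

N_IN, N_HID, N_OUT = 3, 5, 2       # input, hidden and output neurons
N_STEPS = 4                        # limited number of propagations
LEARNING_RATE = 0.1

# One weight matrix maps the full state (inputs, hidden, outputs, bias) to
# the recurrent units (hidden and output neurons).
N_STATE = N_IN + N_HID + N_OUT + 1
N_UNITS = N_HID + N_OUT
W = rng.normal(scale=0.1, size=(N_UNITS, N_STATE))


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def train_step(inputs, targets, W):
    """One truncated teaching pass over N_STEPS propagations.

    inputs  -- shape (N_STEPS, N_IN), teaching inputs
    targets -- shape (N_STEPS, N_OUT), teaching outputs
    """
    hidden = np.zeros(N_HID)
    output = np.zeros(N_OUT)
    states, activations = [], []

    # Forward propagation: the teaching output of the previous step is fed
    # back instead of the computed output activation.
    for t in range(N_STEPS):
        fed_back = targets[t - 1] if t > 0 else output
        state = np.concatenate([inputs[t], hidden, fed_back, [1.0]])
        act = sigmoid(W @ state)
        hidden, output = act[:N_HID], act[N_HID:]
        states.append(state)
        activations.append(act)

    # Error correction: the hidden-neuron error arriving from beyond the
    # last propagation is taken to be zero, so the backward pass starts with
    # an empty hidden error.  The stored states already hold the teaching
    # outputs in the output slots, which is how the substitution enters the
    # weight-error calculation in this sketch.
    dW = np.zeros_like(W)
    hidden_err = np.zeros(N_HID)
    for t in reversed(range(N_STEPS)):
        act = activations[t]
        out_err = targets[t] - act[N_HID:]
        delta = np.concatenate([hidden_err, out_err]) * act * (1.0 - act)
        dW += np.outer(delta, states[t])
        # Error carried back to the hidden part of the previous state.
        hidden_err = (W.T @ delta)[N_IN:N_IN + N_HID]

    return W + LEARNING_RATE * dW


# A conventional teaching set, as would be used with a feedforward network.
inputs = rng.uniform(size=(N_STEPS, N_IN))
targets = rng.uniform(size=(N_STEPS, N_OUT))
W = train_step(inputs, targets, W)
```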
REFERENCES:
"Generalization of Back-Propagation to Recurrent Neural Networks", F. J. Pineda, Physical Review Letters, vol. 59, no. 19, pp. 2229-2232, Nov. 9, 1987.
"A Learning Algorithm for Continually Running Fully Recurrent Neural Networks", Ronald J. Williams and David Zipser, Neural Computation, vol. 1, pp. 270-280, 1989.
"Dynamics and Architecture for Neural Computation", Fernando J. Pineda, Journal of Complexity, vol. 4, pp. 216-245, 1988.
Davis Wesley
Gasperi Michael L.
Allen-Bradley Company Inc.
MacDonald Allen R.