Patent
1989-10-04
1992-03-10
MacDonald, Allen R.
395/23, 395/24, G06F 15/18
Patent
active
050954431
ABSTRACT:
A neural network structure includes input units for receiving input data, and a plurality of neural networks connected in parallel and connected to the input units. The plurality of neural networks learn in turn the correspondence between the input data and teacher data so that the difference between the input data and the teacher data becomes small. The neural network structure further includes output units connected to the plurality of neural networks, for outputting a result of learning on the basis of the results of learning in the plurality of neural networks.
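As a rough illustration only, the sketch below interprets the abstract's successive-approximation idea as a residual-fitting ensemble: several small networks share the same input units, each network is trained in turn on whatever error the previously trained networks leave with respect to the teacher data, and the output units combine the individual results (here, by summation). Every concrete choice in the sketch, including NumPy, the one-hidden-layer tanh networks, plain gradient descent, the summation of outputs, and all function and parameter names, is an assumption made for illustration and is not taken from the patent itself.

# Illustrative sketch (not the patented implementation): a bank of small
# networks connected in parallel to shared input units.  Each network is
# trained in turn on the residual between the teacher data and the sum of
# the outputs of the previously trained networks, so the remaining error
# shrinks with each successive approximation step.  All design choices
# here are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)


class SmallNet:
    """One-hidden-layer tanh network trained by plain gradient descent."""

    def __init__(self, n_in, n_hidden, n_out):
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))

    def forward(self, x):
        self.h = np.tanh(x @ self.W1)
        return self.h @ self.W2

    def train(self, x, target, lr=0.05, epochs=2000):
        for _ in range(epochs):
            y = self.forward(x)
            err = y - target                      # error w.r.t. the residual target
            dW2 = self.h.T @ err
            dW1 = x.T @ ((err @ self.W2.T) * (1.0 - self.h ** 2))
            self.W2 -= lr * dW2 / len(x)
            self.W1 -= lr * dW1 / len(x)


def train_plural_networks(x, teacher, n_nets=3, n_hidden=8):
    """Train the networks in turn; each fits what the earlier ones missed."""
    nets, approx = [], np.zeros_like(teacher)
    for _ in range(n_nets):
        net = SmallNet(x.shape[1], n_hidden, teacher.shape[1])
        net.train(x, teacher - approx)            # successive approximation step
        approx += net.forward(x)
        nets.append(net)
    return nets


def predict(nets, x):
    """Output units combine the results of all networks (here: a sum)."""
    return sum(net.forward(x) for net in nets)


if __name__ == "__main__":
    x = rng.uniform(-1, 1, size=(200, 2))
    teacher = np.sin(np.pi * x[:, :1]) * x[:, 1:]   # toy teacher data
    nets = train_plural_networks(x, teacher)
    print("final mean squared error:", np.mean((predict(nets, x) - teacher) ** 2))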
REFERENCES:
patent: 4760604 (1988-07-01), Cooper et al.
patent: 4805225 (1989-02-01), Clark
patent: 4858147 (1989-08-01), Conwell
patent: 4912655 (1990-03-01), Wood
Kollias et al., "Adaptive Training of Multilayer Neural Networks Using a Least Squares Estimation Technique", IEEE International Conference on Neural Networks, San Diego, CA, Jul. 24-27, 1988, pp. I-383 to I-390.
Lippmann, "An Introduction to Computing with Neural Nets", IEEE ASSP Magazine, Apr. 1987, pp. 4-22.
D. E. Rumelhart, "Learning Internal Representations by Error Propagation", MIT Press, 1986, pp. 318-362.
Y. Anzai, "Connectionist Models and Recognition Information Processing", pp. 1-27, Dec. 11, 1987.
Joyner Roger S.
MacDonald Allen R.
Ricoh Company, Ltd.