Tree-like perceptron and a method for parallel distributed training


Details

395/21, 395/23, G06E 1/00, G06E 3/00

Patent

active

055925898

ABSTRACT:
Constraints placed on the structure of a conventional multi-layer network allow the learning rules to be simplified and reduce the probability of becoming trapped in local minima. These constraints include neurons that are either inhibitory or excitatory. Also, for each neuron in the hidden layer there is at most one synapse connecting it to a corresponding neuron in the output layer. The result is a tree-like structure that facilitates implementation of large-scale electronic networks and allows parts of the network to be trained in parallel. Additionally, each neuron in the hidden layer receives a reinforcement signal from its corresponding neuron in the output layer that is independent of the magnitude of the synapses posterior to the hidden-layer neuron. There may be multiple hidden layers, wherein each layer has a plurality of neurons and each neuron in an anterior layer connects to only one neuron in any posterior layer. In training, the weights of synapses connected anterior to a neuron are adjusted with the polarity opposite that of the error signal when the polarity determined for the neuron's path is inhibitory, and with the polarity of the error signal when the polarity determined for that path is excitatory.
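The abstract describes both an architecture (hidden neurons with a fixed excitatory or inhibitory polarity, each connected by at most one synapse to a corresponding output neuron) and a local training rule (anterior weights adjusted with the error polarity on excitatory paths and with the opposite polarity on inhibitory paths). The Python sketch below is one possible reading of that description, not the patented method: the dimensions, the tanh nonlinearity, the learning rate, and the exact update expressions are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 4 inputs, 6 hidden neurons, 1 output neuron.
n_inputs, n_hidden = 4, 6
# Each hidden neuron lies on a path with a fixed polarity: +1 excitatory, -1 inhibitory.
path_sign = np.where(np.arange(n_hidden) % 2 == 0, 1.0, -1.0)
# Anterior weights (input -> hidden) and exactly one posterior synapse per hidden neuron.
W_in = rng.normal(scale=0.1, size=(n_hidden, n_inputs))
w_out = np.abs(rng.normal(scale=0.1, size=n_hidden)) * path_sign

def forward(x):
    h = np.tanh(W_in @ x)        # hidden activations
    y = np.tanh(w_out @ h)       # single output neuron
    return h, y

def train_step(x, target, lr=0.05):
    """One local update: the output error's polarity is broadcast to every
    hidden neuron and flipped on inhibitory paths; the reinforcement does not
    depend on the magnitude of the posterior synapse (an assumed reading)."""
    global W_in, w_out
    h, y = forward(x)
    err = target - y
    # Posterior synapses: simple delta rule, then re-impose each path's fixed polarity.
    w_out = np.abs(w_out + lr * err * h) * path_sign
    # Anterior synapses: use only the error polarity times the path polarity.
    reinforcement = np.sign(err) * path_sign
    W_in += lr * reinforcement[:, None] * np.outer(1.0 - h**2, x)
    return err

# Usage: drive the output toward the sign of the first input component.
for _ in range(2000):
    x = rng.normal(size=n_inputs)
    train_step(x, np.tanh(2.0 * x[0]))

Because each hidden neuron's reinforcement depends only on the error polarity at its single corresponding output neuron, and not on the magnitude of any posterior synapse, disjoint subtrees of such a network can in principle be updated concurrently, which is how the abstract motivates parallel distributed training.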

REFERENCES:
patent: 3950733 (1976-04-01), Cooper et al.
patent: 4326259 (1982-04-01), Cooper et al.
patent: 4760604 (1988-07-01), Cooper et al.
patent: 4802103 (1989-01-01), Faggin et al.
patent: 4807168 (1989-02-01), Moopenn et al.
patent: 4951239 (1990-08-01), Andes et al.
patent: 5033006 (1991-07-01), Ishizuka et al.
patent: 5045713 (1991-09-01), Shima
patent: 5056037 (1991-10-01), Eberhardt
patent: 5060278 (1991-10-01), Fukumizu
patent: 5087826 (1992-02-01), Holler et al.
patent: 5095443 (1992-03-01), Watanabe
patent: 5165010 (1992-11-01), Masuda et al.
patent: 5239594 (1993-08-01), Yoda
patent: 5247206 (1993-09-01), Castro
patent: 5253329 (1993-10-01), Villarreal et al.
patent: 5402522 (1995-03-01), Alkon et al.
P. D. Wasserman & T. Schwartz, "Neural Networks: Part 2: What are they and why is everybody so interested in them now?", IEEE Expert, pp. 10-15, Spring 1988.
A. F. Murray & A. V. W. Smith, "Asynchronous VLSI Neural Networks Using Pulse-Stream Arithmetic", IEEE Journal of Solid-State Circuits, vol. 23, No. 3, Jun. 1988, pp. 688-697.
A. F. Murray et al., "Asynchronous VLSI Neural Networks Using Pulse-Stream Arithmetic", IEEE Journal of Solid-State Circuits, vol. 23, Jun. 1988, pp. 56-65.
H. P. Graf et al., "VLSI Implementation of a Neural Network Model" IEEE Computer, Mar. 1988.
R. R. Leighton, "The Aspirin/Migraines Neural Network Software", User's Manual, Release V6.0, Mitre Corporation, Oct. 29, 1992.
S. Y. Kung et al., "A Unified Systolic Architecture for Artificial Neural Networks", Journal of Parallel & Distributed Computing, vol. 6, pp. 358-387 (1989).
A. G. Barto et al., "Gradient Following Without Back-Propagation In Layered Networks", Proc. IEEE First Annual Int'l Conference on Neural Networks. II: 629-636 (San Diego, CA, 1987).
R. Hecht-Nielsen, "Theory of the Backpropagation Neural Network", Proc. Int. Joint Conf. Neural Networks, vol. 1, pp. 593-605, 1989.
F. Crick et al., "Certain Aspects of the Anatomy and Physiology of the Cerebral Cortex", In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition. vol. 2 (eds Rumelhart, D. E., McClelland, J. L. and the PDP Research Group), pp. 333-371 (MIT Press, Cambridge, MA 1986).
D. E. Rumelhart et al., "Learning Internal Representations by Error Propagation", In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition. vol. 1. (eds Rumelhart, D. E. and McClelland, J. L. and the PDP Research Group) pp. 318-362 (MIT Press, Cambridge, MA, 1986).
T. H. Brown et al., "Hebbian Synapses: Biophysical Mechanisms and Algorithms", Annu. Rev. Neurosci. 1990, vol. 13, pp. 475-511.
S. T. Kim et al., "Algorithmic Transformations for Neural Computing and Performance of Supervised Learning on a Dataflow Machine", IEEE Trans. on Software Engineering, vol. 18, No. 7, Jul. 1992, pp. 613-623.
W.-M. Lin et al., "Algorithmic Mapping of Neural Network Models onto Parallel SIMD Machines", IEEE Trans. on Computers, vol. 40, No. 12, Dec. 1991, pp. 1390-1401.
A. Blumer et al., "Learnability and the Vapnik-Chervonenkis Dimension", Journal of the Association for Computing Machinery, vol. 36, No. 4, Oct. 1989, pp. 929-965.
R. D. Hawkins et al., "A Cellular Mechanism of Classical Conditioning in Aplysia: Activity-Dependent Amplification of Presynaptic Facilitation", Science, vol. 219, 28 Jan. 1983.
P. Mazzoni et al., "A more biologically plausible learning rule for neural networks", Proc. Natl. Acad. Sci. USA, vol. 88, pp. 4433-4437, May 1991, Neurobiology.
J. Hart, Jr. et al., "Neural subsystems for object knowledge", Nature, vol. 359, 3 Sep. 1992, pp. 61-65.
F. Crick, "The recent excitement about neural networks", Nature, vol. 337, 12 Jan. 1989, pp. 129-132.
K. Hornik, "Approximation Capabilities of Multilayer Feedforward Networks", Neural Networks, vol. 4, pp. 251-257, 1991.
B. Giraud et al., "Optimal Approximation of Square Integrable Functions by a Flexible One-Hidden-Layer Neural Networks of Excitatory and Inhibitory Neuron Pairs", Neural Networks, vol. 4, pp. 803-815, 1991.
T. J. Sejnowski et al., "Parallel Networks that Learn to Pronounce English Text", Complex Systems, vol. 1, (1987), pp. 145-168.
M. Gori et al., "On the Problem of Local Minima in Back-Propagation", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, No. 1, Jan. 1992, pp. 76-85.
J. F. Kolen et al., "Learning in Parallel Distributed Processing Networks: Computational Complexity and Information Content", IEEE Transactions on Systems, Man, and Cybernetics, vol. 21, No. 2, Mar./Apr. 1991, pp. 359-367.
P. S. Goldman-Rakic, "Changing Concepts of Cortical Connectivity: Parallel Distributed Cortical Networks", Neurobiology of Neocortex, pp. 177-202 (eds. P. Rakic et al., John Wiley & Sons, 1988).
M. S. Livingstone et al., "Psychophysical Evidence for Separate Channels for the Perception of Form, Color, Movement, and Depth", Journal of Neuroscience, Nov. 1987, 7(11): pp. 3416-3466.
I. McLaren, "The Computational Unit as an Assembly of Neurones: an Implementation of an Error Correcting Learning Algorithm", In: The Computing Neuron (eds. Durbin, R., Miall, C., and Mitchison, G.) pp. 160-179 (Addison-Wesley, Reading, MA, 1989).
J. C. Houk et al., "Distributed Sensorimotor Learning", pp. 1-28, to appear in Tutorials in Motor Behavior II, G. Stelmach et al., eds., Elsevier Science Publishers B.V., Amsterdam, the Netherlands, 1992.
S. Zeki et al., "The functional logic of cortical connections", Nature, vol. 335, 22 Sep. 1988, pp. 311-317.
J. S. Vitter et al., "Learning in Parallel", Information and Computation, vol. 96, pp. 179-202, (1992).
R. Durbin et al., "Product Units: A Computationally Powerful and Biologically Plausible Extension to Backpropagation Networks", Neural Computation, vol. 1, pp. 133-142, (1989).
T. Carney et al., "Parallel processing of motion and colour information", Nature, vol. 328, 13 Aug. 1987, pp. 647-649.
D. H. Ballard et al., "Parallel visual computation", Nature, vol. 306, 3 Nov. 1983, pp. 21-26.
M. Jabri et al., "Weight Perturbation: An Optimal Architecture & Learning Technique for Analog VLSI Feedforward and Recurrent Multilayer Networks", Neural Computation, vol. 3, pp. 546-565, (1991).
Y. Qiao et al., "Local learning algorithm for optical neural networks", Applied Optics, vol. 31, No. 17, 10 Jun. 1992, pp. 3285-3288.
J. H. Byrne, "Cellular Analysis of Associative Learning", Physiological Reviews, vol. 67, No. 2, Apr. 1987, pp. 329-439.
L. M. Reyneri et al., "An Analysis on the Performance of Silicon Implementations of Backpropagation Algorithms for Artificial Neural Networks", IEEE Transactions on Computers, vol. 40, No. 12, Dec. 1991, pp. 1380-1389.
M. Constantine-Paton, "NMDA Receptor as a Mediator of Activity-dependent Synaptogenesis in the Developing Brain", Cold Spring Harbor Symposium on Quantitative Biology.
