1990-04-30
1992-07-14
MacDonald, Allen R.
G06F 15/18
Patent
active
051310724
DESCRIPTION:
BRIEF SUMMARY
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a neuro computer and, more particularly, to a neuro computer realized by connecting analog neuron chips through an analog time divisional transmission path.
2. Description of the Related Art
In a conventional sequential processing computer (Von Neumann type), it is difficult to adapt the data processing function to variations in usage or environment. Therefore, an adaptive data processing method utilizing parallel distributed processing in a layered network has been proposed. The back propagation method (D. E. Rumelhart, G. E. Hinton, and R. J. Williams, "Learning Internal Representations by Error Propagation", PARALLEL DISTRIBUTED PROCESSING, Vol. 1, pp. 318-364, The MIT Press, 1986) has received particular attention because of its high practicality.
The back propagation method utilizes a layered network comprising nodes called basic units and internal connections with assigned weights. FIG. 1 shows the structure of a basic unit 1. Basic unit 1 carries out a process similar to a continuous neuron model. It is a multiple-input single-output system comprising a multiplication unit 2 for multiplying each of a plurality of inputs (Y_h) by the respective weight (W_ih) of its internal connection, an accumulating unit 3 for adding all the multiplied results, and a threshold value processing unit 4 for producing the final output X_i by applying a nonlinear threshold process to the accumulated value.
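The multiply-accumulate-threshold behavior of basic unit 1 can be sketched as follows. This is a minimal illustration, not the patented circuit; the sigmoid nonlinearity and the bias term `theta` are assumptions, since the text names only a "nonlinear threshold value process".

```python
import math

def basic_unit(inputs, weights, theta=0.0):
    """Model of one basic unit: multiply, accumulate, threshold.

    `theta` is a hypothetical bias; the patent specifies only a
    nonlinear threshold process, modeled here as a sigmoid.
    """
    # Multiplication unit 2: each input Y_h times its weight W_ih
    products = [y * w for y, w in zip(inputs, weights)]
    # Accumulating unit 3: sum of all products
    s = sum(products)
    # Threshold value processing unit 4: nonlinear (sigmoid) output X_i
    return 1.0 / (1.0 + math.exp(-(s - theta)))

x_i = basic_unit([0.5, 1.0, -0.25], [0.8, -0.4, 1.2])
```

The output is bounded in (0, 1), matching the saturating behavior of a continuous neuron model.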
FIG. 2 shows a conceptual view of the structure of a layered neural network. Many basic units (1-h, 1-i, 1-j) are connected in layers as shown in FIG. 2 and the output signal patterns corresponding to the input signal patterns are outputted.
Upon learning, the weights (W_ih) of the connections between respective layers are determined so as to minimize the difference between the output patterns and a target teacher pattern. This learning is performed over a plurality of input patterns. In an association operation, even if an input pattern contains information that is slightly incomplete relative to, and therefore different from, the complete information presented during learning, an output pattern close to the teacher pattern produced during learning is generated, thereby enabling a so-called associative process.
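The learning step described above can be sketched as one back-propagation update on a two-layer network. This is a generic sketch of the cited Rumelhart et al. method, not the patent's hardware learning scheme; the sigmoid activation, squared-error cost, and learning rate `lr` are assumptions.

```python
import math

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def forward(x, w_hidden, w_out):
    """Forward pass through one hidden and one output layer."""
    h = [sigmoid(sum(xi * w for xi, w in zip(x, row))) for row in w_hidden]
    y = [sigmoid(sum(hi * w for hi, w in zip(h, row))) for row in w_out]
    return h, y

def backprop_step(x, t, w_hidden, w_out, lr=0.5):
    """One gradient step minimizing squared error to teacher pattern t."""
    h, y = forward(x, w_hidden, w_out)
    # Output-layer deltas: (y - t) times the sigmoid derivative y(1 - y)
    d_out = [(yi - ti) * yi * (1 - yi) for yi, ti in zip(y, t)]
    # Hidden-layer deltas: errors propagated back through w_out
    d_hid = [hi * (1 - hi) * sum(d * w_out[j][i] for j, d in enumerate(d_out))
             for i, hi in enumerate(h)]
    # Adjust the connection weights between the layers
    for j, d in enumerate(d_out):
        for i, hi in enumerate(h):
            w_out[j][i] -= lr * d * hi
    for i, d in enumerate(d_hid):
        for k, xk in enumerate(x):
            w_hidden[i][k] -= lr * d * xk
    return sum((yi - ti) ** 2 for yi, ti in zip(y, t))
```

Repeated application of `backprop_step` drives the output pattern toward the teacher pattern.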
To realize a neuron computer with such a structure, transmission and reception of data between the basic units constituting the layered network must be conducted with as few wires as possible. This is a problem that must be solved when a complex data process is realized by increasing the number of layers and the number of basic units in the network.
However, the data transmission system explained above requires a large number of wires between two layers, preventing it from being made small; further, its reliability cannot be increased when the layered network is manufactured as a chip. For example, consider a complete connection in which adjacent layers have the same number of units and every basic unit 1 in one layer is connected to every basic unit in the next. In this case, the number of wires increases in proportion to the square of the number of basic units, resulting in a rapid increase in wiring.
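The quadratic growth of the wire count can be checked with simple arithmetic. The comparison with a single shared bus is an illustration of the motivation stated above, assuming the control and timing lines of a time divisional bus are neglected.

```python
def full_connection_wires(n):
    """Wires for a complete connection between two layers of n units each:
    every unit in one layer connects to every unit in the next, so n * n."""
    return n * n

def shared_bus_wires():
    """A single time divisional analog bus is one shared line,
    independent of the number of units (control lines neglected)."""
    return 1

# 10 units per layer -> 100 wires; 100 units -> 10,000 wires
assert full_connection_wires(100) == 100 * full_connection_wires(10)
```

Doubling the number of basic units quadruples the wiring of a complete connection, while the shared bus stays constant.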
SUMMARY OF THE INVENTION
The present invention is made in consideration of the above problem, and an object of the present invention is to provide a neuron computer capable of transmitting and receiving data between the basic units forming a layered network with a minimum number of wires, and further capable of performing forward and learning operations by using a set of analog neuron processors.
A neural network according to the present invention receives analog signals from a common first analog bus provided on the input side of the respective layers in a time divisional manner, calculates a sum of the products of the input analog signals and digital weight data, and provides the resulting analog signals to a second common analog bus.
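The time divisional sum-of-products operation on the shared bus can be sketched as follows. This is a behavioral illustration of the scheme summarized above, not the claimed circuit: each source unit drives the single bus during its own time slot, and a receiving unit multiplies the bus value in each slot by the corresponding digital weight and accumulates the result.

```python
def time_division_transfer(outputs, weights):
    """Simulate one receiving unit on a shared time divisional analog bus.

    `outputs` are the analog outputs of the preceding layer, one per
    time slot; `weights` are the digital weight data for this receiver.
    """
    acc = 0.0
    for slot, y in enumerate(outputs):
        bus = y                      # one source drives the bus per slot
        acc += bus * weights[slot]   # multiply by digital weight, accumulate
    return acc                       # sum of products over all slots
```

One physical line thus replaces the point-to-point wiring of a complete connection, at the cost of serializing the transfers in time.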
REFERENCES:
patent: 4660166 (1987-04-01), Hopfield
patent: 4906865 (1990-03-01), Holler
patent: 4947482 (1990-08-01), Brown
patent: 4974169 (1990-11-01), Engel
Holler et al., "An Electrically Trainable Artificial Neural Network (ETANN) with 10240 Floating Gate Synapses", Proc. Int. Ann. Conf. on Neural Networks, 1989, pp. II-191-196 (Jun. 18-22 1989).
Eberhardt et al., "Design of Parallel Hardware Neural Network Systems from Custom Analog VLSI `Building` Chips", Int. Joint Conf. on Neural Networks, vol. 2, pp. 183-190 (Jun. 18-22, 1989).
Schwartz, "A Neural Chips Survey", AI Expert, Dec. 1990, pp. 34-39.
Lippmann, "An Introduction to Computing with Neural Nets", IEEE ASSP Magazine, Apr. 1987, pp. 41-47.
Houslander et al., "Time-Multiplexed Analogue Circuit for Implementing Artificial Neural Networks", Electronics Letters, Nov. 10, 1988, pp. 1413-1414, vol. 24, No. 23.
Yasunaga et al., "A Wafer-Scale Integration Neural Network Utilizing Completely Digital Circuits", Proc. Int. Joint Conf. on Neural Networks, vol. 2, pp. 213-217, Jun. 18-22, 1989.
Hansen, "A Time-Multiplexed Switched Capacitor Circuit for Neural Network Applications", IEEE Internat. Symposium on Circuits & Systems, 1989, vol. 3, pp. 2177-2180 May 8-11, 1989.
Asakawa Kazuo
Endo Hideichi
Iciki Hiroki
Ishikawa Katsuya
Iwamoto Hiromu
Fujitsu Ltd.
MacDonald Allen R.