Patent
1994-02-18
1996-10-08
Davis, George B.
395/22, 395/20, G06F 15/18
Patent
active
055641156
DESCRIPTION:
BRIEF SUMMARY
BACKGROUND OF THE INVENTION
This invention relates to an architecture for use in constructing artificial neural networks. Such networks comprise a plurality of artificial neuron-like devices, hereinafter referred to simply as "neurons". The invention is particularly intended for use with a particular type of neuron known as a pRAM, and by way of introduction a brief discussion is given below of the construction and operation of a pRAM. However, it must be understood that the invention is of general application to the architecture of neural networks and is not restricted to those in which the neurons are pRAMs.
One of the known ways of realising a neuron in practice is to use a random access memory (RAM). The use of RAMs for this purpose dates back a considerable number of years. It has been suggested that if one were able to construct a RAM in which a given output, say a `1`, was produced by a given storage location with a probability between 0 and 1 (rather than with a probability of either 0 or 1 as in a conventional RAM), such a RAM would have a potential for constructing neural networks which mimicked more closely than hitherto the behaviour of physiological networks. (See Gorse, D., and Taylor, J. G., 1988, Phys. Lett. A, 131, 326-332; Gorse, D., and Taylor, J. G., 1989, Physica D, 34, 90-114.) The term "pRAM", an abbreviation for "probabilistic RAM", is used there and herein for a RAM in which a given output is produced with a given probability between 0 and 1 when a particular storage location in the RAM is addressed, rather than with a probability of either 0 or 1 as in a conventional RAM.
In our copending International Patent Applications Nos. WO92/00572 and WO92/00573, and in a paper entitled "Hardware realisable models of neural processing", published in Proceedings of the First IEE International Conference on Artificial Neural Networks, 1989, pp. 242-246, there is a description of how a pRAM may be constructed. There is described a device for use in a neural processing network, comprising a memory having a plurality of storage locations at each of which a number representing a probability is stored; means for selectively addressing each of the storage locations to cause the contents of the location to be read to an input of a comparator; a noise generator for inputting to the comparator a random number representing noise; and means for causing to appear at an output of the comparator an output signal having a first or second value depending on the values of the numbers received from the addressed storage location and the noise generator, the probability of the output signal having a given one of the first and second values being determined by the number at the addressed location.
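The behaviour just described can be captured in a few lines of code. The following Python sketch is an illustrative software model only, not the patented hardware: the class and variable names (PRAM, fire, alpha) are assumptions introduced for this example. Each storage location holds a probability, the binary inputs form the address, and a comparison against a freshly generated random number plays the combined role of the noise generator and the comparator.

import random

class PRAM:
    """Illustrative model of a pRAM neuron: each of the 2**n storage
    locations holds a number in [0, 1] giving the probability that a 1
    is emitted when that location is addressed."""

    def __init__(self, n_inputs, rng=None):
        self.n_inputs = n_inputs
        self.rng = rng if rng is not None else random.Random()
        # One probability per addressable storage location.
        self.memory = [0.5] * (2 ** n_inputs)

    def fire(self, inputs):
        # Form the address from the binary input vector.
        address = 0
        for bit in inputs:
            address = (address << 1) | (1 if bit else 0)
        alpha = self.memory[address]      # contents of the addressed location
        noise = self.rng.random()         # random number from the noise generator
        # The comparator: output 1 with probability alpha, otherwise 0.
        return 1 if noise < alpha else 0

# Example: a 2-input pRAM whose location for the input pattern (1, 0)
# holds 0.8, so roughly 80% of addressing events at that location emit a 1.
p = PRAM(2)
p.memory[0b10] = 0.8
rate = sum(p.fire((1, 0)) for _ in range(1000)) / 1000.0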
One way in which a pRAM may be constructed is using a VLSI chip. However, such chips are relatively expensive, and it is presently impractical to fit more than one pRAM, or at most a few pRAMs, on to a single chip, given the substantial chip area which is required for the memory storage of each pRAM, its random number generator and the comparator. Neural networks of practical interest generally comprise a large number of neurons, so that using this approach a large number of VLSI chips would be required, with consequent high cost. The problem is accentuated when the neurons are provided with a learning capability, since that further increases the size of the neuron.
OBJECT AND SUMMARY OF THE INVENTION
It is an object of the present invention to provide an architecture which makes it possible to construct a neural network involving a substantial number of neurons, using VLSI chips, but in such a way as to reduce the cost significantly. The architecture of the present invention also has the potential for a high degree of flexibility in the connectivity of the neurons.
According to the present invention there is provided a neural network unit having a plurality of neurons, which network comprises a memory providing a plurality of storage locations for each of the neurons, and, in an integrated circuit, means for defining
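One way to picture such a unit, under the assumption (suggested by the title) that the means for defining the connectivity takes the form of connection pointers held on the integrated circuit, is sketched below in Python. Every name in the sketch (SharedMemoryNetwork, connection_pointers, step) is invented for this illustration and is not taken from the patent; it shows only how a single shared memory block and a pointer table could together serve many neurons.

import random

class SharedMemoryNetwork:
    """Illustrative sketch: one memory array supplies the storage locations
    of every neuron, and a table of connection pointers records which
    neurons' outputs form each neuron's address inputs."""

    def __init__(self, n_neurons, n_inputs, rng=None):
        self.n_inputs = n_inputs
        self.rng = rng if rng is not None else random.Random()
        # Single shared memory block: 2**n_inputs probabilities per neuron.
        self.memory = [0.5] * (n_neurons * (2 ** n_inputs))
        # connection_pointers[i] lists the neurons whose outputs address neuron i.
        self.connection_pointers = [
            [self.rng.randrange(n_neurons) for _ in range(n_inputs)]
            for _ in range(n_neurons)
        ]
        self.outputs = [0] * n_neurons

    def step(self):
        # Update every neuron once, gathering its inputs via the pointers.
        new_outputs = []
        for i, sources in enumerate(self.connection_pointers):
            address = 0
            for src in sources:
                address = (address << 1) | self.outputs[src]
            base = i * (2 ** self.n_inputs)   # neuron i's slice of the shared memory
            alpha = self.memory[base + address]
            new_outputs.append(1 if self.rng.random() < alpha else 0)
        self.outputs = new_outputs
        return self.outputs

# Example: eight 2-input neurons sharing one memory block; re-pointing an
# entry in connection_pointers rewires the network without touching the memory.
net = SharedMemoryNetwork(n_neurons=8, n_inputs=2)
for _ in range(10):
    net.step()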
REFERENCES:
patent: 4989256 (1991-01-01), Buckley
patent: 5063521 (1991-11-01), Peterson et al.
patent: 5151971 (1992-09-01), Jousselin et al.
patent: 5165009 (1992-11-01), Watanabe et al.
patent: 5167009 (1992-11-01), Skeirik
patent: 5175798 (1992-12-01), Taylor et al.
patent: 5197114 (1993-03-01), Skeirik
patent: 5293459 (1994-03-01), Duranton et al.
Yasunaga et al, "A Wafer Scale Integration Neural Network Utilizing Completely Digital Circuits", IJCNN, IEEE 1989.
Venta et al, "A Content-Addressing Software Method for the Emulation of Neural Networks", ICNN, IEEE 1988.
Wittie et al, "Micronet: A Reconfigurable Microcomputer Network for Distributed Systems Research", Nov. 1978.
Garth et al, "A Chip Set for High Speed Simulation of Neural Network Systems", IEEE 1st Int Conf. on Neural Networks, Jun. 1987.
Ghosh et al, "Critical Issues in Mapping Neural Networks on Message-Passing Multicomputers", IEEE, The 15th Annual Inter. Symposium on Computer Architecture, May-Jun. 1988.
Wike et al, "The VLSI Implementation of STONN", IEEE, IJCNN, Jun. 1990.
Clarkson et al, "pRAM Automata", IEEE Inter. Workshop on Cellular Neural Networks and Their Applications, 16-19 Dec. 1990.
Clarkson et al, "Hardware Realisable Models of Neural Processing", IEE Artificial Neural Networks, 1989.
Chambers et al, "Hardware Realisable Models of Neural Processing" IEEE Inter. Conf. on Neural Networks, 1989.
Davis George B.
King's College London
University College London