Device for designing a neural network and neural network
Patent
1997-12-08
2000-05-02
Hafiz, Tariq R.
Data processing: artificial intelligence
Neural network
Structure
706/25, 706/28, 706/31, 706/41, G06F 15/18
Patent
active
06058386
DESCRIPTION:
BRIEF SUMMARY
FIELD OF THE INVENTION
The present invention is directed to a device for designing a neural network, as well as to a neural network that can be produced in accordance with the device.
BACKGROUND INFORMATION
A customary method of presetting the parameters of neural networks is initialization with random numbers followed by optimization. The disadvantage of this method is that the resulting neural networks are not reproducible and do not always produce meaningful results when the data distribution is substantially uneven. Because of the random initialization, repeated calculations of the network parameters can lead to a different parameter set each time, even with identical training data. This makes the results difficult to compare with one another: when the parameters change, it cannot be determined from the non-reproducible results whether the changes are caused solely by changed training data.
PCT Published Application No. WO 94/06095 describes a device for designing a neural network that produces reproducible results for training data records. In that device, the parameters of the neural network are calculated by solving a linear system of equations. The design process can be roughly divided into two steps: first, auxiliary quantities distributed uniformly over the domains of the input signals define the parameters of the neurons in the intermediate layer; the parameters of the output neurons are then each determined by solving a system of equations. When a function with a characteristic bell-shaped curve is used for the non-linear elements of the intermediate-layer neurons, in order to ensure their local effectiveness, this known device has the drawback that the number of neurons required in the intermediate layer depends on the number of auxiliary quantities, so that a large number of neurons is needed to obtain a neural network with satisfactory interpolation properties.
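By way of illustration, a minimal sketch of the grid-based approach described above might look as follows, assuming Gaussian bell functions for the intermediate neurons and an ordinary least-squares solve for the output weights; design_grid_rbf, centers_per_dim, and width are hypothetical names chosen for the example.

```python
import numpy as np

def design_grid_rbf(x_train, y_train, centers_per_dim=5, width=0.5):
    """Sketch of a grid-based design in the style described above:
    equally spaced auxiliary points over the input domains define
    bell-shaped (here Gaussian) intermediate neurons; the output-neuron
    weights are then obtained by solving a linear least-squares system.
    Names and parameter choices are illustrative assumptions."""
    n_samples, n_inputs = x_train.shape

    # Equally distributed auxiliary quantities: a regular grid over each
    # input domain.
    axes = [np.linspace(x_train[:, d].min(), x_train[:, d].max(), centers_per_dim)
            for d in range(n_inputs)]
    centers = np.array(np.meshgrid(*axes)).reshape(n_inputs, -1).T

    # Intermediate-layer activations: bell-shaped (Gaussian) responses,
    # so each neuron is only locally effective.
    dist2 = ((x_train[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    hidden = np.exp(-dist2 / (2.0 * width ** 2))

    # Output-neuron parameters from a linear system (least squares).
    weights, *_ = np.linalg.lstsq(hidden, y_train, rcond=None)
    return centers, width, weights
```

Because the auxiliary points form a full grid, the intermediate layer contains centers_per_dim ** n_inputs neurons, so the layer grows rapidly with the number of inputs and auxiliary quantities; this is the drawback referred to above.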
SUMMARY OF THE INVENTION
An object of the present invention is to provide a device for designing a neural network that, even with a smaller number of neurons in the intermediate layer, will produce a neural network having satisfactory interpolation properties.
The neural network design device of the present invention is capable of producing a neural network whose number of neurons in the intermediate layer is optimally adapted to the distribution of the training data. The network can be a three-layer network, i.e., a neural network having an input layer, an intermediate layer, and an output layer. The device can be used for any desired number of input and output neurons.
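For concreteness, a minimal sketch of such a three-layer network is given below, assuming Gaussian intermediate neurons and linear output neurons; these choices, and the function name forward, are illustrative assumptions rather than a statement of the patented structure.

```python
import numpy as np

def forward(x, centers, width, weights):
    """Forward pass of a three-layer network of the kind described above:
    input layer, locally effective intermediate neurons (here Gaussian,
    an assumption), and linear output neurons.  Shapes: x is
    (n_samples, n_inputs), centers is (n_hidden, n_inputs), weights is
    (n_hidden, n_outputs)."""
    dist2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    hidden = np.exp(-dist2 / (2.0 * width ** 2))  # intermediate layer
    return hidden @ weights                        # linear output layer
```

Because the centre matrix has one column per input signal and the weight matrix one column per output signal, the same structure serves any desired number of input and output neurons.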
The device first subdivides the domains of the input signals, independently of one another, into subdomains. The subdivision is carried out on the basis of the frequency distribution of the training data, which can also be regarded as supporting values for the interpolation. The subdomains can be selected to be of equal size for each input signal and should not overlap one another; it is also possible, however, to use a finer subdivision in sections where the training data occur more frequently. With n input signals, the combinations of the individual subdomains yield n-dimensional partial spaces of the domains.

The training data are allocated to these partial spaces, and the partial spaces are then sorted by the number of training data they contain. On the basis of the training data contained in the individual partial spaces, the user can assess the quality of the selected subdivision; one criterion, for example, is how far apart the output signal values of training data from the same partial space lie. The partial spaces containing the most training data are selected for further processing, and each selected partial space is represented by a neuron in the intermediate layer of the neural network. The number of neurons in the intermediate layer is thus set by the selection of the partial spaces, and each selected partial space contains at least one training data record.
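As a hedged sketch, the design steps just described can be expressed in code roughly as follows. The subdivision granularity (bins_per_input), the number of selected partial spaces (n_neurons), and the choice of the mean of each partial space's training data as the neuron centre are assumptions made only for illustration; the function name select_partial_spaces is hypothetical.

```python
import numpy as np

def select_partial_spaces(x_train, bins_per_input=4, n_neurons=10):
    """Sketch of the design step described above, under assumptions:
    each input domain is split independently into equally sized,
    non-overlapping subdomains; their combinations form n-dimensional
    partial spaces; training data are allocated to the partial spaces,
    which are sorted by occupancy; the most populated ones are selected,
    one intermediate-layer neuron per selected partial space."""
    n_samples, n_inputs = x_train.shape

    # Subdivide every input domain into equal, non-overlapping subdomains.
    edges = [np.linspace(x_train[:, d].min(), x_train[:, d].max(), bins_per_input + 1)
             for d in range(n_inputs)]

    # Allocate each training record to the n-dimensional partial space
    # given by the combination of its subdomain indices.
    idx = np.stack([np.digitize(x_train[:, d], edges[d][1:-1])
                    for d in range(n_inputs)], axis=1)
    spaces = {}
    for sample, key in enumerate(map(tuple, idx)):
        spaces.setdefault(key, []).append(sample)

    # Sort the partial spaces by the number of training data they contain
    # and keep the most populated ones: one hidden neuron per selected space.
    ranked = sorted(spaces.items(), key=lambda kv: len(kv[1]), reverse=True)
    selected = ranked[:n_neurons]

    # A plausible neuron parameterization (assumption): centre each neuron
    # on the mean of the training data inside its partial space.
    centers = np.array([x_train[samples].mean(axis=0) for _, samples in selected])
    return centers, selected
```

For example, with two input signals and bins_per_input=4 there are at most 16 candidate partial spaces; keeping only the most populated ones bounds the size of the intermediate layer regardless of how finely the input domains were subdivided.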
REFERENCES:
patent: 5201026 (1993-04-01), Tsuiki
patent: 5220618 (1993-06-01), Sirat et al.
patent: 5317675 (1994-05-01), Ikehara
patent: 5479574 (1995-12-01), Glier et al.
patent: 5701398 (1997-12-01), Glier et al.
patent: 5742741 (1998-04-01), Chiueh et al.
patent: 5751913 (1998-05-01), Chiueh et al.
patent: 5950181 (1999-09-01), Federl
Roan, Sing-Ming, "Fuzzy RCE Neural Network," Second IEEE International Conference on Fuzzy Systems, vol. 1 of 2, New York, NY, Mar. 28, 1993, pp. 629-634.
Shyu, Haw-Jye, "Classifying Seismic Signals via RCE Neural Network," International Joint Conference on Neural Networks, IEEE, Piscataway, NJ, Jun. 17, 1990, pp. 101-105.
Casasent, D., et al., "Adaptive Clustering Neural Net for Piecewise Nonlinear Discriminant Surfaces," International Joint Conference on Neural Networks, Jun. 1990, pp. 423-428.
Hafiz Tariq R.
Siemens Aktiengesellschaft
Starks, Jr. Wilbert L.