Type: Patent
Filed: 1991-06-17
Issued: 1993-05-25
Examiner: Fleming, Michael R.
Classification: U.S. Cl. 395/11, 395/21; Int. Cl. G06F 15/18
Status: active
Document ID: 052147460
ABSTRACT:
A method and apparatus for training neural networks using evolutionary programming. A network is adjusted to operate in a weighted configuration defined by a set of weight values and a plurality of training patterns are input to the network to generate evaluations of the training patterns as network outputs. Each evaluation is compared to a desired output to obtain a corresponding error. From all of the errors, an overall error value corresponding to the set of weight values is determined. The above steps are repeated with different weighted configurations to obtain a plurality of overall error values. Then, for each set of weight values, a score is determined by selecting error comparison values from a predetermined variable probability distribution and comparing them to the corresponding overall error value. A predetermined number of the sets of weight values determined to have the best scores are selected and copies are made. The copies are mutated by adding random numbers to their weights and the above steps are repeated with the best sets and the mutated copies defining the weighted configurations. This procedure is repeated until the overall error values diminish to below an acceptable threshold. The random numbers added to the weight values of copies are obtained from a continuous random distribution of numbers having zero mean and variance determined such that it would be expected to converge to zero as the different sets of weight values in successive iterations converge toward sets of weight values yielding the desired neural network performance.
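The training loop described in the abstract can be sketched in code. The following is a minimal illustration, not the patented implementation: it assumes a tiny fixed-topology feedforward network, uses sum of squared errors as the overall error, scores each weight set by tournament against randomly drawn comparison errors, keeps the best-scoring half, and mutates copies with zero-mean Gaussian noise whose variance is tied to the parent's error (so mutation steps shrink as the population converges). The function names, network size, and variance scaling are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(weights, X):
    # Tiny fixed topology: 2 inputs -> 2 tanh hidden units -> 1 linear
    # output, 9 weights in total (illustrative assumption).
    W1, b1 = weights[:4].reshape(2, 2), weights[4:6]
    W2, b2 = weights[6:8], weights[8]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def overall_error(weights, X, y):
    # Overall error for one weighted configuration: sum of squared
    # differences between network outputs and desired outputs.
    return float(np.sum((forward(weights, X) - y) ** 2))

def evolve(X, y, pop_size=20, n_weights=9, opponents=10,
           generations=200, threshold=1e-2):
    # Population of candidate weight sets ("weighted configurations").
    pop = rng.normal(0.0, 1.0, size=(pop_size, n_weights))
    history = []
    for _ in range(generations):
        errors = np.array([overall_error(w, X, y) for w in pop])
        history.append(errors.min())
        if errors.min() < threshold:
            break  # error has diminished below the acceptable threshold
        # Score each weight set by tournament: it earns a win each time
        # its error beats a randomly drawn comparison error.
        scores = np.array([
            np.sum(errors[i] <= errors[rng.integers(0, pop_size, opponents)])
            for i in range(pop_size)
        ])
        # Keep the best-scoring half (ties broken by lower error).
        order = np.lexsort((errors, -scores))
        parents = pop[order[: pop_size // 2]]
        parent_err = errors[order[: pop_size // 2]]
        # Mutate copies with zero-mean Gaussian noise; the variance is
        # tied to the parent's error (one plausible scaling), so steps
        # are expected to shrink as the population converges.
        std = np.sqrt(parent_err / len(X))[:, None]
        children = parents + std * rng.normal(size=parents.shape)
        pop = np.concatenate([parents, children])
    errors = np.array([overall_error(w, X, y) for w in pop])
    return pop[np.argmin(errors)], history
```

Because the lowest-error weight set always wins every tournament it enters, it survives each generation, so the best error in `history` is non-increasing; the exact number of comparison values and the error-to-variance scaling are design parameters in the patent, and the choices above are only one instantiation.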
REFERENCES:
patent: 4912649 (1990-03-01), Wood
patent: 4912651 (1990-03-01), Wood et al.
patent: 4912652 (1990-03-01), Wood
patent: 4912654 (1990-03-01), Wood
patent: 4912655 (1990-03-01), Wood
patent: 4918618 (1990-04-01), Tomlinson, Jr.
patent: 4933871 (1990-06-01), DeSieno
patent: 5140530 (1992-08-01), Guha et al.
patent: 5150450 (1992-09-01), Swenson et al.
Lawrence J. Fogel, et al., "Artificial Intelligence Through Simulated Evolution", John Wiley & Sons, Inc., NY, N.Y., 1966.
David B. Fogel, "An Evolutionary Approach to the Traveling Salesman Problem", Biol. Cybern., 60, pp. 139-144 (1988).
A. N. Kolmogorov, "On the Representation of Continuous Functions of Many Variables by Superposition of Continuous Functions of One Variable and Addition", Dokl. Akad. Nauk SSSR, vol. 114, pp. 953-956 (1957).
Richard P. Lippmann, "An Introduction to Computing with Neural Nets", IEEE ASSP Magazine, Apr. 1987, pp. 4-22.
David J. Montana et al., "Training Feedforward Neural Networks Using Genetic Algorithms", Eleventh International Joint Conference on Artificial Intelligence, 1989, pp. 762-767.
D. E. Rumelhart et al., "Parallel Distributed Processing: Explorations in the Microstructure of Cognition", vol. 1, Foundations, MIT Press, Cambridge, Mass., 1986, pp. 322-328, 423-453, 472-487.
Whitley et al., "Optimizing Neural Networks Using Faster, More Accurate Genetic Search", ICGA '89, Jun. 1989, pp. 391-396.
White, H., "Neural Network Learning and Statistics", AI Expert, Dec. 1989, pp. 48-52.
Heistermann, J., "Learning in Neural Nets by Genetic Algorithms", Parallel Processing in Neural Systems and Computers, 1990, pp. 165-168.
Caudill, M., "Evolutionary Neural Networks", AI Expert, Mar. 1991, pp. 28-33.
Inventors: Fogel, David B.; Fogel, Lawrence J.
Downs, Robert W.
Examiner: Fleming, Michael R.
Assignee: Orincon Corporation
TITLE: Method and apparatus for training a neural network using evolutionary programming