Method and system for pattern analysis using a coarse-coded neural network

Image analysis – Histogram processing – For setting a threshold


Details

382/27; G06K 9/46

Patent

active

053332105

ABSTRACT:
A method and system for performing pattern analysis with a neural network coarse-code a pattern to be analyzed so as to form a plurality of sub-patterns collectively defined by data. Each of the sub-patterns comprises sets of sub-pattern data. The neural network includes a plurality of fields, each field being associated with one of the sub-patterns so as to receive the sub-pattern data therefrom. Training and testing by the neural network then proceeds in the usual way, with one modification: the transfer function thresholds the value obtained from summing the weighted products of each field over all sub-patterns associated with each pattern being analyzed by the system.
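To make the mechanism concrete, the sketch below (Python, not taken from the patent) coarse-codes a binary image into offset-grid sub-patterns, gives each sub-pattern its own field of weighted pairwise products, sums the field responses over all sub-patterns, and hard-thresholds the total. The 2x2 offset grid, the use of second-order (pairwise) rather than third-order products, and the randomly initialized per-field weights are illustrative assumptions, not details from the patent text.

import numpy as np

def coarse_code(pattern, factor=2):
    """Split a 2-D binary pattern into factor**2 coarse sub-patterns by
    sampling the image on offset grids (one field per offset)."""
    return [pattern[dy::factor, dx::factor]
            for dy in range(factor) for dx in range(factor)]

def field_response(sub_pattern, weights):
    """One field: sum of weighted second-order products x_i * x_j
    over all pairs of units in the sub-pattern."""
    x = sub_pattern.ravel().astype(float)
    return x @ weights @ x

def classify(pattern, field_weights, threshold=0.0):
    """Sum the field responses over every sub-pattern of the coarse code,
    then apply a hard-limiting transfer function to the total."""
    subs = coarse_code(pattern, factor=2)
    total = sum(field_response(s, w) for s, w in zip(subs, field_weights))
    return 1 if total > threshold else 0

# Toy usage: a random 8x8 binary pattern and one weight matrix per field.
rng = np.random.default_rng(0)
pattern = rng.integers(0, 2, size=(8, 8))
n_units = (8 // 2) ** 2                      # units per coarse sub-pattern
field_weights = [rng.standard_normal((n_units, n_units)) for _ in range(4)]
print(classify(pattern, field_weights))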

REFERENCES:
patent: 4802103 (1989-01-01), Faggin et al.
patent: 4803736 (1989-02-01), Grossberg et al.
patent: 5151951 (1992-09-01), Ueda et al.
Li et al., "Invariant Object Recognition Based on a Neural Network of Cascaded RCE Nets", 1990, vol. 2, pp. 845-854.
Rosen et al., "Adaptive Coarse-Coding for Neural Net Controllers", 1991, vol. 1, pp. 493-499.
Reid et al., "simultaneous position, scale, and rotation invariant pattern classification using third-order neural networks", Int. J. of Neural Networks, 1, 1989, pp. 154-159.
Reid et al., "Rapid Training of Higher-Order Neural Networks for Invariant Pattern Recognition", Proceedings of Joint Int. Conf. on Neural Networks, Washington, D.C. Jun. 18-22, 1989, vol. 1, pp. 689-692.
Spirkovska et al., "Connectivity Strategies for Higher-Order Neural Networks Applied to Pattern Recognition", Int. Joint Conf. on Neural Networks, San Diego, Calif., Jun. 17-21, 1990, vol. I, pp. 21-26.
Rosenfeld et al, "A Survey of Coarse-Coded Symbol Memories", Proc. of the 1988 Connectionist Models Summer School, Carnegie-Mellon Univ., Jun. 17-26, 1988, pp. 256-264.
Giles et al., "Encoding Geometric Invariances in Higher-Order Neural Networks", Neural Information Processing Systems, American Institute of Physics Conference Proceedings, 1988, pp. 301-309.
Giles et al., "Learning, Invariance, and Generalization in High-Order Neural Networks", Applied Optics, 1987, vol. 26, pp. 4972-4978.
Specht, "Probabilistic Neural Networks and the Polynomial Adaline as Complementary Techniques for Classification", IEEE Transactions on Neural Networks, vol. 1, No. 1, pp. 111-121, Mar. 1990.
Lapedes et al., "Programming a Massively Parallel, Computation Universal System: Static Behavior", American Institute of Physics, pp. 283-298, Mar. 1986.
Lippmann, "An Introduction to Computing with Neural Nets", IEEE ASSP Magazine, pp. 4-22, Apr. 1987.
Yager, "On the Aggregation of Processing Units in Neural Networks", Machine Intelligence Institute, Iona College, pp. II-327-II-333.
Fukaya et al., "Two-Level Neural Networks: Learning by Interaction with Environment", IEEE First Int. Conf. on Neural Networks, Jun. 21, 1987.
Lippmann, "Pattern Classification Using Neural Networks", IEEE Communications Magazine, pp. 47-56,Nov. 1989.
Nielson, "Neurocomputing Applications: Sensor Processing, Control, and Data Analysis", Neurocomputing, Addison-Wesley, 1990.
