Rapid category learning and recognition system

Image analysis – Histogram processing – For setting a threshold

Patent


Details

382/10, 395/23, 395/25, G06K 9/62, G06K 9/00

Type: Patent

Status: active

Patent number: 051577380

ABSTRACT:
An improved ART2 network provides fast and intermediate learning. The network combines analog and binary coding functions. The analog portion encodes the recent past while the binary portion retains the distant past. LTM weights that fall below a threshold remain below threshold at all future times. The suprathreshold LTM weights track a time average of recent input patterns. LTM weight adjustment (update) provides fast commitment and slow recoding. The network incorporates these coding features while achieving an increase in computational efficiency of two to three orders of magnitude over prior analog ART systems.
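The weight dynamics the abstract describes can be sketched in a few lines. The following is an illustrative reconstruction, not the patent's actual procedure: the function name, the learning rate `beta`, and the threshold `theta` are assumptions chosen to mirror the described behavior (fast commitment, slow recoding, and permanent suppression of subthreshold LTM weights).

```python
import numpy as np

def update_ltm(z, I, beta=0.1, theta=0.05, committed=True):
    """Sketch of a threshold-gated LTM weight update (assumed form).

    z:     current LTM weight vector of the winning category (unit norm)
    I:     normalized input pattern
    beta:  learning rate; a small value gives slow recoding
    theta: threshold below which a weight is clamped to zero for good
    """
    if not committed:
        # Fast commitment: a newly committed category copies the input.
        z = I.copy()
    else:
        # Slow recoding: suprathreshold weights track a time average
        # of recent input patterns (the "analog" portion).
        z = (1.0 - beta) * z + beta * I
    # Binary portion: weights that fall below theta are zeroed, and
    # because the update can never raise a zero weight above theta in
    # one small step before this clamp, they stay at zero thereafter.
    z[z < theta] = 0.0
    # Renormalize the surviving analog portion.
    norm = np.linalg.norm(z)
    return z / norm if norm > 0 else z
```

With `beta = 1` on the first (commitment) step and a small `beta` afterwards, the same rule covers both the fast-commitment and slow-recoding regimes mentioned in the abstract.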

REFERENCES:
patent: 4914708 (1990-04-01), Carpenter et al.
patent: 4941122 (1990-07-01), Weideman
patent: 5054093 (1991-10-01), Cooper et al.
"The Art of Adaptive Pattern Recognition by a Self-Organizing Neural Network", by Gail A. Carpenter and Stephen Grossberg, in Computer, Mar. 1988, pp. 77-88.
"ART 2: Self-Organization of Stable Category Recognition Codes for Analog Input Patterns", by Gail A. Carpenter and Stephen Grossberg, in Applied Optics, vol. 26, No. 23, Dec. 1987, pp. 4919-4930.


Profile ID: LFUS-PAI-O-199403
