Pattern recognizer with independent feature learning

Image analysis – Learning systems – Trainable classifiers or pattern recognizers

Patent


Details

382/157; 706/16; 706/25; 706/30; 706/20; G06K 9/62

Patent

active

060582065

ABSTRACT:
A pattern recognition device having modifiable feature detectors (28) which respond to a transduced input signal (26) and communicate a feature activity signal (30) to allow classification and an appropriate output action (70). A memory (40) stores a set of comparison patterns, and is used by an assigner (66) to find likely features, or parts, in the current input signal (26). Each part is assigned to a feature detector (28[m]) judged to be responsible for it. An updater (42) modifies each responsible feature detector (28[m]) so as to make its preferred feature more similar to its assigned part. The modification embodies a strong constraint on the feature learning process, in particular an assumption that the ideal features for describing the pattern domain occur independently. This constraint allows improved learning speed and potentially improved scaling properties.
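The assign-and-update cycle described above can be sketched in a few lines. This is an illustrative assumption, not the patent's actual implementation: names such as `assign_and_update`, `n_features`, and the dot-product responsibility measure are hypothetical, chosen only to show the flow from parts to responsible detectors to feature updates.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, dim = 4, 8
detectors = rng.random((n_features, dim))  # preferred features (28)
learning_rate = 0.1

def assign_and_update(parts, detectors, lr=learning_rate):
    """Assign each part to the detector judged most responsible for it
    (the assigner, 66), then move that detector's preferred feature
    toward the assigned part (the updater, 42)."""
    for part in parts:
        # Responsibility: similarity between this part and each
        # detector's preferred feature (dot product is an assumption).
        resp = detectors @ part
        m = int(np.argmax(resp))                    # responsible detector 28[m]
        detectors[m] += lr * (part - detectors[m])  # feature becomes more like part
    return detectors

parts = [rng.random(dim) for _ in range(3)]  # parts found in the input (26)
detectors = assign_and_update(parts, detectors)
```

Because each part updates only the one detector assigned to it, the detectors are pushed to specialize on distinct, independently occurring features, which is the independence constraint the abstract credits with the improved learning speed.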
A first preferred embodiment uses a group of noisy-OR type neural networks (50) to implement the feature detectors (28) and memory (40), and to obtain the parts by a soft segmentation of the current input signal (26). A second preferred embodiment maintains a lossless memory (40) separate from the feature detectors (28), and the parts consist of differences between the current input signal (26) and comparison patterns stored in the memory (40).
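The second embodiment's difference-based parts can also be sketched. The helper names (`store`, `parts_from_differences`) and the nearest-neighbor retrieval are assumptions for illustration; the patent specifies only that parts consist of differences between the current input and comparison patterns held in a lossless memory.

```python
import numpy as np

memory = []  # lossless memory (40) of comparison patterns

def store(pattern):
    """Store a comparison pattern without loss."""
    memory.append(np.asarray(pattern, dtype=float))

def parts_from_differences(current):
    """Return candidate parts of the current input (26) as differences
    from the nearest stored comparison pattern (nearest-neighbor
    retrieval is an assumed detail)."""
    current = np.asarray(current, dtype=float)
    if not memory:
        return [current]  # nothing to compare against yet
    dists = [np.linalg.norm(current - p) for p in memory]
    nearest = memory[int(np.argmin(dists))]
    return [current - nearest]  # part = what differs from memory

store([1.0, 0.0, 0.0, 0.0])
part, = parts_from_differences([1.0, 0.0, 1.0, 0.0])
```

Here the difference isolates exactly the component of the input not explained by memory, which is then available for assignment to a feature detector as in the first embodiment.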

REFERENCES:
patent: 5060278 (1991-10-01), Fukumizu
patent: 5251268 (1993-10-01), Colley et al.
patent: 5359700 (1994-10-01), Seligson
patent: 5422981 (1995-06-01), Niki
patent: 5568591 (1996-10-01), Minot et al.
patent: 5754681 (1998-05-01), Watanabe et al.
patent: 5812992 (1998-09-01), Vries
patent: 5822742 (1998-10-01), Alkon et al.
patent: 5835633 (1998-11-01), Fujisaki et al.
patent: 5870493 (1999-02-01), Vogl et al.
patent: 5870828 (1999-02-01), Yatsuzuka et al.
Oct. 1990 Foldiak "Forming Sparse Representations by Local Anti-Hebbian Learning", Biological Cybernetics.
Dec. 18, 1991 Schmidhuber "Learning Factorial Codes by Predictability Minimization", Univ. of Colorado Dept. of Computer Sci. TR-CU-CS-565-91.
Jul. 1996 Jaakkola & Jordan "Computing Upper and Lower Bounds on Likelihoods in Intractable Networks", in Proceedings of the Twelfth Conference on Uncertainty in AI.
Oct. 1992 Neal "Connectionist Learning of Belief Networks", Artificial Intelligence 56, pp. 71-113.
Dec. 1996 Lewicki & Sejnowski "Bayesian Unsupervised Learning of Higher Order Structure", Advances in Neural Information Processing Systems 9 (Proceedings of the 1996 Conference, Dec. 2-5).
May 1986 Rumelhart et al. "Learning Internal Representations by Error Propagation", Parallel Distributed Processing vol. 1, MIT Press, Cambridge, MA.
Nov. 1994 Hastie et al. "Learning Prototype Models for Tangent Distance", Advances in Neural Information Processing Systems 7 (Proceedings of the 1994 Conference, Nov. 28-Dec. 1).
Aug. 1990 Kortge, "Episodic Memory in Connectionist Networks", Proceedings of the Twelfth Annual Conference of the Cognitive Science Society, Lawrence Erlbaum Associates, Hillsdale, NJ.

Profile ID: LFUS-PAI-O-1600247
