Signature recognition apparatus which can be trained with a reduced amount of sample data

Image analysis – Applications – Personnel identification

Details

382/157, G06K 9/62

Patent

active

055531563

ABSTRACT:
A signature recognition apparatus reduces the volume of training data needed and shortens the learning period. In the apparatus, a sample generating section generates sample data, and a coupling load coefficient is determined from that sample data, obviating the need for additional samples. The apparatus also uses a fuzzy net with a linear function in its output layer, which shortens the learning period compared with a net that implements a non-linear function such as a sigmoid.
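
The following Python sketch is illustrative only and is not taken from the patent; the feature representation, the Gaussian membership function, and all dimensions are assumptions. It shows the general idea the abstract describes: synthetic samples are generated by perturbing a reference (in the spirit of jittered training data), and because the output layer is linear, the coupling (load) coefficients can be found in one least-squares step rather than by iterative training of a sigmoid net.

# Illustrative sketch only (not the patented implementation). Assumes each
# signature is already reduced to a fixed-length feature vector.
import numpy as np

def jitter_samples(reference, n_copies=20, noise_scale=0.05, rng=None):
    """Generate synthetic training samples by slightly perturbing a
    reference feature vector (jitter-style sample generation)."""
    rng = np.random.default_rng() if rng is None else rng
    return reference + noise_scale * rng.standard_normal((n_copies, reference.size))

def gaussian_membership(x, centers, width=0.5):
    """Fuzzy membership of each sample in each reference class (hidden layer)."""
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Two hypothetical reference signatures, each a 4-dimensional feature vector.
rng = np.random.default_rng(0)
centers = rng.standard_normal((2, 4))

# Build a training set from jittered copies of each reference signature.
X = np.vstack([jitter_samples(c, rng=rng) for c in centers])
T = np.repeat(np.eye(2), 20, axis=0)            # one-hot class targets

# Because the output layer is linear, the coupling (load) coefficients W
# solve a least-squares problem directly -- no iterative sigmoid training.
H = gaussian_membership(X, centers)
W, *_ = np.linalg.lstsq(H, T, rcond=None)

# Classify a new, slightly distorted signature.
probe = centers[0] + 0.03 * rng.standard_normal(4)
scores = gaussian_membership(probe[None, :], centers) @ W
print("predicted writer:", scores.argmax())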

REFERENCES:
patent: 4979126 (1990-12-01), Pao et al.
patent: 5073867 (1991-12-01), Murphy et al.
patent: 5161204 (1992-11-01), Hutcheson et al.
patent: 5271090 (1993-12-01), Boser
patent: 5359699 (1994-10-01), Tong et al.
patent: 5390261 (1995-02-01), Huang et al.
Roan et al., "Fuzzy RCE Neural Network", Fuzzy Systems, International Conference 1993 (IEEE Jul. 1993), pp. 629-634.
Reed et al., "Regularization Using Jittered Training Data", Neural Networks, International Conference 1992 (IEEE), pp. 147-152.
