Data classification apparatus and method thereof

Data processing: artificial intelligence – Neural network – Learning task

Details

Class: C706S012000

Type: Reexamination Certificate

Status: active

Patent number: 07072873

ABSTRACT:
The data classification apparatus and method are adapted to high-dimensional classification problems and provide a universal measure of confidence that is valid under the i.i.d. assumption. The method assigns strangeness values to classification sets constructed from the classification training examples together with an unclassified example. The strangeness values, or the p-values derived from them, are compared to identify the classification set containing the most likely potential classification for the unclassified example. The measure of confidence is then computed from the strangeness value of the classification set containing the second most likely potential classification.
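
The confidence procedure summarised in the abstract follows the transductive confidence machine / conformal prediction framework of the cited Gammerman et al. (1998) and Proedrou et al. (2002) references. The Python sketch below is an illustration only, not the patented implementation: it assumes a nearest-neighbour strangeness measure, and the names knn_strangeness and classify_with_confidence are hypothetical. It shows how a p-value can be computed for each candidate classification and how confidence can be taken from the second most likely candidate.

import numpy as np

def knn_strangeness(X, y, k=1):
    # Strangeness (nonconformity) of each example: summed distance to its k
    # nearest neighbours with the same label divided by the summed distance to
    # its k nearest neighbours with a different label (the measure used in the
    # cited Proedrou et al., 2002 paper; assumed here only for illustration).
    n = len(X)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # an example is not its own neighbour
    alphas = np.empty(n)
    for i in range(n):
        same = np.sort(d[i, y == y[i]])[:k]
        diff = np.sort(d[i, y != y[i]])[:k]
        alphas[i] = same.sum() / (diff.sum() + 1e-12)
    return alphas

def classify_with_confidence(X_train, y_train, x_new, k=1):
    # For every candidate label, extend the training set with the new example
    # carrying that label, recompute strangeness values, and take the p-value
    # of the candidate as the fraction of examples at least as strange as the
    # new one. Prediction = label with the highest p-value; confidence is
    # derived from the runner-up p-value, as in the abstract.
    p_values = {}
    for label in np.unique(y_train):
        X_ext = np.vstack([X_train, x_new])
        y_ext = np.append(y_train, label)
        alphas = knn_strangeness(X_ext, y_ext, k)
        p_values[label] = float(np.mean(alphas >= alphas[-1]))
    ranked = sorted(p_values.items(), key=lambda kv: kv[1], reverse=True)
    prediction = ranked[0][0]
    confidence = 1.0 - ranked[1][1]      # second most likely classification
    return prediction, confidence, p_values

A minimal usage example: with X_train = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]]) and y_train = np.array([0, 0, 1, 1]), calling classify_with_confidence(X_train, y_train, np.array([0.1, 0.0])) returns label 0 with a high confidence, since the classification set for the competing label looks comparatively strange.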

REFERENCES:
patent: 5315313 (1994-05-01), Shinagawa
patent: 5361379 (1994-11-01), White
patent: 5479573 (1995-12-01), Keeler et al.
patent: 5577166 (1996-11-01), Mizuno
patent: 5608841 (1997-03-01), Tsuboka
patent: 5625748 (1997-04-01), McDonough et al.
patent: 5640492 (1997-06-01), Cortes et al.
patent: 5649068 (1997-07-01), Boser et al.
patent: 5846189 (1998-12-01), Pincus
patent: 6278970 (2001-08-01), Milner
patent: 0450825 (1991-10-01), None
patent: 2080072 (1982-01-01), None
patent: 2369899 (2002-06-01), None
G.A. Carpenter, W.D. Ross; “ART-EMAP: A neural network architecture for object recognition by evidence accumulation”; IEEE Transactions on Neural Networks; vol. 6, Iss. 4, Jul. 1995; pp. 805-818.
C.H. Wu, G.M. Whitson, C.-T. Hsiao, C.-F. Huang; “Classification Artificial Neural Systems for Genome Research”; Proceedings of the 1992 ACM/IEEE Conference on Supercomputing; Dec. 1992; pp. 797-803.
Proedrou et al; Transductive Confidence Machines for Pattern Recognition; ECML LNAI 2430; 2002; pp. 381-390.
Melluish et al; Comparing the Bayes and typicalness frameworks; Proceedings of ECML 01, Lecture Notes; 2001; pp. 1-13.
Feyh; Statistics of maximum entropy IID noise given its cumulants; Conference Record of The Twenty-Sixth Asilomar Conference on Signals, Systems and Computers; vol. 2; Oct. 26-28, 1992; pp. 736-740.
Kulkarni et al; On the existence of strongly consistent rules for estimation and classification; IEEE International Symposium on Information Theory Proceedings; Sep. 17-22, 1995; p. 255.
A. Gammerman, V. Vovk, V. Vapnik; “Learning by Transduction”; Proceedings of the Fourteenth Conference on Uncertainty in Artificial Intelligence (UAI-98), Madison, WI, USA, Jul. 24-26, 1998; San Francisco, CA, USA: Morgan Kaufmann Publishers; pp. 148-155; XP000869654; ISBN 1-55860-555-X.
Backer and Duin; Statistische Patroonherkenning (Statistical Pattern Recognition); ISBN 9065621059; 1989; pp. 129-137.
