Preparing data for machine learning

Data processing: artificial intelligence – Machine learning

Details

Classification: C706S045000, C709S224000
Type: Reexamination Certificate
Status: active
Patent number: 07437334

ABSTRACT:
An apparatus and methods for feature selection and classifier building are disclosed. The feature selection apparatus allows removal of biased features. The classifier builder apparatus allows building a classifier using non-biased features. The feature selection methods teach how to remove biased features. The classifier builder methods teach how to build a classifier with non-biased features.
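
For illustration only, the following is a minimal sketch of the general idea the abstract describes, not the patented apparatus: score each feature's association with a known bias/nuisance variable, drop the most biased features, and train a classifier on the remaining non-biased features. The helper names (`drop_biased_features`, `build_classifier`), the chi-squared scoring, and the `keep_fraction` threshold are illustrative assumptions rather than details from the patent.

```python
# Minimal sketch of the idea in the abstract, NOT the patented apparatus:
# rank features by their association with a known bias/nuisance variable,
# drop the most biased ones, and fit a classifier on what remains.
import numpy as np
from sklearn.feature_selection import chi2
from sklearn.linear_model import LogisticRegression


def drop_biased_features(X, bias_labels, keep_fraction=0.9):
    """Return the column indices of X least associated with the bias variable."""
    scores, _ = chi2(X, bias_labels)      # association with the bias, not the target
    scores = np.nan_to_num(scores)        # constant columns can yield NaN scores
    order = np.argsort(scores)            # ascending: least biased first
    n_keep = int(keep_fraction * X.shape[1])
    return np.sort(order[:n_keep])


def build_classifier(X, y, bias_labels):
    """Fit a classifier using only the non-biased feature columns."""
    keep = drop_biased_features(X, bias_labels)
    clf = LogisticRegression(max_iter=1000).fit(X[:, keep], y)
    return clf, keep


# Toy usage with random binary features: `bias` stands in for a nuisance
# variable (e.g., which source a document came from) that the classifier
# should not rely on.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 50))
y = rng.integers(0, 2, size=200)
bias = rng.integers(0, 2, size=200)
clf, kept_columns = build_classifier(X, y, bias)
```

Any association statistic could replace the chi-squared score here; the essential point is that features are ranked against the bias variable rather than against the target label before the classifier is built.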

REFERENCES:
patent: 5845285 (1998-12-01), Klein
patent: 6192360 (2001-02-01), Dumais et al.
patent: 6701333 (2004-03-01), Suermondt et al.
patent: 6728689 (2004-04-01), Drissi et al.
patent: 2002/0161761 (2002-10-01), Forman et al.
patent: 2003/0018658 (2003-01-01), Suermondt et al.
patent: 2004/0059697 (2004-03-01), Forman
patent: 2004/0064464 (2004-04-01), Forman et al.
patent: 2004/0093315 (2004-05-01), Carney
patent: 2004/0148266 (2004-07-01), Forman
George Forman, “An Extensive Empirical Study of Feature Selection Metrics for Text Classification,” The Journal of Machine Learning Research, vol. 3, Mar. 2003.
A Pitfall and Solution in Multi-Class Feature Selection for Text Classification. G. Forman. ICML'04. HPL-2004-86. SpreadFx/Round-Robin method.
An Extensive Empirical Study of Feature Selection Metrics for Text Classification. G. Forman. Special Issue on Variable and Feature Selection, Journal of Machine Learning Research, 3(Mar):1289-1305, 2003. HPL-2002-147R1, abstract only.
Choose Your Words Carefully: An Empirical Study of Feature Selection Metrics for Text Classification. G. Forman. In the Joint Proceedings of the 13th European Conference on Machine Learning and the 6th European Conference on Principles and Practice of Knowledge Discovery in Databases (ECML/PKDD '02), Aug. 19-23, 2002. HPL-2002-88R2.
An Introduction to Variable and Feature Selection. Isabelle Guyon, Andre Elisseeff; JMLR 3(Mar):1157-1182, 2003.
Sin-Jae Kang, Sae-Bom Lee, Jong-Wan Kim and In-Gil Nam, “Two Phase Approach for Spam Mail Filtering,” Springer-Verlag Berlin Heidelberg, 2004.
Huan Liu, “Evolving Feature Selection,” IEEE Intelligent Systems, 2005.
Forman, George H., et al., U.S. Appl. No. 11/004,317, filed Dec. 3, 2004 (23 pages).
Abstract of Bordley, R.F., et al., “Fuzzy Set Theory, Observer Bias and Probability Theory,” Fuzzy Sets and Systems, vol. 33, No. 3, 1 page (1989).
Abstract of Ejima, T., et al., “Biased Clustering Method for Partially Supervised Classification,” Proc. SPIE Int. Soc. Opt. Eng., vol. 3304, 2 pages (1998).
Abstract of Feelders, A.J., et al., “Learning from Biased Data Using Mixture Models,” KDD-96 Proceedings, 1 page (1996).
Abstract of Hall, L.O., “Data Mining of Extreme Data Sets: Very Large and/or Very Skewed Data Sets,” Proc. IEEE Int. Conf. Syst. Man Cybern., vol. 1, 1 page (2001).
Abstract of Kayacik, H.G., et al., “On Dataset Biases in a Learning System with Minimum a priori Information for Intrusion Detection,” Proceedings of the 2nd Annual Conference on Communication Networks and Services Research, 1 page (2004).
Abstract of SubbaNarasimha, P.N., et al., “Predictive Accuracy of Artificial Neural Networks and Multiple Regression in the Case of Skewed Data: Exploration of Some Issues,” Expert Systems with Applications, vol. 19, No. 2, 1 page (2000).
Abstract of Zhu, H., et al., “Training Algorithm for Multilayer Neural Networks of Hard-Limiting Units with Random Bias,” IEICE Transactions on Fundamentals of Electronics, Communications, and Computer Sciences, vol. E83-A, No. 6, 1 page (2000).
Buckley, J.J., “Training a Fuzzy Neural Net,” Proceedings of the 1994 1st International Conference of NAFIPS/IFIS/NASA, pp. 73-77 (1994).
Ghosn, J., “Bias Learning, Knowledge Sharing,” IEEE Transactions on Neural Networks, vol. 14, No. 4, pp. 748-765 (Jul. 2003).
Lisboa, P.J.G., et al., “Bias Reduction in Skewed Binary Classification with Bayesian Neural Networks,” Neural Networks, vol. 13, pp. 407-410 (2000).
Snyders, S., et al., “What Inductive Bias Gives Good Neural Network Training Performance,” Proceedings of the International Joint Conference on Neural Networks, 8 pages total (2000).
Sugiyama, M., et al., “Incremental Active Learning with Bias Reduction,” Proceedings of the International Joint Conference on Neural Networks, 6 pages total (2000).
Tetko, I.V., “Associative Neural Network,” Internet: <http://cogprints.org/1441/>, pp. 1-15 (2001).
Weiss, S.M., et al., Predictive Data Mining: A Practical Guide, pp. 74-78 (1997).
