Method for feature selection and for evaluating features...

Data processing: artificial intelligence – Neural network – Learning task

Reexamination Certificate


Details

Current U.S. Class: C706S015000, C706S016000, C706S021000
Type: Reexamination Certificate
Status: active
Patent number: 07970718

ABSTRACT:
A group of features identified as “significant” for separating data into classes is evaluated using a support vector machine that separates the dataset into classes one feature at a time. After separation, each feature is assigned an extremal margin value based on the distance between the lowest feature value in the first class and the highest feature value in the second class. Separately, extremal margin values are calculated for a large number of example sets randomly drawn from a normal distribution for the two classes, to determine how often a given extremal margin value would arise under that normal distribution. From the p-values calculated for the normal distribution, a desired p-value is selected, and the extremal margin value corresponding to that p-value is compared with the calculated extremal margin values for the group of features. Features whose calculated extremal margin value falls below this threshold are labeled falsely significant.
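The test described in the abstract can be sketched in Python. This is a minimal, illustrative reading, not the patented implementation: the patent trains a support vector machine on one feature at a time, whereas the sketch below computes the per-feature extremal margin directly and builds the null distribution empirically by Monte Carlo draws from a standard normal. All function names and parameters here are hypothetical, and features are assumed to be standardized (zero mean, unit variance).

```python
import random

def extremal_margin(class_a, class_b):
    """Extremal margin for one feature: distance between the lowest
    value in the first class and the highest value in the second."""
    return min(class_a) - max(class_b)

def null_margin_threshold(n_a, n_b, p_value=0.05, n_draws=10000, seed=0):
    """Empirical null distribution: draw both classes from the same
    standard normal many times, and return the margin exceeded in
    only a p_value fraction of the random draws."""
    rng = random.Random(seed)
    margins = sorted(
        extremal_margin([rng.gauss(0, 1) for _ in range(n_a)],
                        [rng.gauss(0, 1) for _ in range(n_b)])
        for _ in range(n_draws)
    )
    # the (1 - p_value) quantile of the null margins
    return margins[int((1 - p_value) * n_draws) - 1]

def flag_falsely_significant(features, p_value=0.05):
    """features: dict mapping a feature name to a pair
    (class_a_values, class_b_values), standardized per feature.
    Returns the names whose margin does not beat the null threshold."""
    flagged = []
    for name, (a, b) in features.items():
        threshold = null_margin_threshold(len(a), len(b), p_value)
        if extremal_margin(a, b) < threshold:
            flagged.append(name)
    return flagged
```

Under this reading, a feature that cleanly separates the two classes has a large positive margin and survives, while an overlapping feature has a strongly negative margin, falls below the null-distribution threshold, and is flagged as falsely significant.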

REFERENCES:
patent: 4881178 (1989-11-01), Holland
patent: 5138694 (1992-08-01), Hamilton
patent: 5255347 (1993-10-01), Matsuba et al.
patent: 5649068 (1997-07-01), Boser
patent: 5809144 (1998-09-01), Sirbu
patent: 5819246 (1998-10-01), Ashida et al.
patent: 5921937 (1999-07-01), Davis et al.
patent: 5950146 (1999-09-01), Vapnik
patent: 6087098 (2000-07-01), McKiernan et al.
patent: 6112195 (2000-08-01), Burges
patent: 6128608 (2000-10-01), Barnhill
patent: 6134344 (2000-10-01), Burges
patent: 6157921 (2000-12-01), Barnhill
patent: 6161130 (2000-12-01), Horvitz et al.
patent: 6187549 (2001-02-01), Schmidt et al.
patent: 6192360 (2001-02-01), Dumais et al.
patent: 6251586 (2001-06-01), Mulshine et al.
patent: 6327581 (2001-12-01), Platt
patent: 6427141 (2002-07-01), Barnhill
patent: 6633857 (2003-10-01), Tipping
patent: 6647341 (2003-11-01), Golub
patent: 6658395 (2003-12-01), Barnhill
patent: 6714925 (2004-03-01), Barnhill
patent: 6760715 (2004-07-01), Barnhill
patent: 6789069 (2004-09-01), Barnhill
patent: 6879944 (2005-04-01), Tipping et al.
patent: 6882990 (2005-04-01), Barnhill
patent: 6944602 (2005-09-01), Cristianini
patent: 6996549 (2006-02-01), Barnhill et al.
patent: 7047137 (2006-05-01), Kasif et al.
patent: 7117188 (2006-10-01), Guyon et al.
patent: 7299213 (2007-11-01), Cristianini
patent: 7318051 (2008-01-01), Weston et al.
patent: 7475048 (2009-01-01), Weston et al.
patent: 2003/0036081 (2003-02-01), Adorjan
patent: 2004/0102905 (2004-05-01), Adorjan
patent: 2005/0131847 (2005-06-01), Weston et al.
patent: WO 02/095534 (2002-11-01), None
Jain et al., “Statistical Pattern Recognition: A Review”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 1, 2000, pp. 4-37.
Alon, U., et al., “Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays”, Proc. Natl. Acad. Sci. USA, Jun. 1999, pp. 6745-6750, vol. 96, Cell Biology.
Blum, A.L., et al., “Selection of Relevant Features and Examples in Machine Learning”, Artificial Intelligence, Dec. 1997, pp. 245-271, vol. 97.
Bredensteiner, E.J., et al., “Multicategory Classification by Support Vector Machines”, Computational Optimization and Applications, 1999, pp. 53-79, vol. 12.
Brown, M.P.S., et al., “Knowledge-based analysis of microarray gene expression data by using support vector machines”, Proc. Natl. Acad. Sci. USA, Jan. 4, 2000, pp. 262-267, vol. 97, No. 1.
Devijver, P., et al., Pattern Recognition: A Statistical Approach, 1982, pp. 218-219, Prentice-Hall International, London.
Furey, T.S., et al., “Support vector machine classification and validation of cancer tissue samples using microarray expression data”, Bioinformatics, 2000, pp. 906-914, vol. 16, No. 10.
Golub, T.R., et al., “Molecular Classification of Cancer: Class Discovery and Class Prediction by Gene Expression Monitoring”, Science, Oct. 15, 1999, pp. 531-537, vol. 286.
Guyon, I., et al., “An Introduction to Variable and Feature Selection”, Journal of Machine Learning Research, 2003, pp. 1157-1182, vol. 3.
Hastie, T., et al., “Gene Shaving: a New Class of Clustering Methods for Expression Arrays”, Technical Report, Stanford University, 2000, pp. 1-40.
Kohavi, R., “The Power of Decision Tables”, European Conference on Machine Learning (ECML), 1995, 16 pages.
Kohavi, R., and John, G.H., “Wrappers for Feature Subset Selection”, Artificial Intelligence, Dec. 1997, pp. 273-324, vol. 97, Issue 1-2, Special issue on relevance.
Le Cun, Y., et al., “Optimal Brain Damage”, Advances in Neural Information Processing Systems 2, 1990, pp. 598-605.
Weston, J., et al., “Feature Selection for SVMs”, Proc. 15th Conference on Neural Information Processing Systems (NIPS), 2000, pp. 668-674.
Zhang, X. and Wong, W., “Recursive Sample Classification and Gene Selection based on SVM: Method and Software Description”, Technical Report, Department of Biostatistics, Harvard School of Public Health, 2001, 5 pages.
Gupta, P., et al., “Beam Search for Feature Selection in Automatic SVM Defect Classification”, 16th International Conference on Pattern Recognition (ICPR'02), Aug. 2002, p. 20212, vol. 2 (abstract only).
Adorjan, P., et al., “Tumour class prediction and discovery by microarray-based DNA methylation analysis”, Nucleic Acids Research, 2002, pp. 1-9, vol. 30, No. 5 e21.
Alizadeh, A., et al., “Distinct types of diffuse large B-cell lymphoma identified by gene expression profiling”, Nature, Feb. 2000, pp. 503-511, vol. 403.
Model, F., et al., “Feature Selection for DNA Methylation based Cancer Classification”, Bioinformatics Discovery Note, 2001, pp. 1-8, vol. 1.
Schölkopf, B., et al., “Input Space Versus Feature Space in Kernel-Based Methods”, IEEE Transactions on Neural Networks, Sep. 1999, pp. 1000-1017, vol. 10.
Steiner, G., et al., “Discriminating Different Classes of Toxicants by Transcript Profiling”, Environmental Health Perspectives, Aug. 2004, pp. 1236-1248, vol. 112.
Weston, J., et al., “Use of the Zero-Norm with Linear Models and Kernel Methods”, Journal of Machine Learning Research, 2003, pp. 1439-1461, vol. 3.
Li, Y., et al., “Bayesian automatic relevance determination algorithms for classifying gene expression data”, Bioinformatics, 2002, pp. 1332-1339, vol. 18.
Morik, K., et al., “Combining statistical learning with a knowledge-based approach—A case study in intensive care monitoring”, Proc. 16th Int'l Conf. on Machine Learning (ICML-99), 1999, pp. 268-277.
Peng, S., et al., “Molecular classification of cancer types from microarray data using the combination of genetic algorithms and support vector machines”, FEBS Letters, 2003, pp. 358-362, vol. 555.
Ramaswamy, S., et al., “Multiclass cancer diagnosis using tumor gene expression signatures”, Proc. Natl. Acad. Sci. USA, Dec. 2001, pp. 15149-15154, vol. 98.
Guyon, I., et al., “Gene Selection for Cancer Classification using Support Vector Machines”, Machine Learning, 2002, pp. 389-422, vol. 46.

Profile ID: LFUS-PAI-O-2723187