Binary tree for complex supervised learning

Data processing: artificial intelligence – Neural network – Learning task

Details

Classification: C700S104000
Type: Reexamination Certificate
Status: active
Patent number: 07133856

ABSTRACT:
The present invention provides a powerful and robust classification and prediction tool, methodology, and architecture for supervised learning, particularly applicable to complex datasets where multiple factors determine an outcome while many other factors are irrelevant to prediction. The features that are relevant to the outcome may have complicated and influential interactions even though their individual contributions are insignificant. For example, polygenic diseases may be associated with both genetic and environmental risk factors. This new approach allows us to consider all risk factors simultaneously, including their interactions and combined effects. Our approach has the strengths of both binary classification trees and regression. A simple rooted binary tree model is created with each split defined by a linear combination of selected variables. The linear combination is obtained by regression with optimal scoring. The variables are selected using backward shaving, and cross-validation is used to find the level of shrinkage that minimizes errors. Using a selected variable subset to define each split not only increases interpretability but also enhances the model's predictive power and robustness. The final model deals with cumulative effects and interactions simultaneously.
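
The abstract outlines an algorithm without giving code. The following is a minimal sketch, not taken from the patent, of how a single split of such a tree could be grown under simplifying assumptions: the two classes are scored 0/1 (for a binary outcome, least-squares regression on scored class labels stands in for regression with optimal scoring), backward shaving repeatedly drops the variable with the smallest absolute coefficient, and cross-validation picks how far along the shaving path to shrink the variable subset. All function names, the 0.5 decision threshold, and the fold construction are illustrative assumptions.

```python
# Illustrative sketch only -- not the patented implementation.
import numpy as np

def fit_linear_split(X, y):
    """Least-squares regression of 0/1 class scores on X (with intercept)."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef  # coef[0] is the intercept, coef[1:] the linear combination

def misclassification(coef, X, y, threshold=0.5):
    """Error rate when samples with fitted score > threshold go to class 1."""
    scores = coef[0] + X @ coef[1:]
    return np.mean((scores > threshold) != (y == 1))

def backward_shaving_path(X, y):
    """Nested variable subsets: shave off the weakest variable at each step."""
    active = list(range(X.shape[1]))
    path = [active.copy()]
    while len(active) > 1:
        coef = fit_linear_split(X[:, active], y)
        active.pop(int(np.argmin(np.abs(coef[1:]))))  # drop smallest |coefficient|
        path.append(active.copy())
    return path

def choose_split_by_cv(X, y, n_folds=5, seed=0):
    """Pick the subset on the shaving path with the lowest cross-validated error."""
    rng = np.random.default_rng(seed)
    folds = rng.permutation(len(y)) % n_folds
    path = backward_shaving_path(X, y)
    cv_error = []
    for subset in path:
        errs = []
        for k in range(n_folds):
            train, test = folds != k, folds == k
            coef = fit_linear_split(X[np.ix_(train, subset)], y[train])
            errs.append(misclassification(coef, X[np.ix_(test, subset)], y[test]))
        cv_error.append(np.mean(errs))
    best = path[int(np.argmin(cv_error))]
    return best, fit_linear_split(X[:, best], y)

# Toy usage: two informative variables hidden among eight irrelevant ones.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
y = ((X[:, 2] + X[:, 7]) > 0).astype(float)
subset, coef = choose_split_by_cv(X, y)
print("variables defining the split:", subset)
```

In a full tree, the same procedure would be applied recursively to the two child nodes produced by the selected linear split; the sketch stops at a single split for brevity.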

REFERENCES:
patent: 5263117 (1993-11-01), Nadas et al.
Roger J. Lewis, "An Introduction to Classification and Regression Tree (CART) Analysis," Department of Emergency Medicine, Harbor-UCLA Medical Center, Torrance, CA, 2000.
Leo Breiman, “Bagging Predictors,” Machine Learning 24, 123-140 (1996).
Alfred Lin et al., "Clustering and the Design of Preference-Assessment Surveys in Healthcare," Health Services Research 34:5 Part I (Dec. 1999).
Wei-Yin Loh et al., “Split selection methods for classification trees,” Statistica Sinica 7(1997), 815-840.
Lee-Ming Chuang et al., "Sibling-based association study of the PPARγ2 Pro12Ala polymorphism and metabolic variables in Chinese and Japanese hypertension families: a SAPPHIRe study," J Mol Med (2001) 79:656-664.
Lorene M. Nelson et al., "Recursive partitioning for the identification of disease risk subgroups: a case-control study of subarachnoid hemorrhage," J Clin Epidemiol vol. 51, no. 3, pp. 199-209, 1998.
Wei-Yin Loh et al., “Tree-structured classification via generalized discriminant analysis,” Journal of the American Statistical Association, Sep. 1988, vol. 83, No. 403.
Philip A. Chou et al., “Optimal partitioning for classification and regression trees,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 13, No. 4, Apr. 1991.
Louis Gordon et al., "Asymptotically efficient solutions to the classification problem," The Annals of Statistics 1978, vol. 6, no. 3, 515-533.
Gábor Lugosi et al., "Consistency of data-driven histogram methods for density estimation and classification," The Annals of Statistics 1996, vol. 24, no. 2, 687-706.
Trevor Hastie et al., "'Gene shaving' as a method for identifying distinct sets of genes with similar expression patterns," electronic version available online at http://genomebiology.com/2000/1/2/research/0003/; received Mar. 16, 2000, revisions received May 16, 2000, accepted May 18, 2000, published Aug. 4, 2000.

Profile ID: LFUS-PAI-O-3653918
