Data processing: artificial intelligence – Neural network – Learning method
Reexamination Certificate
2006-09-25
2010-02-23
Vincent, David R (Department: 2129)
active
07668790
ABSTRACT:
A boosting-based method and system for fusing a set of classifiers that performs classification using weak learners trained on different views of the training data. The final ensemble contains learners trained on examples drawn from a shared sampling distribution. At each iteration, a weak learner is selected from the pool of learners trained on the disjoint views: the one with the lowest training error among all views. The combination weight for the final weighting rule is derived from that lowest error, and the example weights are updated using the selected learner to form the shared sampling distribution used at the next iteration. Selecting by lowest error across views results in a lower training error and a tighter generalization error bound for the final hypothesis.
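The abstract's procedure can be sketched as a variant of AdaBoost in which, at each round, one weak learner is trained per view on a shared example distribution and the learner with the lowest weighted error across views is kept. The sketch below is illustrative only, not the patented implementation: the `train_stump`, `shared_boost`, and `predict` functions and the use of decision stumps as weak learners are assumptions made for a minimal, self-contained example.

```python
import numpy as np

def train_stump(X, y, w):
    """Exhaustively pick the (feature, threshold, polarity) decision stump
    minimizing the w-weighted error on labels y in {-1, +1}."""
    best = (np.inf, 0, 0.0, 1)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best  # (weighted error, feature, threshold, polarity)

def stump_predict(X, j, thr, pol):
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def shared_boost(views, y, T=10):
    """Boosting over multiple feature views with ONE shared sampling
    distribution: each round, a stump is fit on every view and the one
    with the lowest weighted error among all views is selected."""
    n = len(y)
    w = np.full(n, 1.0 / n)           # shared sampling distribution
    ensemble = []                      # (alpha, view, feature, thr, polarity)
    for _ in range(T):
        # train a candidate weak learner on each view, keep the best
        cands = [(train_stump(Xv, y, w), v) for v, Xv in enumerate(views)]
        (err, j, thr, pol), v = min(cands, key=lambda c: c[0][0])
        err = max(err, 1e-12)
        if err >= 0.5:                 # no weak learner beats chance; stop
            break
        alpha = 0.5 * np.log((1 - err) / err)   # AdaBoost combination weight
        pred = stump_predict(views[v], j, thr, pol)
        # update the shared distribution with the selected view's learner
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, v, j, thr, pol))
    return ensemble

def predict(ensemble, views):
    """Weighted-vote final hypothesis over the selected per-view stumps."""
    agg = sum(a * stump_predict(views[v], j, thr, pol)
              for a, v, j, thr, pol in ensemble)
    return np.sign(agg)
```

On a toy two-view problem (one informative view, one pure-noise view), the lowest-error selection rule drives every round's selected learner toward the informative view, which is the mechanism the abstract describes.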
REFERENCES:
patent: 7099505 (2006-08-01), Li et al.
patent: 2004/0066966 (2004-04-01), Schneiderman
Kuncheva, L. I., “A Theoretical Study on Six Classifier Fusion Strategies”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, No. 2, Feb. 2002, p. 281-286.
Lanckriet, G. R. G., “Learning the Kernel Matrix with Semidefinite Programming”, Journal of Machine Learning Research, vol. 5 (Jan. 2004), p. 27-72.
Barbu, C., “Classifier Fusion Using Shared Sampling Distribution for Boosting”, Proceedings of the Fifth IEEE International Conference on Data Mining, (ICDM'05), p. 34-41, Nov. 2005.
Barbu, C., “Boosting in Classifier Fusion vs. Fusing Boosted Classifiers”, Proceedings of IEEE International Conference on Information Reuse and Integration, p. 332-337.
Barbu, C., “An Ensemble Approach to Robust Biometrics Fusion”, Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'06), Jun. 2006, p. 56-63.
Kim, S. W., “On Using Prototype Reduction Schemes and Classifier Fusion Strategies to Optimize Kernel-Based Nonlinear Subspace Methods”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, No. 3, Mar. 2005, p. 455-460.
Xu, L., “Methods of Combining Multiple Classifiers and Their Applications to Handwriting Recognition”, IEEE Transactions on Systems, Man, and Cybernetics, vol. 22, No. 3, May/Jun. 1992, p. 418-435.
Phillips, P. J., “The FERET Evaluation Methodology for Face-Recognition Algorithms”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 10, Oct. 2000, p. 1090-1104.
Wolpert, D. H., “Stacked Generalization”, Neural Networks, vol. 5, pp. 241-259, 1992.
Schapire, R. E., “Boosting the Margin: A New Explanation for the Effectiveness of Voting Methods”, The Annals of Statistics, vol. 26, Issue 5, 1998, p. 1651-1686.
Rao, N. S. V., “On Fusers that Perform Better than Best Sensor”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, No. 8, Aug. 2001, p. 904-909.
Schapire, R. E., “Improved Boosting Algorithms Using Confidence-Rated Predictions”, Machine Learning, vol. 37, p. 297-336 (1999).
Lanckriet, G. R. G., “Kernel-Based Data Fusion and Its Application to Protein Function Prediction in Yeast”, Pacific Symposium on Biocomputing, vol. 9, p. 300-311, 2004.
Freund, Y., “A Short Introduction to Boosting”, Journal of Japanese Society for Artificial Intelligence, vol. 14, Issue 5, p. 771-780, Sep. 1999.
Schapire, R. E., “The Boosting Approach to Machine Learning: An Overview”, Nonlinear Estimation and Classification, Springer, 2003, p. 1-23.
Breiman, L., “Arcing Classifiers”, The Annals of Statistics, vol. 26, No. 3, (Jun. 1998), pp. 801-824.
Freund, Y., “A Decision-Theoretic Generalization of On-Line Learning And an Application to Boosting”, Journal of Computer and System Sciences, vol. 55, p. 119-139 (1997).
Fan, W., “Is random model better? On its accuracy and efficiency”, Proceedings of the Third IEEE International Conference on Data Mining (ICDM'03), p. 51-58, Nov. 2003.
Hashem, S., “Optimal Linear Combinations of Neural Networks”, Neural Networks, vol. 10, No. 4, p. 599-614, 1997.
Kuncheva, L. I., “Is Independence Good for Combining Classifiers?”, Proceedings of the 15th International Conference on Pattern Recognition, vol. 2, p. 168-171, Sep. 2000.
Kuncheva, L., “Decision Templates for Multiple Classifier Fusion: An Experimental Comparison”, Pattern Recognition, vol. 34, p. 299-314, (2001).
Kittler, J., “On Combining Classifiers”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, No. 3, Mar. 1998, p. 226-239.
Kittler, J., “A Framework for Classifier Fusion: Is It Still Needed?” Lecture Notes in Computer Science, 1876, p. 45-56, 2000.
Barbu Costin
Lohrenz Maura C
Chang Li-Wu
Ferrett Sally A
Karasek John J
The United States of America as represented by the Secretary of
Profile ID: LFUS-PAI-O-4222951