Using histograms to introduce randomization in the...

Data processing: database and file management or data structures – Database design – Data structure types


Details

Type: Reexamination Certificate

Status: active

Patent number: 06859804

ABSTRACT:
A system for decision tree ensembles includes a module to read the data, a module to create a histogram, a module to evaluate a potential split according to some criterion using the histogram, a module to select a split point randomly in an interval around the best split, a module to split the data, and a module to combine multiple decision trees in ensembles. The decision tree method includes the steps of reading the data; creating a histogram; evaluating a potential split according to some criterion using the histogram; selecting a split point randomly in an interval around the best split; splitting the data; and combining multiple decision trees in ensembles.
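The core idea in the abstract can be illustrated with a short sketch. The Python code below is not the patented implementation; the function name, the choice of Gini impurity as the splitting criterion, and the number of bins are illustrative assumptions. It builds a histogram of one feature, scores each bin boundary as a candidate split, and then draws the final split point uniformly at random from the interval around the best boundary.

    # Minimal sketch (assumptions noted above), not the patented implementation.
    import numpy as np

    def histogram_split(x, y, n_bins=32, rng=None):
        """Return a randomized split threshold for feature values x and labels y."""
        rng = np.random.default_rng() if rng is None else rng

        # 1. Create a histogram of the feature; its bin edges are the candidate splits.
        edges = np.histogram_bin_edges(x, bins=n_bins)

        # Gini impurity of a set of labels (illustrative choice of criterion).
        def gini(labels):
            if labels.size == 0:
                return 0.0
            _, counts = np.unique(labels, return_counts=True)
            p = counts / labels.size
            return 1.0 - np.sum(p ** 2)

        # 2. Evaluate each interior bin edge by the weighted impurity of the
        #    two partitions it would produce.
        best_score, best_i = np.inf, None
        for i in range(1, len(edges) - 1):
            left = y[x <= edges[i]]
            right = y[x > edges[i]]
            score = (left.size * gini(left) + right.size * gini(right)) / y.size
            if score < best_score:
                best_score, best_i = score, i

        # 3. Instead of splitting exactly at the best edge, pick the split point
        #    uniformly at random inside the interval surrounding it. This is the
        #    randomization that makes the trees in the ensemble differ.
        low, high = edges[best_i - 1], edges[best_i + 1]
        return rng.uniform(low, high)

In an ensemble, each tree would call a routine like this with its own random stream, so repeated training runs over the same data produce different, decorrelated trees that can then be combined.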

REFERENCES:
patent: 5047842 (1991-09-01), Bouman et al.
patent: 5787274 (1998-07-01), Agrawal et al.
patent: 5787425 (1998-07-01), Bigus
patent: 5799311 (1998-08-01), Agrawal et al.
patent: 5899992 (1999-05-01), Iyer et al.
patent: 6055539 (2000-04-01), Singh et al.
patent: 6675164 (2004-01-01), Kamath et al.
patent: 6750864 (2004-06-01), Anwar
patent: 20030061213 (2003-03-01), Yu et al.
patent: 20030061228 (2003-03-01), Kamath et al.
patent: 20030065535 (2003-04-01), Karlov et al.
Wang et al., "CMP: A Fast Decision Tree Classifier Using Multivariate Predictions," Proc. 16th Intl. Conf. on Data Engineering, San Diego, CA, Feb. 28-Mar. 3, 2000, pp. 449-460.
Bauer, E., et al., "An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants," Machine Learning, 36, (1999), pp. 105-142, Kluwer Academic Publishers, Boston.
Dietterich, T., “An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization,” Machine Learning, (1999) pp. 1-22, Kluwer Academic Publishers, Boston.
Kamath, C., et al., “Approximate Splitting for Ensembles of Trees Using Histograms,” Lawrence Livermore National Laboratory, Preprint UCRL-JC-145576, Oct. 1, 2001, 18 pages.
Kamath, C., et al., "Classification of Bent-Double Galaxies: Experiences with Ensembles of Decision Trees," Lawrence Livermore National Laboratory, Feb. 22, 2002, pp. 1-7.
Alsabti, K., et al., “CLOUDS: A Decision Tree Classifier for Large Datasets,” Oct. 27, 1998, pp. 1-34.
Kamath, C., et al., “Creating ensembles of decision trees through sampling,” Lawrence Livermore National Laboratory, Preprint UCRL-JC-142268-REV-1, Aug. 15, 2001.
Opitz, D., et al., “Popular Ensemble Methods: An Empirical Study,” Journal of Artificial Intelligence Research 11, (1999) pp. 169-198, AI Access Foundation and Morgan Kaufmann Publishers.
Cantu-Paz, E., et al., “Using Evolutionary Algorithms to Induce Oblique Decision Trees,” Lawrence Livermore National Laboratory, Preprint UCRL-JC-137202, Jan. 21, 2000.
