Lagrangian support vector machine
Data processing: artificial intelligence – Knowledge processing system – Knowledge representation and reasoning technique
Reexamination Certificate
2008-07-01
Vincent, David (Department: 2129)
C382S155000, C382S159000, C382S181000, C382S224000
active
10114419
ABSTRACT:
A Lagrangian support vector machine solves problems with massive data sets (e.g., millions of sample points). The method defines an input matrix representing a set of data whose input space has dimension n, corresponding to the number of features associated with the data set; generates a support vector machine that solves a system of linear equations corresponding to the input matrix, the system being defined by a positive definite matrix; and calculates a separating surface with the support vector machine that divides the set of data into two subsets.
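The abstract describes the method only at a high level. The sketch below is based on the Lagrangian SVM iteration in the cited Mangasarian and Musicant Technical Report 00-06, not on the claims themselves; it shows one plausible NumPy implementation of a linear classifier of this kind. The function and parameter names (lsvm, nu, itmax, tol) and the step size alpha = 1.9/nu are illustrative assumptions rather than terms taken from the patent.

# Hedged sketch: Lagrangian SVM iteration from the cited Mangasarian & Musicant
# Technical Report 00-06, not the patent's verbatim method. Names (lsvm, nu,
# itmax, tol) are illustrative choices.
import numpy as np

def lsvm(A, d, nu=1.0, itmax=100, tol=1e-5):
    """Train a linear Lagrangian SVM.

    A  : (m, n) input matrix, one sample per row, n features.
    d  : (m,) labels in {+1, -1}.
    nu : weight on the error term (assumed parameter name).
    Returns (w, gamma) defining the separating surface x.w - gamma = 0.
    """
    m, n = A.shape
    e = np.ones(m)
    # H = D [A, -e]: each row of [A, -e] scaled by its +/-1 label.
    H = d[:, None] * np.hstack([A, -e[:, None]])
    # Q = I/nu + H H^T is positive definite; apply Q^{-1} via the
    # Sherman-Morrison-Woodbury identity so only an (n+1) x (n+1)
    # matrix is inverted, which is what keeps huge m tractable.
    S = np.linalg.inv(np.eye(n + 1) / nu + H.T @ H)
    def Qinv(v):                        # Q^{-1} v without forming Q^{-1}
        return nu * (v - H @ (S @ (H.T @ v)))
    def Q(v):                           # Q v without forming Q
        return v / nu + H @ (H.T @ v)
    alpha = 1.9 / nu                    # step size, chosen in (0, 2/nu)
    u = Qinv(e)                         # starting point
    for _ in range(itmax):
        z = Q(u) - e
        u_new = Qinv(e + np.maximum(z - alpha * u, 0.0))  # (.)_+ plus function
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    w = A.T @ (d * u)                   # normal to the separating plane
    gamma = -e @ (d * u)                # plane offset
    return w, gamma

New points would then be classified by the sign of x.w - gamma. The Sherman-Morrison-Woodbury step keeps the linear algebra at the (n+1)-dimensional feature scale even when the number of samples m is in the millions, which is the massive-data-set scenario the abstract refers to.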
REFERENCES:
patent: 6112195 (2000-08-01), Burges
patent: 6134344 (2000-10-01), Burges
patent: 6327581 (2001-12-01), Platt
patent: 6571225 (2003-05-01), Oles et al.
patent: 6728690 (2004-04-01), Meek et al.
patent: 2002/0165854 (2002-11-01), Blayvas et al.
patent: 2003/0115030 (2003-06-01), Ewing
patent: 2003/0167135 (2003-09-01), Ewing
patent: 2005/0105794 (2005-05-01), Fung
patent: 2005/0119837 (2005-06-01), Prakash et al.
patent: 2005/0171923 (2005-08-01), Kiiveri et al.
“Generalized Support Vector Machines”, O. L. Mangasarian, Technical Report 98-14, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, Oct. 1998.
“A tutorial on Support Vector Machines for Pattern Recognition”, Christopher J. C. Burges, Data Mining and Knowledge Discovery, 2, 121-167 (1998), Kluwer Academic Publishers, Boston, Manufactured in The Netherlands.
“Data Discrimination via Nonlinear Generalized Support Vector Machines”, O. L. Mangasarian and David R. Musicant, Technical Report 99-03, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, Mar. 1999.
“Successive Overrelaxation for Support Vector Machines”, O. L. Mangasarian and David R. Musicant, IEEE Transactions On Neural Networks, vol. 10, No. 5, Sep. 1999.
“RSVM: Reduced Support Vector Machines”, Yuh-Jye Lee and Olvi L. Mangasarian, Data Mining Institute Technical Report 00-07, Jul. 2000.
“Molecular Classification of Human Carcinomas by Use of Gene Expression Signatures”, Cancer Research 61, 7388-7393, Oct. 15, 2001.
“Lagrangian Support Vector Machines”, O. L. Mangasarian and David R. Musicant, Data Mining Institute Technical Report 00-06, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, Jun. 2000.
W. H. Wolberg and O.L. Mangasarian, “Multisurface Method of Pattern Separation for Medical Diagnosis Applied to Breast Cytology,” Proc. Natl. Acad. Sci. USA, vol. 87, pp. 9193-9196, Dec. 1990.
G. Fung and O.L. Mangasarian, “Data Selection for Support Vector Machine Classifiers,” Data Mining Institute Technical Report 00-02, Feb. 2000. Proceedings KDD-2000, Aug. 20-23, 2000, Boston. Association for Computing Machinery, New York, 2000, pp. 64-70. ISBN 1-58113-233-6.
Y. -J. Lee and O.L. Mangasarian, “SSVM: A Smooth Support Vector Machine for Classification,” Data Mining Institute Technical Report 99-03, Sep. 1999. Computational Optimization and Applications 20, pp. 1-22, 2001.
Y. -J. Lee, O.L. Mangasarian and W.H. Wolberg, “Breast Cancer Survival and Chemotherapy: A Support Vector Machine Analysis,” Data Mining Institute Technical Report 99-10, Dec. 1999, DIMACS Series in Discrete Mathematics and Computer Science, vol. 55, American Mathematical Society, pp. 1-10, 2000.
N. Cristianini and John Shawe-Taylor, An Introduction to Support Vector Machines and other Kernel-Based Learning Methods, Cambridge University Press, 2000, ISBN: 0 521 78019 5, 10 pgs.
P.S. Bradley and O.L. Mangasarian, “Feature Selection Via Concave Minimization and Support Vector Machines,” Machine Learning Proceedings of the Fifteenth International Conference (ICML '98), Madison, Wisconsin, pp. 82-90, Jul. 24-27, 1998.
G. Fung and O.L. Mangasarian, “Finite Newton Method for Lagrangian Support Vector Machine Classification,” Data Mining Institute Technical Report 02-01, pp. 1-22, Feb. 2002.
G. H. Golub and C. F. Van Loan, “Matrix Computations,” The Johns Hopkins University Press, Baltimore, Maryland, 3rd edition, pp. 48-86, 1996.
O.L. Mangasarian, “Parallel Gradient Distribution in Unconstrained Optimization,” SIAM Journal on Control and Optimization, 33(6), pp. 1916-1925, Nov. 1995.
S. C. Odewahn et al., “Automated Star/Galaxy Discrimination with Neural Networks,” The Astronomical Journal, 103(1), pp. 318-331, Jan. 1992.
M. Trotter, “Support Vector Machines for QSAR Analysis,” Department of Computer Science, University College London, undated, 25 pgs.
F. Facchinei, “Minimization of SC Functions and the Maratos Effect,” Operations Research Letters, vol. 17, pp. 131-137, 1995.
J.-B. Hiriart-Urruty et al., “Generalized Hessian Matrix and Second-Order Optimality Conditions for Problems with C1,1 Data,” Applied Mathematics & Optimization, vol. 11, pp. 43-56, Feb. 1984.
ILOG CPLEX, http://www.ilog.com/products/cplex, 1 page (last printed Feb. 10, 2004).
S. Lucidi, “A New Result in the Theory and Computation of the Least-Norm Solution of a Linear Program,” Journal of Optimization Theory and Applications, vol. 55, pp. 103-117, Oct. 1987.
O. L. Mangasarian, “Normal Solutions of Linear Programs,” Mathematical Programming Study, vol. 22, pp. 206-216, Dec. 1984.
O. L. Mangasarian, “Arbitrary-Norm Separating Plane,” Operations Research Letters, vol. 24, No. 1-2, pp. 15-23, Feb.-Mar. 1999.
O. L. Mangasarian, “A Finite Newton Method for Classification Problems,” Data Mining Institute Technical Report 01-11, Computer Sciences Department, University of Wisconsin, Dec. 2001, pp. 1-17.
O. L. Mangasarian and R. R. Meyer, “Nonlinear Perturbation of Linear Programs,” SIAM Journal on Control and Optimization, 17(6), pp. 745-752, Nov. 1979.
O. L. Mangasarian, “A Newton Method for Linear Programming,” PowerPoint Presentation, Mathematics Department, University of California at San Diego, Jul. 26, 2002, 21 pgs.
O. L. Mangasarian, “A Newton Method for Linear Programming,” Data Mining Institute Technical Report 02-02, Computer Sciences Department, University of Wisconsin, pp. 1-20, Mar. 2002.
N. E. Ayat et al., “Empirical Error Based Optimization of SVM Kernels: Application to Digit Image Recognition,” Proceedings of the Eighth International Workshop on Frontiers in Handwriting Recognition (IWFHR '02), 6 pgs., 2002.
Office Action dated Aug. 3, 2007 for U.S. Appl. No. 10/650,121, filed Aug. 28, 2003 (11 pages).
Barlogie, B., Cussens, J., Hardin, J., Page, D., Shaughnessy, J., Waddell, M., and Zhan, F., “Comparative data mining for microarrays: A case study based on multiple myeloma,” Technical Report 1453, Computer Sciences Department, University of Wisconsin, Madison, 22 pages, Nov. 2002.
Inventors: Mangasarian, Olvi L.; Musicant, David R.
Attorney/Agent: Shumaker & Sieffert, P.A.
Examiners: Tran, Mai T.; Vincent, David
Assignee: Wisconsin Alumni Research Foundation