Spread kernel support vector machine

Data processing: artificial intelligence – Neural network


Details

Type: Reexamination Certificate

Status: active

Patent number: 07406450

ABSTRACT:
Disclosed is a parallel support vector machine technique for solving problems with a large set of training data where the kernel computation, as well as the kernel cache and the training data, are spread over a number of distributed machines or processors. A plurality of processing nodes are used to train a support vector machine based on a set of training data. Each of the processing nodes selects a local working set of training data based on data local to the processing node, for example a local subset of gradients. Each node transmits selected data related to the working set (e.g., gradients having a maximum value) and receives an identification of a global working set of training data. The processing node optimizes the global working set of training data and updates a portion of the gradients of the global working set of training data. The updating of a portion of the gradients may include generating a portion of a kernel matrix. These steps are repeated until a convergence condition is met. Each of the local processing nodes may store all, or only a portion of, the training data. While the steps of optimizing the global working set of training data, and updating a portion of the gradients of the global working set, are performed in each of the local processing nodes, the function of generating a global working set of training data is performed in a centralized fashion based on the selected data (e.g., gradients of the local working set) received from the individual processing nodes.
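
To make the loop in the abstract concrete, the sketch below simulates it in a single Python process with NumPy. Each "node" owns a slice of the training data and the corresponding gradients, proposes a local working set of most-violating coordinates, a central merge step forms the global working set, and every node then updates the gradients it holds using only the kernel columns that step requires. The RBF kernel, the bias-free C-SVM dual, the projected coordinate-descent step, and names such as train_spread_svm, n_nodes, and q are illustrative assumptions for this sketch, not the patent's actual implementation.

import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # Gaussian (RBF) kernel block between the rows of A and the rows of B.
    sq = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def train_spread_svm(X, y, n_nodes=4, C=1.0, q=10, tol=1e-3, max_iter=1000, gamma=0.5):
    # Illustrative sketch: solve the bias-free C-SVM dual
    #   min 0.5 * a'Qa - e'a,  0 <= a_i <= C,  Q_ij = y_i * y_j * K(x_i, x_j)
    # with distributed working-set selection as described in the abstract.
    n = len(y)
    alpha = np.zeros(n)
    grad = -np.ones(n)                               # dual gradient at alpha = 0
    parts = np.array_split(np.arange(n), n_nodes)    # each simulated node owns one slice

    for _ in range(max_iter):
        # 1. Local step: every node ranks its coordinates by KKT violation
        #    and proposes its top-q candidates (its local working set).
        cand_idx, cand_viol = [], []
        for part in parts:
            g, a = grad[part], alpha[part]
            viol = np.where(a < C, np.maximum(-g, 0.0), 0.0) \
                 + np.where(a > 0, np.maximum(g, 0.0), 0.0)
            top = np.argsort(viol)[-q:]
            cand_idx.append(part[top])
            cand_viol.append(viol[top])

        # 2. Central step: merge the local candidates into one global working set.
        idx, viol = np.concatenate(cand_idx), np.concatenate(cand_viol)
        if viol.max() < tol:                         # converged: no significant KKT violation left
            break
        ws = idx[np.argsort(viol)[-q:]]

        # 3. Every node optimizes the small global working set and updates the
        #    gradients it holds, generating only the kernel columns it needs.
        for i in ws:
            col = y * y[i] * rbf_kernel(X, X[i:i + 1], gamma).ravel()   # column i of Q
            new_a = np.clip(alpha[i] - grad[i] / max(col[i], 1e-12), 0.0, C)
            delta = new_a - alpha[i]
            if delta != 0.0:
                alpha[i] = new_a
                grad += delta * col                  # a real node would update only its slice
    return alpha

# Toy usage: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)), rng.normal(1.0, 1.0, (100, 2))])
y = np.hstack([-np.ones(100), np.ones(100)])
alpha = train_spread_svm(X, y)
print("support vectors:", int((alpha > 1e-6).sum()))

In the patented scheme these three phases run on physically separate machines, so only the small working-set candidates and gradient fragments cross the network while the kernel cache and training data stay distributed; the single-process loop above merely mimics that partitioning with array slices.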

REFERENCES:
patent: 5640492 (1997-06-01), Cortes et al.
patent: 5649068 (1997-07-01), Boser
patent: 5950146 (1999-09-01), Vapnik
patent: 6128608 (2000-10-01), Barnhill
patent: 6134344 (2000-10-01), Burges
patent: 6157921 (2000-12-01), Barnhill
patent: 6192360 (2001-02-01), Dumais et al.
patent: 6269323 (2001-07-01), Vapnik
patent: 6327581 (2001-12-01), Platt
patent: 6427141 (2002-07-01), Barnhill
patent: 6456991 (2002-09-01), Srinivasa et al.
patent: 6633857 (2003-10-01), Tipping
patent: 6658395 (2003-12-01), Barnhill
patent: 6714925 (2004-03-01), Barnhill et al.
patent: 6728690 (2004-04-01), Meek et al.
patent: 6757584 (2004-06-01), Thess et al.
patent: 6760715 (2004-07-01), Barnhill et al.
patent: 6789069 (2004-09-01), Barnhill et al.
patent: 6882990 (2005-04-01), Barnhill et al.
patent: 6944602 (2005-09-01), Cristianini
patent: 6944616 (2005-09-01), Ferguson et al.
patent: 6996549 (2006-02-01), Zhang et al.
patent: 7035467 (2006-04-01), Nicponski
patent: 7054847 (2006-05-01), Hartman et al.
patent: 7117188 (2006-10-01), Guyon et al.
patent: 7299213 (2007-11-01), Cristianini
patent: 7318051 (2008-01-01), Weston et al.
patent: 7353215 (2008-04-01), Bartlett et al.
patent: 7363111 (2008-04-01), Vian et al.
Tohme, M.; Lengelle, R., “F-SVR: A new learning algorithm for support vector regression,” IEEE Int. Conf. on Acoustics, Speech and Signal Processing (ICASSP 2008), Mar. 31-Apr. 4, 2008, pp. 2005-2008, DOI 10.1109/ICASSP.2008.4518032.
Phung, S.L.; Bouzerdoum, A., “A Pyramidal Neural Network for Visual Pattern Recognition,” IEEE Transactions on Neural Networks, vol. 18, no. 2, Mar. 2007, pp. 329-343, DOI 10.1109/TNN.2006.884677.
Bundzel, M.; Sincak, P., “Combining Gradient and Evolutionary Approaches to the Artificial Neural Networks Training According to Principles of Support Vector Machines,” Int. Joint Conf. on Neural Networks (IJCNN '06), Jul. 16-21, 2006, pp. 2068-2074.
Kahraman, F.; Capar, A.; Ayvaci, A.; Demirel, H.; Gokmen, M., “Comparison of SVM and ANN performance for handwritten character classification,” Proc. 12th IEEE Signal Processing and Communications Applications Conference, Apr. 28-30, 2004, pp. 615-618, DOI 10.1109/SIU.2004.1338604.
Boardman, M.; Trappenberg, T., “A Heuristic for Free Parameter Optimization with Support Vector Machines,” Int. Joint Conf. on Neural Networks (IJCNN '06), 2006, pp. 610-617, DOI 10.1109/IJCNN.2006.246739.
Dadgostar, F.; Sarrafzadeh, A.; De Silva, L.; Messom, C., “Modeling and Recognition of Gesture Signals in 2D Space: A Comparison of NN and SVM Approaches,” IEEE Int. Conf. on Tools with Artificial Intelligence (ICTAI '06), Nov. 2006, pp. 701-704, DOI 10.1109/ICTAI.2006.85.
Lakshmanan, V.; Adrianto, I.; Smith, T.; Stumpf, G., “A spatiotemporal approach to tornado prediction,” Proc. IEEE Int. Joint Conf. on Neural Networks (IJCNN '05), vol. 3, Jul. 31-Aug. 4, 2005, pp. 1642-1647, DOI 10.1109/IJCNN.2005.1556125.
Xu, Jianhua; Zhang, Xuegong; Li, Yanda, “Sparse training procedure for kernel neuron,” Proc. 2003 Int. Conf. on Neural Networks and Signal Processing, vol. 1, Dec. 14-17, 2003, pp. 49-53, DOI 10.1109/ICNNSP.2003.1279210.
Boser, B. et al., “A training algorithm for optimal margin classifiers” Proc. 5th Annual Workshop on Computational Learning Theory, Pittsburgh, ACM 1992.
Burges, C., “A tutorial on support vector machines for pattern recognition,” Data Mining and Knowledge Discovery 2, 121-167, 1998.
Chang, C-H et al., “LIBSVM: a library for support vector machines,” downloaded from http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf on Oct. 14, 2004.
Decoste, D. et al., “Training invariant support vector machines,” Machine Learning, 46, 161-190, Kluwer Academic Publishers, 2002.
Zanghirati, G. et al., “A parallel solver for large quadratic programs in training support vector machines”, Parallel Computing 29 (2003) 535-551.
Dong, J. et al., “A Fast Parallel Optimization for Training Support Vector Machine,” Proc. of 3rd Conf. on Machine Learning and Data Mining, LNAI 2734, pp. 96-105, Leipzig, Germany, Jul. 5-7, 2003.
Collobert, R. et al., “Torch: A modular machine learning software library”, Technical Report IDIAP-RR 02-46, IDIAP, 2002.
Joachims, T., “Making large-scale support vector machine learning practical”, Advances in Kernel Methods, B. Scholkopf, et al. (eds.), Cambridge, MIT Press, 1998.
Osuna, E. et al., “Support vector machines: Training and Application”, MIT AI Memorandum 1602, Mar. 1997.
Platt, J.C., “Fast training of support vector machines using sequential minimal optimization”, in Adv. in Kernel Methods, Scholkopf, et al. (eds) 1998.
Tveit, T. et al., “Parallelization of the Incremental Proximal Support Vector Machine Classifier using a Heap-based Tree Topology”, Tech. Report, IDI, NTNU, Trondheim, 2003.
D'Apuzzo, M., et al., “Parallel Computational Issues of an Interior Point Method for Solving Large Bound-Constrained Quadratic Programming Problems,” Parallel Computing, 29, 467-483, 2003.
Graf, H.P., et al., “Parallel Support Vector Machines: The Cascade SVM”, NIPS, vol. 17, MIT Press, 2005.
Collobert, R. et al., “Scaling Large Learning Problems with Hard Parallel Mixtures,” International Journal of Pattern Recognition and Artificial Intelligence, 2003.
Collobert, Ronan, et al., “A Parallel Mixture of SVMs for Very Large Scale Problems”, Neural Information Processing Systems, vol. 17, Dec. 31, 2004.
Smelyanskiy, Mikhail, et al., “Parallel Computing for Large-Scale Optimization Problems: Challenges and Solutions”, Intel Technical Journal, Compute-Intensive, Highly Parallel Applications and Uses, vol. 9, No. 2, May 19, 2005.
