Partially supervised machine learning of data classification...

Data processing: artificial intelligence – Machine learning

Reexamination Certificate


Details

U.S. Classification: C382S225000, C382S159000
Kind: Reexamination Certificate
Status: active
Patent number: 07412425

ABSTRACT:
A local-neighborhood Laplacian Eigenmap (LNLE) algorithm is provided for methods and systems for semi-supervised learning on manifolds of data points in a high-dimensional space. In one embodiment, an LNLE-based method includes building an adjacency graph over a dataset of labeled and unlabeled points. The adjacency graph is then used to find a set of local neighbors of an unlabeled data point to be classified. An eigendecomposition of the local subgraph provides a smooth function over the subgraph. The smooth function can be evaluated, and based on that evaluation the unclassified data point can be labeled. In one embodiment, a transductive inference (TI) algorithmic approach is provided. In another embodiment, a semi-supervised inductive inference (SSII) algorithmic approach is provided for classification of subsequent data points. A confidence determination can be based on the number of labeled data points within the local neighborhood. Experimental results comparing the LNLE and simple LE approaches are presented.
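The classification step the abstract describes (build an adjacency graph, find the local neighbors of an unlabeled point, eigendecompose the local graph Laplacian to get a smooth basis, then label the point from a smooth function fit to the labeled neighbors) can be sketched in Python roughly as follows. This is a minimal illustration under stated assumptions, not the patented method: the function name, parameter defaults, binary k-NN weights, the least-squares fit, and the labeled-count confidence proxy are all choices made here for the sketch.

```python
# Illustrative sketch of an LNLE-style transductive classification step.
# Assumes binary labels and a binary symmetrized k-NN adjacency graph;
# these are simplifying choices, not details from the patent.
import numpy as np
from scipy.spatial import cKDTree

def lnle_classify(X, y, query_idx, k=10, n_local=50, n_eig=5):
    """Classify one unlabeled point via a local Laplacian eigenmap.

    X: (n, d) data points (labeled + unlabeled).
    y: (n,) labels, 0/1 for labeled points and -1 for unlabeled points.
    """
    tree = cKDTree(X)
    # Local neighborhood: the n_local points nearest the query
    # (the query itself is included, at distance zero).
    _, local = tree.query(X[query_idx], k=n_local)
    Xl, yl = X[local], y[local]

    # k-NN adjacency graph over the local subset (binary weights).
    _, nbrs = cKDTree(Xl).query(Xl, k=k + 1)
    W = np.zeros((n_local, n_local))
    for i, row in enumerate(nbrs):
        W[i, row[1:]] = 1.0      # skip row[0], the point itself
    W = np.maximum(W, W.T)       # symmetrize

    # Graph Laplacian of the subgraph and its smoothest eigenvectors.
    L = np.diag(W.sum(axis=1)) - W
    _, vecs = np.linalg.eigh(L)
    E = vecs[:, :n_eig]          # smooth basis over the neighborhood

    # Fit a smooth function to the labeled neighbors by least squares,
    # with targets +1 / -1 for the two classes.
    lab = yl >= 0
    targets = np.where(yl[lab] > 0, 1.0, -1.0)
    coef, *_ = np.linalg.lstsq(E[lab], targets, rcond=None)
    f = E @ coef                 # evaluate over the whole neighborhood

    q = int(np.where(local == query_idx)[0][0])
    label = int(f[q] > 0)
    confidence = int(lab.sum())  # crude proxy: labeled count nearby
    return label, confidence
```

Because the eigendecomposition runs on the local subgraph rather than the full dataset, the dominant cost per query scales with `n_local`, not with the total number of points; that locality is the distinction the abstract draws between LNLE and a simple Laplacian Eigenmap over the whole graph.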

REFERENCES:
Belkin, “Laplacian Eigenmaps for Dimensionality Reduction and Data Representation,” 2001.
He et al., “Face Recognition Using Laplacianfaces,” 2005.
Lee et al., “Video-Based Face Recognition Using Probabilistic Appearance Manifolds,” 2003.
Belkin, “Problems of Learning on Manifolds,” 2003.
Aggarwal, C.C. et al., “On the Surprising Behavior of Distance Metrics in High Dimensional Space,” Lecture Notes in Computer Science, 1973, 15 pages.
Beer, I. et al., “RuleBase: an Industry-Oriented Formal Verification Tool,” 33rd Design Automation Conference, ACM, Inc., 1996, 6 pages.
Belkin, M. et al., “Laplacian Eigenmaps for Dimensionality Reduction and Data Representation,” Dec. 8, 2002, pp. 1-28.
Belkin, M. et al., “Semi-Supervised Learning on Manifolds,” Submitted to Journal of Machine Learning Research, Nov. 29, 2002, pp. 1-23.
Belkin, M. et al., “Semi-Supervised Learning on Riemannian Manifolds,” Machine Learning, 2004, pp. 209-239, vol. 56.
Bengio, Y. et al., “Learning Eigenfunctions of Similarity: Linking Spectral Clustering and Kernel PCA,” Technical Report 1232, Feb. 28, 2003, pp. 1-27.
Bengio, Y. et al., “Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering,” Advances in Neural Information Processing Systems, Jul. 25, 2003, pp. 1-10.
Bengio, Y. et al., “Spectral Clustering and Kernel PCA are Learning Eigenfunctions,” Technical Report 1239, Jul. 25, 2003, pp. 1-9.
Castelli, V. et al., “The Relative Value of Labeled and Unlabeled Samples in Pattern Recognition with an Unknown Mixing Parameter,” IEEE Transactions on Information Theory, Nov. 1996, pp. 2102-2117, vol. 42, No. 6.
Chung, Spectral Graph Theory, CBMS Regional Conference Series in Mathematics, No. 92, Jun. 6-10, 1994.
Cox, T.F. et al., Multidimensional Scaling, 1994.
Dijkstra, E. W., “A Note on Two Problems in Connexion with Graphs,” Numerische Mathematik, 1959, pp. 269-271, vol. 1.
Friedman, J. H. et al., “An Algorithm for Finding Best Matches in Logarithmic Expected Time,” ACM Transactions on Mathematical Software, Sep. 1977, pp. 209-226, vol. 3, No. 3.
Golub, G.H. et al., Matrix Computations, 1983.
Kemp, C. et al., “Semi-Supervised Learning with Trees,” Advances in Neural Information Processing Systems, 2003, 8 pages.
Lecun, Y. et al., “The MNIST Database of Handwritten Digits,” [online] [Retrieved on Mar. 10, 2006] Retrieved from the Internet: <URL:http://yann.lecun.com/exdb/mnist/index.html>.
Lehoucq, R.B. et al., ARPACK Users' Guide, 1998, Society for Industrial and Applied Mathematics.
Lehoucq, R.B. et al., “Deflation Techniques for an Implicitly Restarted Arnoldi Iteration,” SIAM Journal on Matrix Analysis and Applications, 1996, pp. 789-821, vol. 17.
MATLAB code, [online] [Retrieved on Mar. 10, 2006] Retrieved from the Internet: <URL:http://people.cs.uchicago.edu/˜misha/ManifoldLearning/MATLAB/Laplacian.tar>.
Ng, A.Y. et al., “On Spectral Clustering: Analysis and an Algorithm,” Advances in Neural Information Processing Systems, 2002, 8 pages, MIT Press.
Omohundro, S.M., “Bumptrees for Efficient Function, Constraint, and Classification Learning,” Technical Report 91-009, 7 pages, International Computer Science Institute, Berkeley, no date.
Rifkin, R. et al., “Local-Neighborhood Laplacian Eigenmaps,” 8 pages, no date.
Roweis, S.T. et al., “Nonlinear Dimensionality Reduction by Locally Linear Embedding,” Science, Dec. 22, 2000, pp. 2323-2326, vol. 290.
Tenenbaum, J.B. et al., “A Global Geometric Framework for Nonlinear Dimensionality Reduction,” Science, Dec. 22, 2000, pp. 2319-2323, vol. 290.

