Data processing: structural design, modeling, simulation, and emulation – Modeling by mathematical expression
Reexamination Certificate
2005-04-12
Phan, Thai (Department: 2128)
Data processing: structural design, modeling, simulation, and emulation
Modeling by mathematical expression
C706S012000, C706S016000, C706S025000
active
06879944
ABSTRACT:
A variational Relevance Vector Machine (RVM) is disclosed. The RVM is a probabilistic basis-function model in which sparsity is achieved through a Bayesian treatment: a prior is introduced over the weights, governed by a set of hyperparameters, with one hyperparameter associated with each weight. An approximation to the joint posterior distribution over weights and hyperparameters is iteratively estimated from the data. In practice, the posterior distributions of many of the weights are sharply peaked around zero, so the corresponding basis functions can be pruned from the model. The variational RVM solves the model with a variational approach, in particular using a factorized (product) approximation to obtain the posterior distribution.
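The mechanism the abstract describes, one precision hyperparameter per weight whose re-estimation drives most posterior weight means to zero, can be illustrated with a minimal sketch. This is not the patented variational method; it is the simpler evidence-based fixed-point update for RVM-style sparse Bayesian regression (after Tipping/MacKay), with an assumed noise precision `beta` and an illustrative pruning cap `prune`:

```python
import numpy as np

def rvm_fit(Phi, t, n_iter=200, beta=25.0, prune=1e6):
    """Sketch of RVM-style sparse Bayesian linear regression.

    One hyperparameter alpha_m per weight; iterating the standard
    evidence-based re-estimation drives the alphas of irrelevant
    basis functions toward infinity, so those weights are pruned.
    """
    N, M = Phi.shape
    alpha = np.ones(M)  # one precision hyperparameter per weight
    for _ in range(n_iter):
        # Gaussian posterior over weights given current hyperparameters
        A = np.diag(alpha)
        Sigma = np.linalg.inv(A + beta * Phi.T @ Phi)
        mu = beta * Sigma @ Phi.T @ t
        # Fixed-point update: gamma_m = 1 - alpha_m * Sigma_mm measures
        # how well-determined weight m is by the data
        gamma = 1.0 - alpha * np.diag(Sigma)
        alpha = gamma / (mu ** 2 + 1e-12)
        alpha = np.minimum(alpha, prune)  # cap to keep the inverse stable
    mu[alpha >= prune] = 0.0  # capped hyperparameters => pruned weights
    return mu, alpha

# Usage: three basis functions, only the first is relevant
x = np.linspace(-1.0, 1.0, 50)
Phi = np.column_stack([x, x ** 2, np.ones_like(x)])
mu, alpha = rvm_fit(Phi, 3.0 * x, beta=100.0)
```

In this example the posterior mean for the first weight stays near 3 while the hyperparameters of the two irrelevant basis functions diverge, which is the "sharply peaked around zero" behavior the abstract refers to. The patented method replaces the point re-estimation above with a variational product approximation to the full joint posterior.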
REFERENCES:
patent: 5684929 (1997-11-01), Cortes et al.
patent: 5692107 (1997-11-01), Simoudis et al.
patent: 5720003 (1998-02-01), Chiang et al.
patent: 5855011 (1998-12-01), Tatsuoka
patent: 6556960 (2003-04-01), Bishop et al.
patent: 6633857 (2003-10-01), Tipping
patent: 6671661 (2003-12-01), Bishop
Matsumoto et al., “Reconstruction and Prediction of Nonlinear Dynamical System: A Hierarchical Bayes Approach with Neural Nets,” Proceedings of the 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing, vol. 2, pp. 1061-1064.
Sollich, P., “Probabilistic interpretations and Bayesian methods for support vector machines,” Ninth International Conference on Artificial Neural Networks (ICANN 99), Conf. Publ. No. 470, vol. 1, pp. 91-96.
Wahba, G.; Lin, Yi; Zhang, Hao, “Margin-like quantities and generalized approximate cross validation for support vector machines,” Proceedings of the 1999 IEEE Signal Processing Society Workshop on Neural Networks for Signal Processing IX, Aug. 1999, pp. 12-20.
Hearst, M.A.; Dumais, S.T.; Osman, E.; Platt, J.; Scholkopf, B., “Support vector machines,” IEEE Intelligent Systems, vol. 13, issue 4, Jul./Aug. 1998, pp. 18-28.
Vapnik, Vladimir N., Statistical Learning Theory, Chapter 10: The Support Vector Method for Estimating Indicator Functions, John Wiley & Sons, Inc., 1998, ISBN 0-471-03003-1.
MacKay, “Bayesian non-linear modelling for the prediction competition,” ASHRAE Transactions, vol. 100, pp. 1053-1062, ASHRAE, Atlanta, Georgia, 1994.
MacKay, “Bayesian interpolation,” Neural Computation, 4(3):415-447, 1992.
MacKay, “The evidence framework applied to classification networks,” Neural Computation, 4(5):720-736, 1992.
Neal, Bayesian Learning for Neural Networks, Lecture Notes in Statistics 118, pp. 15-17, 100-102, 113-116, 147-150, Springer, 1996.
Platt, “Fast training of support vector machines using sequential minimal optimization,” in Advances in Kernel Methods: Support Vector Learning, MIT Press, Cambridge, MA, 1999.
Jaakkola and Jordan, “Bayesian parameter estimation through variational methods,” Jan. 21, 1998, to appear in Statistics and Computing.
Bishop, Christopher
Tipping, Michael
Day, Herng-der
Law Offices of Albert S. Michalik PLLC
Microsoft Corporation
Phan, Thai