Reexamination Certificate
2008-03-04
Ramos-Feliciano, Eliseo (Department: 2857)
Data processing: measuring, calibrating, or testing
Measurement system
Statistical measurement
C702S182000, C702S188000, C702S189000, C709S206000
active
11186287
ABSTRACT:
The subject invention provides for systems and methods that facilitate optimizing one or more sets of training data by utilizing an Exponential distribution as the prior on one or more parameters in connection with a maximum entropy (maxent) model to mitigate overfitting. Maxent is also known as logistic regression. More specifically, the systems and methods can facilitate optimizing probabilities that are assigned to the training data for later use in machine learning processes, for example. In practice, training data can be assigned their respective weights and then a probability distribution can be assigned to those weights.
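The abstract's core idea can be sketched in code. The following is a minimal illustration, not the patent's exact update rule: MAP training of a binary logistic regression (maxent) model where each weight w_j carries an Exponential prior p(w_j) = alpha * exp(-alpha * w_j). Because the prior's support is w_j >= 0, the log-prior contributes a constant penalty of -alpha per unit of weight, and weights are clipped at zero after each step (projected gradient ascent). All names and hyperparameter values here are illustrative assumptions.

```python
import numpy as np

def train_maxent_exponential_prior(X, y, alpha=0.1, lr=0.1, epochs=200):
    """Projected gradient ascent on the MAP objective:
    (1/n) * log-likelihood(w) - alpha * sum(w), subject to w >= 0.
    The Exponential prior acts like a one-sided L1 penalty, driving
    uninformative weights exactly to zero (mitigating overfitting)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))    # predicted probabilities
        grad = X.T @ (y - p) / n - alpha    # likelihood gradient minus prior penalty
        w = np.maximum(w + lr * grad, 0.0)  # clip at 0: the prior's support
    return w
```

Note the contrast with a Gaussian prior (cited below as Chen and Rosenfeld, 1999), which shrinks weights toward zero but rarely makes them exactly zero; the Exponential prior produces sparse models, which is the sketch's point.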
REFERENCES:
patent: 6125362 (2000-09-01), Elworthy
patent: 6161130 (2000-12-01), Horvitz et al.
patent: 6304841 (2001-10-01), Berger et al.
patent: 6553358 (2003-04-01), Horvitz
patent: 6606620 (2003-08-01), Sundaresan et al.
patent: 6609094 (2003-08-01), Basu et al.
patent: 6697769 (2004-02-01), Goodman et al.
patent: 2003/0105638 (2003-06-01), Taira
patent: 2003/0126102 (2003-07-01), Borthwick
patent: 2004/0260922 (2004-12-01), Goodman et al.
patent: 2005/0096907 (2005-05-01), Bacchiani et al.
patent: 2006/0095521 (2006-05-01), Patinkin
T. Jaakkola, M. Meila, and T. Jebara. Maximum entropy discrimination. Advances in Neural Information Processing Systems 12. Cambridge, MA: MIT Press, 2000, pp. 470-477.
T. Pedersen, R. Bruce, and J. Wiebe. Sequential model selection for word sense disambiguation. Proceedings of the 1997 Conference on Applied Natural Language Processing. Washington, D.C., 1997, pp. 388-395.
R. Lau. Adaptive statistical language modelling. M.S. thesis, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA, 1994. 65 pages.
J. Goodman, Exponential Priors for Maximum Entropy Models, Technical Report, Microsoft Research, Jun. 2003, 14 pages.
J. Goodman, Exponential Priors for Maximum Entropy Models, North American ACL, 2004, 14 pages.
S.F. Chen and J. Goodman. An Empirical Study of Smoothing Techniques for Language Modeling. Computer Speech and Language, 13: 359-394, Oct. 1999.
A. Ratnaparkhi. Maximum Entropy Models for Natural Language Ambiguity Resolution. PhD Thesis, University of Pennsylvania, 1998. 163 pages.
J. Reynar and A. Ratnaparkhi. A Maximum Entropy Approach to Identifying Sentence Boundaries. In ANLP, 1997. 4 pages.
R. Rosenfeld. Adaptive Statistical Language Modeling: A Maximum Entropy Approach. PhD Thesis, Carnegie Mellon University, Apr. 1994. 114 pages.
S. Khudanpur. A Method of Maximum Entropy Estimation with Relaxed Constraints. In 1995 Johns Hopkins University Language Modeling Workshop, 1995. 18 pages.
P.M. Williams. Bayesian Regularization and Pruning using a Laplace Prior. Neural Computation, vol. 7, pp. 117-143, 1995.
M. Banko and E. Brill. Mitigating the Paucity of Data Problem: Exploring the Effect of Training Corpus Size on Classifier Performance for NLP. In Proc. of the Conference on Human Language Technology, 2001. 5 pages.
A.L. Berger, et al. A Maximum Entropy Approach to Natural Language Processing. Computational Linguistics, 22(1): 39-71, 1996.
S.F. Chen and R. Rosenfeld. A Survey of Smoothing Techniques for ME Models. IEEE Transactions on Speech and Audio Processing, vol. 8 No. 1, Jan. 2000. 14 pages.
S. Della Pietra, et al. Inducing Features of Random Fields. IEEE Transactions on Pattern Analysis and Machine Intelligence. 19(4): 380-393, 1997.
I.J. Good. The Population Frequencies of Species and the Estimation of Population Parameters. Biometrika, vol. 40 No. 3/4, pp. 237-264, 1953.
J. Goodman. Classes for Fast Maximum Entropy Training. In ICASSP 2001. 4 pages.
C.M. Kadie, et al. CFW: A Collaborative Filtering System using Posteriors over Weights of Evidence. In Proc. of UAI, pp. 242-250, 2002.
R. Kneser and H. Ney. Improved Backing-off for M-gram Language Modeling. In ICASSP, vol. 1, pp. 181-184, 1995.
W. Newman. An Extension to the Maximum Entropy Method. IEEE Transactions on Information Theory, vol. IT-23, No. 1, Jan. 1977, 5 pages.
J. Darroch and D. Ratcliff. Generalized Iterative Scaling for Log-linear Models. The Annals of Mathematical Statistics, 43: 1470-1480, 1972.
J. Breese, et al., Empirical Analysis of Predictive Algorithms for Collaborative Filtering, in Proceedings of the 14th Conference on Uncertainty in Artificial Intelligence, 1998, pp. 43-52, AUAI, Morgan Kaufmann, San Francisco.
M. Czerwinski, et al., Visualizing Implicit Queries for Information Management and Retrieval, in Proceedings of CHI'99, ACM SIGCHI Conference on Human Factors in Computing Systems, 1999, pp. 560-567, Ass'n for Computing Machinery, Pittsburgh, PA.
S. Dumais, et al., Inductive Learning Algorithms and Representations for Text Categorization, in Proceedings of the 7th Internat'l. Conference on Information and Knowledge Mgmt., 1998, pp. 148-155, Ass'n. for Computing Machinery, ACM Press, NY.
E. Horvitz, Principles of Mixed-Initiative User Interfaces, in Proceedings of CHI'99, ACM SIGCHI Conference on Human Factors in Computing Systems, 1999, pp. 159-166, Ass'n for Computing Machinery, Pittsburgh, PA.
E. Horvitz, et al., Display of Information for Time-Critical Decision Making, in Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, 1995, pp. 296-305, Montreal, Canada, Morgan Kaufmann, San Francisco.
E. Horvitz, et al., The Lumiere Project: Bayesian User Modeling for Inferring the Goals and Needs of Software Users, in Proceedings of the 14th Conference on Uncertainty in Artificial Intelligence, 1998, pp. 256-265, Morgan Kaufmann, San Francisco.
E. Horvitz, et al., Time-Dependent Utility and Action Under Uncertainty, in Proceedings of the 7th Conference on Uncertainty in Artificial Intelligence, 1991, pp. 151-158, Morgan Kaufmann, San Francisco.
E. Horvitz, et al., Time-Critical Action: Representations and Applications, in Proceedings of the 13th Conference on Uncertainty in Artificial Intelligence (UAI'97), 1997, pp. 250-257, Providence, RI, Morgan Kaufmann, San Francisco.
D. Koller, et al., Toward Optimal Feature Selection, in Proceedings of the 13th Conference on Machine Learning, 1996, pp. 284-292, Morgan Kaufmann, San Francisco.
H. Lieberman, An Agent That Assists Web Browsing, in Proceedings of IJCAI-95, 1995, Montreal, Canada, Morgan Kaufmann, San Francisco.
S.F. Chen and R. Rosenfeld. A Gaussian Prior for Smoothing Maximum Entropy Models. Technical Report CMU-CS-99-108, Carnegie Mellon University, Feb. 1999.
Amin, Turocy & Calvin, LLP
Huynh Phuong
Microsoft Corporation