Training machine learning by sequential conditional...

Data processing: speech signal processing, linguistics, language, natural language

Reexamination Certificate


Details

Classification: C704S240000, C704S243000, C704S257000
Type: Reexamination Certificate
Status: active
Number: 11465102

ABSTRACT:
A system and method facilitating the training of machine learning systems utilizing sequential conditional generalized iterative scaling is provided. The invention includes an expected value update component that modifies an expected value based, at least in part, upon a feature function of an input vector and an output value, a sum-of-lambda variable, and a normalization variable. The invention further includes an error calculator that calculates an error based, at least in part, upon the expected value and an observed value. The invention also includes a parameter update component that modifies a trainable parameter based, at least in part, upon the error. Also provided is a variable update component that updates at least one of the sum-of-lambda variable and the normalization variable based, at least in part, upon the error.
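The four components named in the abstract (expected value update, error calculation, parameter update, and incremental update of the sum-of-lambda and normalization variables) can be sketched as a training loop for a conditional maximum entropy model p(y|x). The following is a minimal illustrative sketch under assumptions not stated in the patent: binary indicator features, a tiny hand-made dataset, and the log-ratio step size used in generalized iterative scaling. All names (`active`, `lam`, `s`, `z`) are hypothetical.

```python
import math

NUM_CLASSES = 2
NUM_FEATURES = 2

# active[j][y] lists the indices of binary features f_i(x_j, y) that
# fire for instance j and output y. Here feature 0 fires for class 0 on
# every instance, and feature 1 fires for class 0 on the first three.
active = [
    [[0, 1], []],
    [[0, 1], []],
    [[0, 1], []],
    [[0], []],
    [[0], []],
]
labels = [0, 0, 1, 0, 1]        # observed output for each instance

lam = [0.0] * NUM_FEATURES      # trainable parameters (lambdas)
# Cached per-instance "sum of lambda" variable s[j][y] and
# normalization variable z[j], maintained incrementally.
s = [[0.0] * NUM_CLASSES for _ in active]
z = [float(NUM_CLASSES) for _ in active]

# Observed (empirical) count of each feature on the training data.
observed = [0.0] * NUM_FEATURES
for j, y in enumerate(labels):
    for i in active[j][y]:
        observed[i] += 1.0

for _ in range(200):                # training passes over the features
    for i in range(NUM_FEATURES):   # sequentially, one feature at a time
        # Expected count of feature i under the current model, read off
        # the cached sum-of-lambda and normalization variables.
        expected = sum(
            math.exp(s[j][y]) / z[j]
            for j in range(len(active))
            for y in range(NUM_CLASSES)
            if i in active[j][y]
        )
        # The observed/expected error drives the parameter update; for
        # binary features the GIS step is log(observed / expected).
        delta = math.log(observed[i] / expected)
        lam[i] += delta
        # Incrementally repair s and z only where feature i fires,
        # rather than recomputing every normalization from scratch.
        for j in range(len(active)):
            for y in range(NUM_CLASSES):
                if i in active[j][y]:
                    z[j] -= math.exp(s[j][y])
                    s[j][y] += delta
                    z[j] += math.exp(s[j][y])
```

After training, the model probability is p(y|x_j) = exp(s[j][y]) / z[j]. On this toy data the probability of class 0 converges to 2/3 on the first three instances and 1/2 on the last two, matching the empirical feature counts.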

REFERENCES:
patent: 5510981 (1996-04-01), Berger et al.
patent: 5640487 (1997-06-01), Lau et al.
patent: 6304841 (2001-10-01), Berger et al.
patent: 6314399 (2001-11-01), Deligne et al.
patent: 6697769 (2004-02-01), Goodman et al.
patent: 2001/0003174 (2001-06-01), Peters
“Inducing Features of Random Fields”; Stephen Della Pietra, et al.; IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, No. 4, Apr. 1997, pp. 1-13.
“A Maximum Entropy Approach to Natural Language Processing”; Adam L. Berger, et al.; IBM T.J. Watson Research Center; Computational Linguistics, vol. 22, No. 1; pp. 1-36, 1996.
“Classes for Fast Maximum Entropy Training”; Joshua Goodman; Microsoft Research, 2001.
“Logistic Regression, AdaBoost and Bregman Distances”; Michael Collins, et al.; Proceedings of the Thirteenth Annual Conference on Computational Learning Theory, Oct. 11, 2000; pp. 1-26.
Frederick Jelinek; “Statistical Methods for Speech Recognition”; pp. 1-283, 1998.
M. Banko, et al.; “Mitigating the Paucity of Data Problem: Exploring the Effect of Training Corpus Size on Classifier Performance for Natural Language Processing”; 2001; 5 pages.
David T. Brown; “A Note on Approximations to Discrete Probability Distributions”; Information and Control 2, 386-392 (1959).
Stanley F. Chen; “A Gaussian Prior for Smoothing Maximum Entropy Models”; Technical Report CMU-CS-99-108; Computer Science Department, Carnegie Mellon University; Feb. 1999; pp. 1-23.
J. N. Darroch, et al.; “Generalized Iterative Scaling for Log-Linear Models”; The Annals of Mathematical Statistics, 43, No. 5; pp. 1470-1480, 1972.
John Lafferty, et al.; “Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data”; School of Computer Science, Carnegie Mellon University; 8 pages, 2001.
John D. Lafferty; “Gibbs-Markov Models”; In Computing Science and Statistics: Proceedings of the 27th Symposium on the Interface; 8 pages, 1996.
Thomas P. Minka; “Algorithms for Maximum-Likelihood Logistic Regression”; Oct. 2001; pp. 1-15. Available from http://www-white.media.mit.edu/tpminka/papers/learning.html.
Adwait Ratnaparkhi; “Maximum Entropy Models for Natural Language Ambiguity Resolution”; Ph.D. Thesis, University of Pennsylvania; 1998; pp. 1-147.
Jeffrey C. Reynar, et al.; “A Maximum Entropy Approach to Identifying Sentence Boundaries”; Department of Computer and Information Science, University of Pennsylvania; 1997; 4 pages.
Ronald Rosenfeld; “Adaptive Statistical Language Modeling: A Maximum Entropy Approach”; Ph.D. Thesis, Carnegie Mellon University; Apr. 19, 1994.
Joshua Goodman, Sequential Conditional Generalized Iterative Scaling, Proceedings of the 40th Annual Meeting of the ACL, Jul. 2002, pp. 9-16.
Jun Wu, et al.; “Efficient Training Methods for Maximum Entropy Language Modeling”; in ICSLP, vol. 3; 2000; pp. 114-117.


Profile ID: LFUS-PAI-O-3720967
