Transparent monitoring and intervention to improve automatic...

Data processing: speech signal processing, linguistics, language – Speech signal processing – Recognition


Details

Type: Reexamination Certificate

Status: active

Patent Number: 7,660,715

ABSTRACT:
A system and method for improving the automatic adaptation of one or more speech models in automatic speech recognition systems. After a dialog begins, the dialog asks the customer for spoken input, which is recorded. If the speech recognizer determines that it may not have correctly transcribed the verbal response (i.e., the voice input), the invention uses monitoring and, if necessary, intervention to ensure that the next transcription of the verbal response is correct. The dialog asks the customer to repeat the verbal response; the repetition is recorded, and a transcription of the input is sent to a human monitor (i.e., an agent or operator). If the transcription of the spoken input is correct, the human does not intervene and the transcription remains unmodified. If the transcription is incorrect, the human intervenes and corrects the misrecognized word. In either case, the dialog then asks the customer to confirm the unmodified or corrected transcription. Once the customer confirms, the dialog continues, and the customer is unlikely to hang up in frustration, since in most cases only a single misrecognition has occurred. Finally, the invention uses the first and second customer recordings of the misrecognized word or utterance, together with the corrected or unmodified transcription, to automatically adapt one or more speech models, improving the performance of the speech recognition system.

REFERENCES:
patent: 4468804 (1984-08-01), Kates et al.
patent: 4696039 (1987-09-01), Doddington
patent: 4852170 (1989-07-01), Bordeaux
patent: 5018200 (1991-05-01), Ozawa
patent: 5206903 (1993-04-01), Kohler et al.
patent: 5583969 (1996-12-01), Yoshizumi et al.
patent: 5634086 (1997-05-01), Rtischev et al.
patent: 5644680 (1997-07-01), Bielby et al.
patent: 5684872 (1997-11-01), Flockhart et al.
patent: 5802149 (1998-09-01), Hanson
patent: 5828747 (1998-10-01), Fisher et al.
patent: 5905793 (1999-05-01), Flockhart et al.
patent: 5982873 (1999-11-01), Flockhart et al.
patent: 6064731 (2000-05-01), Flockhart et al.
patent: 6084954 (2000-07-01), Harless et al.
patent: 6088441 (2000-07-01), Flockhart et al.
patent: 6122614 (2000-09-01), Kahn et al.
patent: 6151571 (2000-11-01), Petrushin
patent: 6163607 (2000-12-01), Bogart et al.
patent: 6173053 (2001-01-01), Bogart et al.
patent: 6178400 (2001-01-01), Eslambolchi
patent: 6192122 (2001-02-01), Flockhart et al.
patent: 6243680 (2001-06-01), Gupta et al.
patent: 6259969 (2001-07-01), Tackett et al.
patent: 6275806 (2001-08-01), Petrushin
patent: 6275991 (2001-08-01), Erlin
patent: 6278777 (2001-08-01), Morley et al.
patent: 6292550 (2001-09-01), Burritt
patent: 6314165 (2001-11-01), Junqua et al.
patent: 6353810 (2002-03-01), Petrushin
patent: 6363346 (2002-03-01), Walters
patent: 6374221 (2002-04-01), Haimi-Cohen
patent: 6389132 (2002-05-01), Price
patent: 6408273 (2002-06-01), Quagliaro et al.
patent: 6427137 (2002-07-01), Petrushin
patent: 6463415 (2002-10-01), St. John
patent: 6480826 (2002-11-01), Petrushin
patent: 6697457 (2004-02-01), Petrushin
patent: 6766014 (2004-07-01), Flockhart et al.
patent: 6801888 (2004-10-01), Hejna, Jr.
patent: 6823312 (2004-11-01), Mittal et al.
patent: 6839669 (2005-01-01), Gould et al.
patent: 6847714 (2005-01-01), Das et al.
patent: 6889186 (2005-05-01), Michaelis
patent: 6940951 (2005-09-01), Mahoney
patent: 6999563 (2006-02-01), Thorpe et al.
patent: 7065485 (2006-06-01), Chong-White et al.
patent: 7180997 (2007-02-01), Knappe
patent: 7222074 (2007-05-01), Zhou
patent: 7222075 (2007-05-01), Petrushin
patent: 7267652 (2007-09-01), Coyle et al.
patent: 2002/0019737 (2002-02-01), Stuart et al.
patent: 2003/0191639 (2003-10-01), Mazza
patent: 2004/0148161 (2004-07-01), Das et al.
patent: 2004/0215453 (2004-10-01), Orbach
patent: 2005/0065789 (2005-03-01), Yacoub et al.
patent: 2005/0094822 (2005-05-01), Swartz
patent: 2006/0036437 (2006-02-01), Bushey et al.
patent: 2006/0252376 (2006-11-01), Fok
patent: 2007/0038455 (2007-02-01), Murzina
patent: 1333425 (1994-12-01), None
patent: 0076687 (1983-04-01), None
patent: 0140249 (1985-05-01), None
patent: 0360265 (1990-03-01), None
patent: 10-124089 (1998-05-01), None
patent: WO 00/22611 (2000-04-01), None
patent: WO 2004/056086 (2004-07-01), None
U.S. Appl. No. 11/131,108, filed May 16, 2005, Michaelis.
U.S. Appl. No. 11/508,442, filed Aug. 22, 2006, Coughlan.
U.S. Appl. No. 11/508,477, filed Aug. 22, 2006, Michaelis.
U.S. Appl. No. 11/768,567, filed Jun. 26, 2007, Coughlan.
U.S. Appl. No. 11/388,694, filed Mar. 24, 2006, Blair et al.
U.S. Appl. No. 10/882,975, filed Jun. 30, 2004, Becker et al.
Arslan, Levent M., "Foreign Accent Classification in American English," thesis, pp. 1-200, Department of Electrical and Computer Engineering, Duke University, 1996.
Arslan, Levent M., et al., "Language Accent Classification in American English," Robust Speech Processing Laboratory, Department of Electrical Engineering, Durham, North Carolina, Technical Report RSPL-96-7 (1996).
Hansen, John H.L., et al., “Foreign Accent Classification Using Source Generator Based Prosodic Features,” IEEE Proc. ICASSP, vol. 1, Detroit U.S.A., (1995), pp. 836-839.
Hosom, John-Paul, et al., “Training Neural Networks for Speech Recognition,” Center for Spoken Language Understanding, Oregon Graduate Institute of Science and Technology (Feb. 2, 1999), 51 pages.
Jackson, Philip J.B., et al., “Aero-Acoustic Modeling of Voiced and Unvoiced Fricatives Based on MRI Data,” University of Birmingham and University of Southampton, (undated), 4 pages.
Kirriemuir, John, "Speech Recognition Technologies," TSW 03-03 (Mar. 2003), 13 pages.
Lamel, L.F., et al., “Language Identification Using Phone-based Acoustic Likelihoods,” ICASSP-94.
Loizou, Philip, “Speech Production and Perception,” EE 6362 Lecture Notes (Fall 2000), pp. 1-30.
Michaelis, “Speech Digitization and Compression”, In W. Warkowski (Ed.), International Encyclopedia of Ergonomics and Human Factors. London: Taylor Francis, 2001, 683-686.
"Pervasive, Human-Centered Computing," MIT Project Oxygen, MIT Laboratory for Computer Science, Jun. 2000.
Zue, Victor, “The MIT Oxygen Project,” MIT Laboratory for Computer Science, Apr. 25-26, 2000.
Novak, D., Cuesta-Frau, and Lhotska, L., "Speech Recognition Methods Applied to Biomedical Signals Processing," Engineering in Medicine and Biology Society, vol. 1 (2004), pp. 118-121.
Entwistle, "The Performance of Automated Speech Recognition Systems Under Adverse Conditions of Human Exertion," Int. J. Hum.-Comput. Interact., vol. 16, no. 2 (2003), pp. 127-140.
Entwistle, “Training Methods and Enrollment Techniques to Improve the Performance of Automated Speech Recognition Systems Under Conditions of Human Exertion”, A Dissertation Submitted in Partial Fulfillment of The Requirements for the Degree of Doctor of Philosophy, University of South Dakota, Jul. 2005.
Landauer et al., “An Introduction to Latent Semantic Analysis”, Discourse Processes, 1998, 41 pages.
Lin et al., “Phoneme-less Hierarchical Accent Classification”, HP Laboratories Palo Alto, Oct. 4, 2004, 5 pages.
Background of the Invention for the above-captioned application (previously provided).

Profile ID: LFUS-PAI-O-4163843
