Language input architecture for converting one text form to...

Data processing: speech signal processing, linguistics, language translation, and audio compression/decompression – Linguistics – Translation machine



Details

Subclasses: C704S003000, C704S009000, C704S008000, C704S010000

Type: Reexamination Certificate

Status: active

Application number: 09606807

ABSTRACT:
A language input architecture converts input strings of phonetic text (e.g., Chinese Pinyin) to an output string of language text (e.g., Chinese Hanzi) in a manner that minimizes the typographical errors and conversion errors that occur during conversion from phonetic text to language text. The language input architecture has a search engine, one or more typing models, a language model, and one or more lexicons for different languages. Each typing model is trained on real data and learns the probabilities of typing errors. The typing model is configured to generate a list of probable typing candidates that may be substituted for the input string, based on how likely it is that each candidate string was incorrectly entered as the input string.
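The architecture described in the abstract follows a noisy-channel pattern: the typing model scores how likely the typed phonetic string is a corrupted form of each candidate, the language model scores how likely each candidate is as language text, and the search engine selects the best-scoring conversion. The Python sketch below only illustrates that scoring idea under simplified assumptions; the class names (TypingModel, LanguageModel), the convert function, the fixed error rate, and the tiny lexicon are illustrative, not taken from the patent.

import math

class TypingModel:
    """Toy typing (error) model: log P(typed | intended).
    The patent's typing models are trained on real typing data; here a single
    assumed per-character error rate stands in for those learned probabilities."""
    def __init__(self, error_prob=0.05):
        self.error_prob = error_prob

    def log_prob(self, typed: str, intended: str) -> float:
        if len(typed) != len(intended):
            return math.log(1e-9)  # crude floor for length mismatches
        mismatches = sum(1 for a, b in zip(typed, intended) if a != b)
        matches = len(typed) - mismatches
        return (matches * math.log(1.0 - self.error_prob)
                + mismatches * math.log(self.error_prob))

class LanguageModel:
    """Toy unigram language model over language-text strings (e.g., Hanzi)."""
    def __init__(self, counts):
        total = sum(counts.values())
        self.log_probs = {w: math.log(c / total) for w, c in counts.items()}

    def log_prob(self, text: str) -> float:
        return self.log_probs.get(text, math.log(1e-9))  # unseen-text floor

def convert(typed, lexicon, typing_model, language_model):
    """Return the language-text candidate maximizing
    log P(candidate) + log P(typed | candidate's phonetic form)."""
    best, best_score = None, float("-inf")
    for phonetic, language_text in lexicon.items():
        score = (typing_model.log_prob(typed, phonetic)
                 + language_model.log_prob(language_text))
        if score > best_score:
            best, best_score = language_text, score
    return best

# Example: a mistyped Pinyin string still converts to the intended Hanzi.
lexicon = {"nihao": "你好", "beijing": "北京"}   # illustrative lexicon
lm = LanguageModel({"你好": 10, "北京": 5})      # illustrative counts
tm = TypingModel()
print(convert("nihoa", lexicon, tm, lm))         # prints 你好 despite the transposed letters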

REFERENCES:
patent: 4833610 (1989-05-01), Zamora et al.
patent: 5175803 (1992-12-01), Yeh
patent: 5214583 (1993-05-01), Miike et al.
patent: 5218536 (1993-06-01), McWherter
patent: 5258909 (1993-11-01), Damerau et al.
patent: 5278943 (1994-01-01), Gasper et al.
patent: 5319552 (1994-06-01), Zhong
patent: 5510998 (1996-04-01), Woodruff et al.
patent: 5535119 (1996-07-01), Ito et al.
patent: 5572423 (1996-11-01), Church
patent: 5594642 (1997-01-01), Collins et al.
patent: 5646840 (1997-07-01), Yamauchi et al.
patent: 5671426 (1997-09-01), Armstrong, III
patent: 5704007 (1997-12-01), Cecys
patent: 5715469 (1998-02-01), Arning
patent: 5732276 (1998-03-01), Komatsu et al.
patent: 5781884 (1998-07-01), Pereira et al.
patent: 5806021 (1998-09-01), Chen et al.
patent: 5835924 (1998-11-01), Maruyama et al.
patent: 5893133 (1999-04-01), Chen
patent: 5907705 (1999-05-01), Carter
patent: 5930755 (1999-07-01), Cecys
patent: 5933525 (1999-08-01), Makhoul et al.
patent: 5974371 (1999-10-01), Hirai et al.
patent: 5974413 (1999-10-01), Beauregard et al.
patent: 5987403 (1999-11-01), Sugimura
patent: 6047300 (2000-04-01), Walfish et al.
patent: 6073146 (2000-06-01), Chen
patent: 6131102 (2000-10-01), Potter
patent: 6148285 (2000-11-01), Busardo
patent: 6154758 (2000-11-01), Chiang
patent: 6173252 (2001-01-01), Qiu et al.
patent: 6246976 (2001-06-01), Mukaigawa et al.
patent: 6256630 (2001-07-01), Gilai et al.
patent: 6356866 (2002-03-01), Pratley et al.
patent: 6374210 (2002-04-01), Chu
patent: 6487533 (2002-11-01), Hyde-Thomson et al.
patent: 6490563 (2002-12-01), Hon et al.
patent: 6573844 (2003-06-01), Venolia et al.
patent: 6646572 (2003-11-01), Brand
patent: 0 555 545 (1992-12-01), None
patent: 1158776 (1984-02-01), None
patent: 2158776 (1985-11-01), None
patent: 2248328 (1991-09-01), None
patent: WO 95/17729 (1995-06-01), None
patent: WO 00/10101 (2000-02-01), None
Mei Yuan et al., “A Neural Network for Disambiguating Pinyin Chinese Input,” Proc. of the Computer Assisted Language Instruction Consortium 94 Annual Symposium, Mar. 14-18, 1994, pp. 239-243, XP-001020457, section: Introduction.
Camarda et al., “Entering Asian Text with Input Method Editors,” Using Word 2000, Chapter 26.
