Method and apparatus for adaptively generating field of applicat


Patent


Details

36441908, 395/247, G06F 15/38


active

054446171

ABSTRACT:
A system architecture for providing human-intelligible information by processing a flow of input data, e.g., converting speech (source information) into printable data (target information) based on target-dependent probabilistic models, and for enabling efficient switching from one target field of information to another. To that end, the system is provided with a language modeling device, including a database loadable with an application-dependent corpus of words and/or symbols through a workstation, and a language modeling processor programmed to refresh a tree-organized model in practice: efficiently, without blocking situations, and at a reasonable cost.
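The abstract describes a tree-organized language model that is refreshed from an application-dependent corpus when the target field changes. As an illustrative sketch only — the class and method names below are hypothetical and not taken from the patent — a minimal trie-based count model with such a refresh step could look like this:

```python
# Illustrative sketch, not the patented method: a tree-organized (trie)
# n-gram count model that can be rebuilt ("refreshed") from a new
# application-dependent corpus. All names here are hypothetical.

class TrieNode:
    def __init__(self):
        self.count = 0        # occurrences of the word sequence ending here
        self.children = {}    # next-word -> TrieNode


class TrieLM:
    def __init__(self, order=2):
        self.order = order
        self.root = TrieNode()

    def refresh(self, corpus):
        """Rebuild the tree from a new corpus (a list of tokenized
        sentences). A production system would build the new tree aside
        and swap it in to avoid blocking queries mid-refresh."""
        self.root = TrieNode()
        for sentence in corpus:
            for i in range(len(sentence)):
                node = self.root
                for word in sentence[i:i + self.order]:
                    node = node.children.setdefault(word, TrieNode())
                    node.count += 1

    def prob(self, context, word):
        """Estimate P(word | context) from relative counts along the
        tree path; 0.0 if the context was never seen."""
        node = self.root
        for w in context:
            node = node.children.get(w)
            if node is None:
                return 0.0
        total = sum(c.count for c in node.children.values())
        child = node.children.get(word)
        return child.count / total if child and total else 0.0


# Usage: switching to a new target field is just another refresh() call.
lm = TrieLM(order=2)
lm.refresh([["the", "cat", "sat"], ["the", "cat", "ran"]])
print(lm.prob(["cat"], "sat"))  # 0.5: "cat" is followed by "sat" or "ran"
```

The refresh-by-rebuild approach keeps the query path simple; the patent's claim of "no blocking situations" suggests a more careful in-place or double-buffered update, which this sketch does not attempt to reproduce.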

REFERENCES:
patent: 4942526 (1990-07-01), Okajima et al.
patent: 5005203 (1991-04-01), Ney
patent: 5195167 (1993-03-01), Bahl et al.
patent: 5267165 (1993-11-01), Sirat
Meisel et al., "Efficient Representation of Speech for Recognition," Speech Technology, vol. 5, no. 3, Feb. 1991, New York, US, pp. 96-100; p. 97, left col., paragraph 2 - p. 99, left col., paragraph 1; figures 2, 3.
