System, method, and computer-readable medium for verbal...

Data processing: speech signal processing – linguistics – language – Speech signal processing – Recognition

Reexamination Certificate


Details

US Classifications: 704/251, 704/275, 370/260, 379/88.02, 379/158
Document Type: Reexamination Certificate
Status: active
Serial Number: 08060366

ABSTRACT:
A system, method, and computer-readable medium that facilitate verbal control of conference call features are provided. Automatic speech recognition functionality is deployed in a conferencing platform. Hot words that may be identified in speech supplied to a conference call are configured in the conferencing platform. Upon recognition of a hot word, a corresponding feature may be invoked. A speaker may be identified using speaker identification technologies, and that identification may be used to fulfill the speaker's request once a hot word and the speaker are recognized. Particular participants may be granted conference control privileges that are not provided to other participants. Upon recognition of a hot word, the speaker may be identified to determine whether the speaker is authorized to invoke the conference feature associated with that hot word.
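The control flow the abstract describes — match a recognized hot word to a feature, then check the identified speaker's privileges before invoking it — can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the names (`ConferenceBridge`, `HOT_WORDS`, `on_utterance`) and the specific hot words are assumptions, and real deployments would receive `transcript` and `speaker_id` from ASR and speaker-identification engines rather than as plain strings.

```python
# Hypothetical sketch of hot-word dispatch with speaker authorization,
# as outlined in the abstract. All names and hot words are illustrative.

# Map of configured hot words to the conference features they invoke.
HOT_WORDS = {
    "mute all": "mute_all_participants",
    "record": "start_recording",
    "drop last": "drop_last_caller",
}

class ConferenceBridge:
    def __init__(self, privileged_speakers):
        # Participants granted conference control privileges.
        self.privileged = set(privileged_speakers)
        self.invocations = []  # audit log of (speaker, feature)

    def on_utterance(self, speaker_id, transcript):
        """Called with ASR output for each utterance; speaker_id is
        assumed to come from a speaker-identification stage."""
        text = transcript.lower()
        for hot_word, feature in HOT_WORDS.items():
            if hot_word in text:
                return self._invoke(speaker_id, feature)
        return None  # no hot word recognized; ignore the utterance

    def _invoke(self, speaker_id, feature):
        # Authorization check: only privileged speakers may invoke features.
        if speaker_id not in self.privileged:
            return "denied"
        self.invocations.append((speaker_id, feature))
        return feature
```

A privileged speaker saying "please mute all lines" would trigger `mute_all_participants`, while the same hot word from an unprivileged participant would be refused, matching the abstract's authorization step.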

REFERENCES:
patent: 5373555 (1994-12-01), Norris et al.
patent: 5784546 (1998-07-01), Benman, Jr.
patent: 5812659 (1998-09-01), Mauney et al.
patent: 5822727 (1998-10-01), Garberg et al.
patent: 5892813 (1999-04-01), Morin et al.
patent: 5903870 (1999-05-01), Kaufman
patent: 5916302 (1999-06-01), Dunn et al.
patent: 5999207 (1999-12-01), Rodriguez et al.
patent: 6073101 (2000-06-01), Maes
patent: 6273858 (2001-08-01), Fox et al.
patent: 6347301 (2002-02-01), Bearden, III et al.
patent: 6359612 (2002-03-01), Peter et al.
patent: 6374102 (2002-04-01), Brachman et al.
patent: 6535730 (2003-03-01), Chow et al.
patent: 6587683 (2003-07-01), Chow et al.
patent: 6591115 (2003-07-01), Chow et al.
patent: 6606493 (2003-08-01), Chow et al.
patent: 6654447 (2003-11-01), Dewan
patent: 6816468 (2004-11-01), Cruickshank
patent: 6819945 (2004-11-01), Chow et al.
patent: 6853716 (2005-02-01), Shaffer et al.
patent: 7085717 (2006-08-01), Kepuska et al.
patent: 7133512 (2006-11-01), Creamer et al.
patent: 7136684 (2006-11-01), Matsuura et al.
patent: 7187762 (2007-03-01), Celi et al.
patent: 7286990 (2007-10-01), Edmonds et al.
patent: 7474634 (2009-01-01), Webster et al.
patent: 7583657 (2009-09-01), Webster et al.
patent: 7593520 (2009-09-01), Croak et al.
patent: 7617280 (2009-11-01), Webster et al.
patent: 7703104 (2010-04-01), Webster et al.
patent: 7792263 (2010-09-01), D'Amora et al.
patent: 7933226 (2011-04-01), Woodruff et al.
patent: 7949118 (2011-05-01), Edamadaka et al.
patent: 2001/0054071 (2001-12-01), Loeb
patent: 2003/0130016 (2003-07-01), Matsuura et al.
patent: 2003/0231746 (2003-12-01), Hunter et al.
patent: 2004/0105395 (2004-06-01), Friedrich et al.
patent: 2004/0218553 (2004-11-01), Friedrich et al.
patent: 2005/0170863 (2005-08-01), Shostak
patent: 2006/0069570 (2006-03-01), Allison et al.
patent: 2006/0165018 (2006-07-01), Gierach et al.
patent: 2007/0121530 (2007-05-01), Vadlakonda et al.
patent: 2007/0133437 (2007-06-01), Wengrovitz et al.
patent: 2008/0133245 (2008-06-01), Proulx et al.
patent: 2008/0232556 (2008-09-01), Gilbert et al.

Profile ID: LFUS-PAI-O-4266476
