Multi-modal content presentation

Data processing: presentation processing of document, operator interface processing – Operator interface – For plural users or sites

Reexamination Certificate


Details

U.S. Classification: 715/251; 715/700; 715/727
Type: Reexamination Certificate
Status: active
Patent number: 7,487,453

ABSTRACT:
A method is provided that includes receiving a user input, the user input having been input in a user interface in one of multiple modalities. The method also includes accessing, in response to receiving the user input, a multi-modality content document including content information and presentation information, the presentation information supporting presentation of the content information in each of the multiple modalities. In addition, the method includes accessing, in response to receiving the user input, metadata for the user interface, the metadata indicating that the user interface provides a first modality and a second modality for interfacing with a user. First-modality instructions are generated based on the accessed multi-modality content document and the accessed metadata, the first-modality instructions providing instructions for presenting the content information on the user interface using the first modality. Second-modality instructions are generated based on the accessed multi-modality content document and the accessed metadata, the second-modality instructions providing instructions for presenting the content information on the user interface using the second modality.
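The abstract's method can be illustrated with a minimal sketch: one modality-independent content document, plus interface metadata naming the supported modalities, is turned into separate per-modality presentation instructions. All names here (`ContentDocument`, `InterfaceMetadata`, `generate_instructions`, and the markup formats produced) are illustrative assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch of the claimed method: a single multi-modality
# content document plus user-interface metadata yields distinct
# presentation instructions for each supported modality.
from dataclasses import dataclass


@dataclass
class ContentDocument:
    content: str                   # modality-independent content information
    presentation: dict            # per-modality presentation hints


@dataclass
class InterfaceMetadata:
    modalities: list              # modalities the interface provides


def generate_instructions(doc: ContentDocument,
                          meta: InterfaceMetadata) -> dict:
    """Generate instructions for presenting the content in each
    modality the interface metadata says is available."""
    instructions = {}
    for modality in meta.modalities:
        hints = doc.presentation.get(modality, {})
        if modality == "visual":
            # First-modality instructions: HTML-style visual markup.
            instructions[modality] = (
                f"<p style='font-size:{hints.get('font_size', 12)}pt'>"
                f"{doc.content}</p>")
        elif modality == "voice":
            # Second-modality instructions: VoiceXML-style speech prompt.
            instructions[modality] = (
                f"<prompt rate='{hints.get('rate', 'medium')}'>"
                f"{doc.content}</prompt>")
    return instructions


doc = ContentDocument(
    content="Flight 204 departs at 9:15",
    presentation={"visual": {"font_size": 14}, "voice": {"rate": "slow"}})
meta = InterfaceMetadata(modalities=["visual", "voice"])
out = generate_instructions(doc, meta)
print(out["visual"])  # screen markup for the first modality
print(out["voice"])   # speech prompt for the second modality
```

The point of the single shared document is that the content string appears once; only the presentation hints differ per modality, so the same input drives both sets of generated instructions.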

REFERENCES:
patent: 6745163 (2004-06-01), Brocious et al.
patent: 6807529 (2004-10-01), Johnson et al.
patent: 7020841 (2006-03-01), Dantzig et al.
patent: 7107533 (2006-09-01), Duncan et al.
patent: 7210098 (2007-04-01), Sibal et al.
patent: 2003/0046346 (2003-03-01), Mumick et al.
patent: 2003/0071833 (2003-04-01), Dantzig et al.
patent: 2003/0146932 (2003-08-01), Weng et al.
patent: 2005/0055635 (2005-03-01), Bargeron et al.
patent: 2005/0102606 (2005-05-01), Sasaki et al.
patent: 2005/0131911 (2005-06-01), Chi et al.
patent: 2005/0273759 (2005-12-01), Lucassen et al.
patent: 1 526 447 (2005-04-01), None
patent: 1 526 448 (2005-04-01), None
Hodas et al., "NOVeLLA: A Multi-Modal Electronic-Book Reader with Visual and Auditory Interfaces," International Journal of Speech Technology, Apr. 6, 2001.
Eric Blechsmitt et al., "An Architecture to Provide Adaptive, Synchronized and Multimodal Human Computer Interaction," Multimedia '02, Juan-Les-Pins, France, Dec. 1-6, 2002, pp. 287-290.
Robbie Schaefer et al., "A Novel Dialog Model for the Design of Multimodal User Interfaces," Hamburg, Germany, Jul. 11-13, 2004, 2 pages.
"3G Mobile Context Sensitive Adaptability - User Friendly Mobile Work Place for Seamless Enterprise Applications," Project No. IST-2001-32407, RIML Language Specification Version 2, Mar. 17, 2004, pp. 1-105.
Nicolas Chevassus, "A Framework for Authoring & Exploiting Multimodal Documentation," W3C Seminar, Jun. 21, 2005, pp. 1-29.
Liu et al., "An Architecture of Wireless Web and Dialogue System Convergence for Multimodal Service Interaction Over Converged Networks," Proceedings of the 27th Annual International Computer Software and Applications Conference, Dallas, TX, 2003, 6 pages.
"Multimodal Architecture and Interfaces: W3C Working Draft" [online]. W3C, 2005 [retrieved on Apr. 17, 2007]. Retrieved from the Internet: <URL: http://www.w3.org/TR/2005/WD-mmi-arch-20050422/>, 15 pages.
