Method for combining multi-modal queries for search of...

Data processing: database and file management or data structures – Database design – Data structure types

Reexamination Certificate


Details

C707S793000


Status: active

Patent number: 06507838

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention generally relates to systems that search different modes of media (audio, text, etc.) based upon a query and more particularly to an improved system that ranks search results based upon a time overlap between the matches in different media modes.
2. Description of the Related Art
Information can be stored in many different forms (modes). Before the advent of audio recordings, the only means of recording information was the written word (or written symbols) or numbers. Subsequently, audio and video recordings were used to supplement or replace written information. Regardless of the mode in which information is recorded, there is always a need to search it so that only the relevant portions must be reviewed when a user has a question (query) on a very specific topic.
Conventional searches primarily involve keyword queries of previously created text or textual summaries. Thus, it is common to perform a simple Boolean combination such as AND/OR, or to perform a search based on the individual relevance scores of the textual data. However, with the increasing use of different media modes to record information, there is a need to logically search video, audio, and graphics, as well as textual information. The invention described below provides a method and system in which the different media modes are searched and their results combined into a response to a query in a way that exploits the co-occurrence in time of the matches found in the individual media modes.
SUMMARY OF THE INVENTION
It is, therefore, an object of the present invention to provide a structure and method for searching multi-media data including audio, video, graphic display, and written data using a query. The method comprises processing the individual media modes of the data against the query and retrieving candidate matches in the individual media modes, each marked with its relevance score and its time of occurrence in that media mode; identifying overlapping time periods among the individual media matches; combining the relevance scores and the noted overlap in time periods into a score for ranking the matches; and returning the matches with the higher overall scores as the overall candidate matches to the query.
This way of ranking accounts for errors in the search of individual media modes by relying on an indication of a common location for a match across the individual media modal searches, as evidenced by a large amount of time overlap between the individual modal matches. The method admits any media mode in which a time can be associated with a match. Thus, if the data includes a video mode and a textual script mode and is searched using a text query, it is assumed that the textual matches to the query can be assigned a time of occurrence with reference to the time in the video. Similarly, if the query requires a search of the audio mode of the data as queried through a text keyword, it is assumed that the audio matches to the queried textual keyword can be assigned a time of occurrence with reference to the time in the audio track.
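As a minimal sketch, the overlap-weighted combination described above might look like the following Python. The `ModalMatch` structure, the pairwise combination rule, and the normalization by the union span are illustrative assumptions, not the patent's exact formula:

```python
from dataclasses import dataclass

@dataclass
class ModalMatch:
    mode: str      # e.g. "audio", "video", "text" (illustrative labels)
    score: float   # relevance score from the single-mode search
    start: float   # start time of the match, in seconds
    end: float     # end time of the match, in seconds

def overlap(a: ModalMatch, b: ModalMatch) -> float:
    """Length of the time interval shared by two modal matches."""
    return max(0.0, min(a.end, b.end) - max(a.start, b.start))

def combined_score(a: ModalMatch, b: ModalMatch) -> float:
    """Combine two modal relevance scores, weighted by the fraction of
    their union span that the two matches overlap (an assumed weighting)."""
    o = overlap(a, b)
    if o == 0.0:
        return 0.0
    span = max(a.end, b.end) - min(a.start, b.start)
    return (a.score + b.score) * (o / span)

def rank_matches(matches: list[ModalMatch]) -> list[tuple[float, ModalMatch, ModalMatch]]:
    """Rank all cross-mode pairs by their combined, overlap-weighted score."""
    pairs = []
    for i, a in enumerate(matches):
        for b in matches[i + 1:]:
            if a.mode != b.mode:
                s = combined_score(a, b)
                if s > 0.0:
                    pairs.append((s, a, b))
    return sorted(pairs, key=lambda p: p[0], reverse=True)
```

A text match at 10-20 s and an audio match at 15-25 s overlap for 5 s of their 15 s union, so their combined score is boosted in proportion; matches that never coincide in time contribute nothing jointly.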
The invention also provides a method of searching multi-media data having different modes using a query, the method including processing the multi-media data to extract relevance scores and time reference points of matches in the individual media modes, identifying overlapping time periods when two or more of the modal matches correspond to the query, and ranking the relevance of the overlapping time periods. The ranking includes finding the overlapping time period having the highest relevance score, segmenting the overlapping time period to identify beginning and ending events, calculating a relevance distribution based on the frequency of occurrence of the query in a time period, and finding the largest number of different modes of overlap. The modes include two or more of audio, video, text, and graphic display. The query can have an input mode based on any of the modes, and the method further includes outputting results of the query in a mode consistent with the input mode.
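Finding the time period covered by the largest number of different modes, the last ranking criterion above, can be sketched with a standard sweep-line pass over the match intervals. The function name and input layout here are hypothetical:

```python
from collections import defaultdict

def modes_overlapping(intervals_by_mode: dict[str, list[tuple[float, float]]]):
    """Sweep-line over (start, end) match intervals grouped by mode.
    Returns the greatest number of distinct modes active at once and the
    time span(s) over which that maximum depth is achieved."""
    events = []
    for mode, intervals in intervals_by_mode.items():
        for start, end in intervals:
            events.append((start, 1, mode))   # interval opens
            events.append((end, -1, mode))    # interval closes
    events.sort()  # at equal times, closes (-1) sort before opens (+1)

    active = defaultdict(int)  # mode -> number of currently open intervals
    best_depth, best_spans, prev_t = 0, [], None
    for t, delta, mode in events:
        # Depth of distinct modes over the span (prev_t, t), before this event.
        depth = sum(1 for c in active.values() if c > 0)
        if prev_t is not None and t > prev_t:
            if depth > best_depth:
                best_depth, best_spans = depth, [(prev_t, t)]
            elif depth == best_depth and depth > 0:
                best_spans.append((prev_t, t))
        active[mode] += delta
        prev_t = t
    return best_depth, best_spans
```

With text matching over 0-10 s, audio over 5-15 s, and video over 8-12 s, all three modes coincide only during 8-10 s, so that span would be ranked first under this criterion.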
The method can also comprise searching multi-media data including audio, video, graphic display, and written data using a query; processing the multi-media data to extract relevance scores and time reference points; identifying portions of the matching media modes that correspond to the query; determining a relevance score for each matching mode; assigning time periods to each matching mode; identifying overlapping time periods; determining a relevance timing score for the overlapping time periods; and ranking the matching modes based on the relevance score and the relevance timing score.


REFERENCES:
patent: 5644686 (1997-07-01), Hekmatpour
patent: 5802361 (1998-09-01), Wang et al.
patent: 5983214 (1999-11-01), Lang et al.
patent: 6029195 (2000-02-01), Herz
patent: 6243724 (2001-06-01), Mander et al.
patent: 6381605 (2002-04-01), Kothuri et al.
patent: 6389168 (2002-05-01), Altunbasak et al.
patent: 6404925 (2002-06-01), Foote et al.

