Method for comparing two moving pictures and retrieval...

Pulse or digital communications – Bandwidth reduction or expansion – Television or motion video signal

Reexamination Certificate


Details

Type: Reexamination Certificate
Status: active
Patent number: 06816551


BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to techniques for moving-picture retrieval, and in particular to a method for comparing two moving pictures.
2. Description of the Related Art
Various moving-picture retrieval methods have been proposed. For example, a similarity-based moving-picture reproduction technique is disclosed in Japanese Patent Application Unexamined Publication No. 8-106543. This conventional technique relates to a method for automatically accessing a target frame by computing the degree of similarity between a reference frame and each frame of a moving picture. Similarity-based retrieval is well suited to visual information because retrieved pictures need not match a reference picture exactly; they need only be similar to it.
When two moving pictures are compared to compute the degree of similarity between them, however, the following problems arise.
First, when at least one of the two moving pictures has a variable time interval between feature-extraction time positions, the degree of similarity cannot be computed precisely.
Second, when the two moving pictures have the same time interval between feature-extraction time positions but a high frame rate, a large number of frames must be selected for feature extraction, which increases the scale of the moving-picture database.
Third, when the two moving pictures have different frame rates, the degree of similarity likewise cannot be computed precisely. For example, the television standards NTSC, PAL, and SECAM prescribe different transmission rates: 60 Hz for NTSC and 50 Hz for PAL and SECAM. Since the frame-sampling positions used for feature extraction do not coincide between different television standards, a precise similarity computation cannot be obtained.
SUMMARY OF THE INVENTION
It is an object of the present invention to provide a moving-picture comparison method allowing the degree of similarity to be precisely computed.
It is another object of the present invention to provide a moving-picture comparison method and retrieval system that allow the degree of similarity to be precisely computed without increasing the scale of a moving-picture database.
According to the present invention, a method for comparing a first moving picture and a second moving picture, includes the steps of: a) generating first moving-picture feature data from first moving-picture data according to first timing information; b) generating second moving-picture feature data from second moving-picture data according to second timing information; c) changing at least one of the first moving-picture feature data and the second moving-picture feature data based on the first and second timing information so that one of the first moving-picture feature data and the second moving-picture feature data exists at a time position of the other; and d) comparing the first moving-picture feature data and the second moving-picture feature data.
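The following Python sketch illustrates steps (a) through (d) under simplifying assumptions, and is not the patent's implementation: feature data is modelled as (time position, feature vector) pairs, the timing information is the list of time positions itself, a nearest-earlier-sample hold stands in for the change of step (c), and a mean L1 distance stands in for the comparison of step (d). The names resample and compare are illustrative only.

```python
# Illustrative sketch only; not the patent's implementation.
from typing import List, Tuple

Feature = List[float]                        # e.g. a per-frame colour histogram
TimedFeatures = List[Tuple[float, Feature]]  # (time position, feature) pairs

def resample(features: TimedFeatures, target_times: List[float]) -> TimedFeatures:
    """Step (c): change one feature sequence so that it exists at the other
    sequence's time positions, here by holding the nearest earlier sample."""
    resampled = []
    for t in target_times:
        earlier = [f for ts, f in features if ts <= t]
        resampled.append((t, earlier[-1] if earlier else features[0][1]))
    return resampled

def compare(first: TimedFeatures, second: TimedFeatures) -> float:
    """Step (d): mean L1 distance after alignment; smaller means more similar."""
    times = [t for t, _ in first]
    second_on_first = resample(second, times)
    return sum(
        sum(abs(a - b) for a, b in zip(f1, f2))
        for (_, f1), (_, f2) in zip(first, second_on_first)
    ) / len(first)
```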
According to an aspect of the present invention, a method for computing a degree of similarity between two moving pictures, includes the steps of: a) retrievably storing moving-picture data of each of a plurality of moving pictures and moving-picture feature data generated from the moving-picture data according to feature extraction timing information of the moving-picture data; b) inputting query moving-picture data and query feature extraction timing information; c) generating query moving-picture feature data from the query moving-picture data according to the query feature extraction timing information; d) reading moving-picture feature data of a selected one of the moving pictures stored, as candidate moving-picture feature data, wherein the candidate moving-picture feature data was generated from the selected one according to candidate feature extraction timing information; e) changing at least one of the query moving-picture feature data and the candidate moving-picture feature data based on the query and candidate feature extraction timing information so that one of the query moving-picture feature data and the candidate moving-picture feature data exists at a time position of the other; and f) computing the degree of similarity between the query moving-picture feature data and the candidate moving-picture feature data, at least one of which has been changed at the step (e).
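A hypothetical retrieval loop over such a database, reusing the compare helper from the sketch above, might look as follows; the dictionary-based database layout and the use of a distance (smaller is more similar) are assumptions made only for illustration.

```python
# Hypothetical retrieval loop; the database layout is an assumption, not the patent's.
def most_similar(query_features, database):
    """database maps a picture identifier to its stored candidate feature data,
    i.e. the same (time position, feature) pairs used above."""
    best_id, best_distance = None, float("inf")
    for picture_id, candidate_features in database.items():     # step (d)
        distance = compare(query_features, candidate_features)  # steps (e)-(f)
        if distance < best_distance:
            best_id, best_distance = picture_id, distance
    return best_id, best_distance

# Example use (hypothetical data):
# database = {"clip_a": features_a, "clip_b": features_b}
# picture_id, distance = most_similar(query_features, database)
```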
The step (e) may include the steps of: e.1) generating reference timing information; e.2) comparing time positions designated by the query feature extraction timing information with those designated by the reference timing information; e.3) changing the query moving-picture feature data so that the query moving-picture feature data exists at a time position designated by the reference timing information; e.4) comparing time positions designated by the candidate feature extraction timing information with those designated by the reference timing information; and e.5) changing the candidate moving-picture feature data so that the candidate moving-picture feature data exists at a time position designated by the reference timing information.
The step (e) may include the steps of: e.1) generating reference timing information from the query and candidate feature extraction timing information according to a predetermined rule; e.2) comparing time positions designated by the query feature extraction timing information with those designated by the reference timing information; e.3) changing the query moving-picture feature data so that the query moving-picture feature data exists at a time position designated by the reference timing information; e.4) comparing time positions designated by the candidate feature extraction timing information with those designated by the reference timing information; and e.5) changing the candidate moving-picture feature data so that the candidate moving-picture feature data exists at a time position designated by the reference timing information.
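Both reference-timing variants can be sketched as follows, again reusing the resample helper from the first sketch. The two variants differ only in how the reference timing information is generated; the regular grid used here is just one illustrative choice of a predetermined rule.

```python
# Illustrative reference-timing variant; the regular grid is an assumption.
def reference_times(query_times, candidate_times, step=1.0):
    """Step (e.1): one possible predetermined rule, a regular grid (every
    `step` seconds) spanning both sequences of time positions."""
    start = min(query_times[0], candidate_times[0])
    end = max(query_times[-1], candidate_times[-1])
    count = int((end - start) / step) + 1
    return [start + i * step for i in range(count)]

def align_to_reference(query, candidate, ref_times):
    """Steps (e.2)-(e.5): change both sequences so that they exist at the time
    positions designated by the reference timing information."""
    return resample(query, ref_times), resample(candidate, ref_times)
```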
The step (e) may include the steps of: e.1) comparing time positions designated by the query feature extraction timing information with those designated by the candidate feature extraction timing information; and e.2) interpolating one of the query moving-picture feature data and the candidate moving-picture feature data so that an interpolated one exists at a time position of the other.
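A sketch of this interpolation variant is given below, operating on the same (time position, feature vector) pairs as the earlier sketches; linear interpolation between the two nearest samples is used here only as an example, since the patent text does not prescribe a particular interpolation formula.

```python
# Illustrative interpolation variant; linear interpolation is an assumption.
def interpolate(features, target_times):
    """Step (e.2): interpolate one sequence so that it exists at the other's
    time positions, blending the nearest samples on either side."""
    result = []
    for t in target_times:
        before = [(ts, f) for ts, f in features if ts <= t]
        after = [(ts, f) for ts, f in features if ts >= t]
        if not before:
            result.append((t, after[0][1]))
        elif not after:
            result.append((t, before[-1][1]))
        else:
            (t0, f0), (t1, f1) = before[-1], after[0]
            w = 0.0 if t1 == t0 else (t - t0) / (t1 - t0)
            result.append((t, [a + w * (b - a) for a, b in zip(f0, f1)]))
    return result
```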
The step (e) may include the steps of: e.1) comparing time positions designated by the query feature extraction timing information with those designated by the candidate feature extraction timing information; e.2) selecting common time positions which are designated by both the query feature extraction timing information and the candidate feature extraction timing information; and e.3) changing at least one of the query moving-picture feature data and the candidate moving-picture feature data so that each of the query moving-picture feature data and the candidate moving-picture feature data exists only at the common time positions.
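In this common-time-position variant no feature value has to be synthesised; both sequences are simply restricted to the time positions designated by both timing informations, as in the following sketch (exact equality of time positions is assumed here, whereas a practical implementation might match within a tolerance).

```python
# Illustrative common-time-position variant; exact timestamp equality is assumed.
def keep_common_times(query, candidate):
    common = {t for t, _ in query} & {t for t, _ in candidate}      # step (e.2)
    restrict = lambda seq: [(t, f) for t, f in seq if t in common]  # step (e.3)
    return restrict(query), restrict(candidate)
```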
The step (e) may include the steps of: e.1) comparing time positions designated by the query feature extraction timing information with those designated by the candidate feature extraction timing information; e.2) generating expanded time positions which are designated by at least one of the query feature extraction timing information and the candidate feature extraction timing information; and e.3) changing at least one of the query moving-picture feature data and the candidate moving-picture feature data so that each of the query moving-picture feature data and the candidate moving-picture feature data exists at the expanded time positions.
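In this expanded-time-position variant the union of both sets of time positions is taken and each sequence is changed so that it exists at every expanded position, for example with the same hold-based resample helper sketched earlier; the union rule shown here is one illustrative way of generating the expanded time positions.

```python
# Illustrative expanded-time-position variant, reusing the hold-based resample.
def expand_to_union(query, candidate):
    expanded = sorted({t for t, _ in query} | {t for t, _ in candidate})  # step (e.2)
    return resample(query, expanded), resample(candidate, expanded)       # step (e.3)
```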
According to another aspect of the present invention, a similarity-based retrieval system includes a database for retrievably storing moving-picture data of each of a plurality of moving pictures and moving-picture feature data generated from the moving-picture data according to feature extraction timing information of the moving-picture data; a feature extractor for extracting query moving-picture ...
