Orientation invariant feature detection system and method...

Image analysis – Applications – Personnel identification

Reexamination Certificate


Details

U.S. Classes: C382S162000, C382S260000, C382S308000, C348S406100, C348S407100, C348S663000, C375S240160

Type: Reexamination Certificate

Status: active

Patent number: 06970579

ABSTRACT:
A system and method for detecting features in video from business meetings and video teleconferencing systems can use color, motion, morphological (shape) filtering, and a set of heuristic rules to determine possible features in the scene. One embodiment includes:

- obtaining a first video frame of images and a second video frame of images;
- performing a luminance conversion on the first and second video frames of images;
- performing a motion filter operation on the converted first and second frames of images;
- performing a chrominance normalization on the second video frame of images;
- performing a color filter operation on the normalized second video frame of images; and
- processing the first and second frames of video images after the motion filter and color filter operations.
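The processing steps in the abstract can be sketched roughly as follows. This is an illustrative outline only, not the patent's implementation: the luma weights follow the common ITU-R BT.601 convention, and the motion threshold and normalized-color box are hypothetical placeholder values.

```python
import numpy as np

def luminance(frame):
    # Weighted sum of R, G, B channels (BT.601 luma weights, an assumption);
    # frame is an H x W x 3 float array.
    return 0.299 * frame[..., 0] + 0.587 * frame[..., 1] + 0.114 * frame[..., 2]

def motion_filter(luma1, luma2, threshold=10.0):
    # Simple frame-difference motion mask; threshold is illustrative.
    return np.abs(luma2 - luma1) > threshold

def chrominance_normalize(frame, eps=1e-6):
    # Divide each pixel by its brightness so color is compared
    # independently of illumination level.
    total = frame.sum(axis=-1, keepdims=True) + eps
    return frame / total

def color_filter(norm_frame, lo=(0.3, 0.25), hi=(0.6, 0.45)):
    # Keep pixels whose normalized (r, g) values fall inside a
    # hypothetical target-color box.
    r, g = norm_frame[..., 0], norm_frame[..., 1]
    return (r >= lo[0]) & (r <= hi[0]) & (g >= lo[1]) & (g <= hi[1])

def detect(frame1, frame2):
    # Combine the motion mask (from both frames) with the color mask
    # (from the second frame) into a candidate feature mask.
    motion = motion_filter(luminance(frame1), luminance(frame2))
    color = color_filter(chrominance_normalize(frame2))
    return motion & color
```

In this sketch the resulting boolean mask would typically be passed on to the morphological filtering and heuristic rules the abstract mentions, which are not shown here.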

REFERENCES:
patent: 5949916 (1999-09-01), Chun
patent: 6421384 (2002-07-01), Chung et al.
patent: 6678009 (2004-01-01), Kahn
patent: 6681032 (2004-01-01), Bortolussi et al.


Profile ID: LFUS-PAI-O-3468876
