Realtime stereo and motion analysis on passive video images...

Image analysis – Applications – 3-d or stereo imaging analysis

Reexamination Certificate


Details

Type: Reexamination Certificate
Status: active
Serial number: 10665881

ABSTRACT:
A compact, inexpensive, real-time device computes dense stereo range and motion field images: fundamental measurements that support a wide range of computer vision systems interacting with a real world in which objects move through three-dimensional space. The device includes a novel algorithm for image-to-image comparison that requires less storage and fewer operations than other algorithms. A combination of standard, low-cost, low-power components is programmed to perform the algorithm, carrying out real-time stereo and motion analysis on passive video images, including image capture, digitization, stereo and/or motion processing, and transmission of results.
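The abstract does not disclose the comparison algorithm itself, so the following is only an illustrative sketch of the general technique it refers to: computing a dense stereo range (disparity) image by comparing windows of one image against shifted windows of the other. This sketch uses sum-of-absolute-differences block matching with winner-take-all selection, a standard low-storage approach; the function name and parameters are hypothetical, not taken from the patent.

```python
import numpy as np

def sad_disparity(left, right, max_disp=8, win=3):
    """Dense disparity map via sum-of-absolute-differences block matching.

    For each pixel in the left image, candidate matches in the right image
    at horizontal shifts d in [0, max_disp) are scored by summing absolute
    pixel differences over a win x win window; the lowest-cost shift wins.
    """
    h, w = left.shape
    half = win // 2
    # Cost volume: one aggregated SAD score per pixel per candidate disparity.
    cost = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        # Left pixel x is compared against right pixel x - d.
        diff = np.abs(left[:, d:].astype(np.int32) -
                      right[:, :w - d].astype(np.int32))
        # Box filter: aggregate absolute differences over the window.
        padded = np.pad(diff, half, mode='edge')
        agg = np.zeros(diff.shape, dtype=np.float64)
        for dy in range(win):
            for dx in range(win):
                agg += padded[dy:dy + diff.shape[0], dx:dx + diff.shape[1]]
        cost[d, :, d:] = agg
    # Winner-take-all: pick the disparity with minimum aggregated cost.
    return np.argmin(cost, axis=0)
```

Storing only one window-aggregated cost per pixel per shift, rather than full feature descriptors, is what keeps the memory footprint of this family of methods small enough for compact real-time hardware.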

REFERENCES:
patent: 5179441 (1993-01-01), Anderson et al.
patent: 5432712 (1995-07-01), Chan
patent: 5577130 (1996-11-01), Wu
patent: 0 686 942 (1995-12-01), None
Computational stereo, by Barnard et al., Computing Surveys, vol. 14, no. 4, Dec. 1982.
A parallel stereo algorithm that produces dense depth maps and preserves image features, by Fua, Machine Vision and Applications (1993) 6:35-45.


Profile ID: LFUS-PAI-O-3737676
