Image processing device and method for sensing moving objects an

Facsimile and static presentation processing – Facsimile – Specific signal processing circuitry

Patent


Details

Classification: 358/139; H04N 7/18; H04N 1/00

Type: Patent

Status: active

Patent number: 052125475

ABSTRACT:
An image processing device and method for detecting moving objects against complex moving backgrounds, and a rangefinder based thereon are disclosed. The device and method are applicable to rapid and selective evaluation of complex visual fields, without the need for extensive numerical processing. The disclosed image processing method comprises the steps of obtaining sequential video frame data; storing video data in memory for a consecutive series of video frames obtained over a predetermined time interval in the past at a predetermined sampling rate; obtaining a standard image by averaging the video data stored in memory for the predetermined time interval so as to obtain video data representing a composite image for the predetermined time interval; and subtracting the video data for the standard image from video data for a current video frame.
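The method described in the abstract is, in essence, background subtraction with a temporally averaged reference frame: frames sampled over a past interval are averaged into a "standard" image, which is then subtracted from the current frame so that only moving objects produce large residuals. A minimal sketch of that idea in NumPy (the function names, threshold value, and synthetic frame data below are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def standard_image(frames):
    """Average a stored series of video frames into the 'standard' (background) image."""
    return np.mean(np.stack(frames), axis=0)

def moving_object_mask(current, standard, threshold=25):
    """Subtract the standard image from the current frame; residuals above
    the threshold are flagged as belonging to a moving object."""
    diff = np.abs(current.astype(np.int16) - standard.astype(np.int16))
    return diff > threshold

# Synthetic demo: a uniform background, then a bright square appears in the
# current frame, standing in for a moving object.
background = np.full((64, 64), 50, dtype=np.uint8)
frames = [background.copy() for _ in range(8)]   # frames sampled over a past interval
current = background.copy()
current[10:20, 10:20] = 200                      # "moving object"

std = standard_image(frames)
mask = moving_object_mask(current, std)
print(mask.sum())  # pixels flagged as moving
```

Averaging over the interval suppresses uncorrelated background fluctuation, so a slowly varying or complex background largely cancels in the subtraction while a genuinely moving object survives it.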

REFERENCES:
patent: 4729021 (1988-03-01), Kondo
patent: 5023712 (1991-06-01), Kajiwara
patent: 5023713 (1991-06-01), Nishigori

