Method of extracting temporal and spatial combination...

Image analysis – Image transformation or preprocessing – General purpose image processor

Reexamination Certificate


Details

C382S282000, C382S284000, C382S266000, C382S180000

Reexamination Certificate

active

06728427

ABSTRACT:

BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a method of extracting temporal and spatial combination markers for the watershed algorithm, which can be used when dividing morphological images. In particular, the present invention relates to a method of extracting temporal and spatial markers that serve as reference regions for the watershed algorithm or for region growing when dividing images using morphological tools such as morphological filters and the watershed algorithm.
2. Information Disclosure Statement
Conventional marker extraction methods for morphological image division first simplify the brightness image and then extract, as markers, either regions of uniform brightness whose size exceeds a given critical value, or portions where the contrast of the brightness value is large. In these conventional methods, when the brightness values of the static background and of moving objects are similar, separate markers are not extracted for the differing portions of the static background and the moving objects; instead, a single shared marker is extracted.
Therefore, when the watershed algorithm is applied to such shared markers, a problem occurs in which portions of the moving objects are divided into background regions, or portions of the background regions are erroneously divided into the moving objects.
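The conventional flat-zone approach described above can be sketched as follows. This is a minimal illustration, assuming morphological opening/closing for simplification and an area threshold for "large" flat zones; the function name, quantization scheme, and threshold values are illustrative assumptions, not taken from the patent text:

```python
import numpy as np
from scipy import ndimage

def extract_spatial_markers(gray, levels=8, min_area=20):
    """Mark large flat zones of a simplified, quantized brightness image."""
    # Simplify: grayscale morphological opening followed by closing.
    simplified = ndimage.grey_closing(ndimage.grey_opening(gray, size=3), size=3)
    # Quantize brightness into a few levels so near-uniform zones become flat.
    quantized = (simplified.astype(np.int32) * levels) // 256
    markers = np.zeros(gray.shape, dtype=np.int32)
    next_label = 1
    for level in np.unique(quantized):
        # Connected components of one quantized level are candidate flat zones.
        labeled, count = ndimage.label(quantized == level)
        for comp in range(1, count + 1):
            zone = labeled == comp
            if zone.sum() >= min_area:  # keep only sufficiently large zones
                markers[zone] = next_label
                next_label += 1
    return markers
```

Note that if a moving object and the static background have similar brightness, they fall into the same quantized level and come out as one shared marker, which is exactly the failure mode described above.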
SUMMARY OF THE INVENTION
It is an object of the present invention to solve the problems of the prior art by providing a temporal and spatial marker combination extraction method that extracts markers using spatial and temporal information simultaneously. This makes visually meaningful marker extraction possible and, in turn, a more exact and meaningful image division when a morphological image division is performed using the extracted temporal and spatial combination markers.
In order to achieve the above object, the method of extracting temporal and spatial combination markers according to the present invention comprises the steps of: (1) simplifying an input current image frame, quantizing the simplified image, and extracting spatial markers from the simplified and quantized image frame using spatial information; (2) extracting temporal markers from the input current image frame and a previous image frame using temporal information; and (3) combining the extracted spatial markers and temporal markers to produce the temporal and spatial combination markers.
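Steps (2) and (3) above can be sketched as follows, taking the spatial markers as given (e.g. from flat-zone analysis of the quantized current frame). The frame-difference threshold and the pair-encoding combination rule are illustrative assumptions; the patent text does not specify them here:

```python
import numpy as np
from scipy import ndimage

def extract_temporal_markers(curr, prev, motion_thresh=30, min_area=10):
    """Step (2): label regions that changed between consecutive frames."""
    moving = np.abs(curr.astype(np.int32) - prev.astype(np.int32)) > motion_thresh
    labeled, count = ndimage.label(moving)
    # Drop tiny changed components, which are usually noise.
    sizes = ndimage.sum(moving, labeled, range(1, count + 1))
    keep = np.isin(labeled, 1 + np.flatnonzero(sizes >= min_area))
    markers, _ = ndimage.label(keep)
    return markers

def combine_markers(spatial, temporal):
    """Step (3): split every spatial marker by motion, so static background
    and moving objects receive distinct labels even at similar brightness."""
    moving = (temporal > 0).astype(np.int32)
    # Encode each (spatial label, motion flag) pair; 0 means "no marker".
    pair = np.where(spatial > 0, spatial * 2 + moving, 0)
    # Relabel the distinct pairs to consecutive positive integers.
    values = np.unique(pair)
    combined = np.searchsorted(values, pair)
    if values[0] != 0:
        combined += 1  # reserve label 0 for unmarked pixels
    return combined.astype(np.int32)
```

The combined markers could then seed the watershed on a morphological gradient of the current frame (for instance with `scipy.ndimage.watershed_ift`), so that a moving object and a static background of similar brightness flood from distinct seeds.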


REFERENCES:
patent: 4791675 (1988-12-01), Deering et al.
patent: 5552829 (1996-09-01), Kim et al.
patent: 5638125 (1997-06-01), Jeong et al.
patent: 5721692 (1998-02-01), Nagaya et al.
patent: 5734739 (1998-03-01), Sheehan et al.
patent: 5974200 (1999-10-01), Zhou et al.
patent: 6266442 (2001-07-01), Laumeyer et al.
patent: 6434254 (2002-08-01), Wixson
patent: 0 771 118 (1997-02-01), None


