Unsupervised scene segmentation

Classification: Image analysis – Image transformation or preprocessing – General purpose image processor

Details

US classification: C382S257000
Type: Reexamination Certificate
Status: active
Patent number: 07142732

ABSTRACT:
A method of segmenting objects in an image is described. The method applies a Top Hat algorithm to the image, then constructs inner and outer markers for application to the original image in a Watershed algorithm. The inner marker is constructed using binary erosion. The outer marker is constructed using binary dilation and perimeterisation. The method finds particular application in first-level segmentation of a cell nucleus prior to detailed analysis.
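The abstract outlines a marker-controlled watershed pipeline: top-hat preprocessing, an eroded inner marker, a dilated-and-perimeterised outer marker, then a watershed flood. Below is a minimal sketch of such a pipeline in Python with scikit-image. The function name, the Otsu threshold, the Sobel gradient, and the structuring-element radii are illustrative assumptions, not parameters taken from the patent.

```python
from scipy import ndimage as ndi
from skimage import filters, morphology, segmentation

def segment_nuclei(image, background_radius=15, marker_radius=5):
    """Coarse, first-level segmentation of bright blobs (e.g. nuclei)."""
    # 1. White top-hat: keep bright structures smaller than the
    #    structuring element, suppressing uneven background.
    tophat = morphology.white_tophat(image, morphology.disk(background_radius))

    # 2. Threshold to a binary object mask (Otsu is an assumption;
    #    the patent does not prescribe a particular threshold).
    binary = tophat > filters.threshold_otsu(tophat)

    # 3. Inner marker: erode the mask so the marker lies safely
    #    inside each object.
    inner = morphology.binary_erosion(binary, morphology.disk(marker_radius))

    # 4. Outer marker: dilate the mask, then take the perimeter of the
    #    dilated region ("perimeterisation"), a ring known to be background.
    dilated = morphology.binary_dilation(binary, morphology.disk(marker_radius))
    outer = dilated ^ morphology.binary_erosion(dilated)

    # 5. Flood the gradient of the original image from both marker sets.
    markers, _ = ndi.label(inner)
    markers[outer] = markers.max() + 1   # single label for the background ring
    return segmentation.watershed(filters.sobel(image), markers)
```

Placing markers on both sides of the expected boundary is what lets the watershed recover object outlines without supervised training, consistent with the "unsupervised" framing of the title: the inner marker seeds each object, and the perimeterised outer marker stops the flood from leaking into the background.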

REFERENCES:
patent: 5257182 (1993-10-01), Luck et al.
patent: 5768407 (1998-06-01), Shen et al.
patent: 5850464 (1998-12-01), Vogt
patent: 5892841 (1999-04-01), Jochems et al.
patent: 6195659 (2001-02-01), Hyatt
patent: 6244764 (2001-06-01), Lei et al.
patent: 6363161 (2002-03-01), Laumeyer et al.
patent: 6400831 (2002-06-01), Lee et al.
patent: 6625315 (2003-09-01), Laumeyer et al.
P.T. Jackway, Improved Morphological Top-Hat, Electronics Letters, vol. 36(14), Jul. 2000, pp. 1194-1195, Institution of Electrical Engineers.
Qian et al., Automatic Extraction of Coronary Artery Tree on Coronary Angiograms by Morphological Operators, Computers in Cardiology, vol. 25, 1998, pp. 765-768, IEEE.
