Image segmentation


Reexamination Certificate


Details

Status: active
Patent number: 07010164
U.S. classifications: 382/203; 382/199; 382/295

ABSTRACT:
A method of segmenting a selected region from a multi-dimensional dataset. The method comprises setting up a shape model representing the general outline of the selected region, and setting up an adaptive mesh representing an approximate contour of the selected region. The adaptive mesh is initialized on the basis of the shape model, and is then deformed in dependence on both the shape model and feature information of the selected region.
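The scheme in the abstract — initialize a mesh from a shape model, then deform it under a combination of shape and feature forces — can be sketched in a minimal 2D form. This is an illustrative reading, not the patented algorithm: the polygonal mesh, the gradient-descent feature force on a potential image, and the linear weights `alpha`/`beta` are all assumptions for the example.

```python
import numpy as np

def deform_mesh(potential, shape_model, alpha=0.4, beta=0.1, steps=200):
    """Deform a mesh toward image features under a shape-model constraint.

    Illustrative sketch of the abstract's scheme (names, force terms,
    and weights are assumptions):
      - shape_model: (N, 2) array of (x, y) vertices giving the general
        outline of the region; also used to initialize the mesh.
      - potential:   2D array of feature information; low values mark
        the desired boundary, and the mesh slides downhill on it.
    """
    # Steps 1-2: the adaptive mesh is initialized from the shape model.
    mesh = shape_model.astype(float).copy()
    gy, gx = np.gradient(potential)  # feature-force field (downhill = -grad)
    for _ in range(steps):
        # Sample the potential's gradient at each vertex's nearest pixel.
        iy = np.clip(np.rint(mesh[:, 1]).astype(int), 0, potential.shape[0] - 1)
        ix = np.clip(np.rint(mesh[:, 0]).astype(int), 0, potential.shape[1] - 1)
        feature_force = -np.stack([gx[iy, ix], gy[iy, ix]], axis=1)
        # Step 3: the deformation depends on BOTH the feature information
        # and the shape model, which pulls the mesh back toward itself.
        shape_force = shape_model - mesh
        mesh += alpha * feature_force + beta * shape_force
    return mesh
```

For example, with a synthetic potential whose minimum lies on a circle of radius 30 and a shape model circle of radius 20, the mesh settles between the two: the feature force draws it toward the boundary while the shape prior holds it near the model, the balance point depending on `alpha` and `beta`.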

REFERENCES:
patent: 5654771 (1997-08-01), Tekalp et al.
patent: 5768413 (1998-06-01), Levin et al.
patent: 6078680 (2000-06-01), Yoshida et al.
patent: 6088472 (2000-07-01), O'Donnell et al.
patent: 6124864 (2000-09-01), Madden et al.
patent: 6201543 (2001-03-01), O'Donnell et al.
patent: 6404920 (2002-06-01), Hsu
patent: 2003/0099397 (2003-05-01), Matsugu et al.
patent: 1030191 (2000-08-01), None
Lurig et al., "Deformable surfaces for feature based indirect volume rendering," Computer Graphics International 1998, Proceedings, Jun. 22, 1998, pp. 752-760.
A.J. Bulpitt and N.E. Efford, "An efficient 3D deformable model with a self-optimising mesh," Image and Vision Computing 14 (1996), pp. 573-580.

