Method and apparatus for stereologic analysis of two-dimensional structures

Image analysis – Pattern recognition – Feature extraction


Details

Classification: U.S. Cl. 382/194; Int. Cl. G06K 9/46, G06K 9/66
Type: Patent
Status: active
Number: 057546886

ABSTRACT:
A method and apparatus for analyzing two-dimensional structures to obtain three-dimensional quantitative information therefrom includes a video camera, a digitizing board for transforming video image output signals from the video camera into digital data, and a computer for analyzing the digitized video image data. An operator observes regions of interest, or desired objects, in the video image corresponding to the video image data. Thresholding commands supplied to the computer instruct it to manipulate the digitized video image data so as to enhance or distinguish the regions of interest from other areas in the video image. Once sufficient thresholding is accomplished, the computer executes an algorithm that identifies the regions of interest, draws a solid line around each one, and sets all of the video image data defining each region of interest bounded by a solid line to a predetermined threshold value. The operator then specifies a grid size, and an analysis grid is overlaid on the video image data. The video image data is analyzed at locations corresponding to the grid intersections. If a region of interest is found to exist at a grid intersection, a point count value is incremented. The computer also counts the total number of regions of interest in the video image data. The video image data can be thresholded at other grayscale levels to identify, and thereby quantify, other regions of interest. Stereologic values such as N_A (number per area), N_V (number per volume), and V_V (volume per volume) can be derived from the results of the video image data analysis performed according to the present invention.
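For readers who want a concrete picture of the workflow the abstract describes, here is a minimal sketch in Python, assuming NumPy and SciPy. The function name point_count_stereology, the fixed threshold argument, and the grid_spacing and pixel_size parameters are illustrative assumptions rather than terms from the patent; the patent's interactive, operator-driven thresholding and its boundary-drawing step are approximated here by a fixed grayscale cutoff and hole filling.

import numpy as np
from scipy import ndimage

def point_count_stereology(gray, threshold, grid_spacing, pixel_size=1.0):
    # Thresholding: treat pixels at or above the chosen grayscale level as
    # belonging to regions of interest (a stand-in for the operator's
    # interactive thresholding commands in the patent).
    mask = gray >= threshold

    # Fill holes so each profile is solid, a rough analogue of the patent's
    # step of drawing a solid boundary and setting everything inside it to a
    # predetermined threshold value.
    mask = ndimage.binary_fill_holes(mask)

    # Label connected components; num_regions is the total count of regions
    # of interest in the image.
    _, num_regions = ndimage.label(mask)

    # Overlay the analysis grid and count intersections that land inside a
    # region of interest (the point count).
    ys = np.arange(0, gray.shape[0], grid_spacing)
    xs = np.arange(0, gray.shape[1], grid_spacing)
    point_count = int(mask[np.ix_(ys, xs)].sum())
    total_points = len(ys) * len(xs)

    # Standard stereologic estimates (these relations are not spelled out in
    # the abstract): V_V from the point fraction P/P_T, and N_A as regions
    # per unit reference area.
    reference_area = mask.size * pixel_size ** 2
    return {
        "point_count": point_count,
        "V_V": point_count / total_points,
        "N_A": num_regions / reference_area,
        "regions": num_regions,
    }

A call such as point_count_stereology(image, threshold=128, grid_spacing=16) (hypothetical values) returns the point count, the point fraction as an estimate of V_V, and N_A. Note that N_V cannot be read directly from a single two-dimensional section; classical stereology derives it from N_A together with a size assumption about the structures (for convex particles, N_A = N_V · D̄, where D̄ is the mean caliper diameter), so a relation of this kind is presumably behind the patent's N_V output.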

REFERENCES:
patent: 4054782 (1977-10-01), Weibel
patent: 4334274 (1982-06-01), Agui et al.
patent: 4453266 (1984-06-01), Bacus
patent: 5121436 (1992-06-01), Kasdan et al.
patent: 5159361 (1992-10-01), Cambier et al.
patent: 5231580 (1993-07-01), Cheung et al.
patent: 5307292 (1994-04-01), Brown et al.
M.D. Levine, Vision in Man and Machine, McGraw-Hill, 1985, pp. 290-293.
N. D. Pentcheff et al., "Computer Assisted Data Collection for Stereology: Rationale and Description of Point Counting Stereology (PCS) Software," Microscopy Research and Technique, 21:347-354 (1992).
M.C. Poole, "An Image Processing/Stereological Analysis System for Transmission Electron Microscopy," Microscopy Research and Technique, 21:283-291 (1992).
D.M. Hyde et al., "Computer-Assisted Morphometry: Point, Intersection, and Profile Counting and Three-Dimensional Reconstruction," Microscopy Research and Technique, 21:262-270 (1992).
R.P. Bolender, "Biological Stereology: History, Present State, Future Directions," Microscopy Research and Technique, 21:255-261 (1992).
D.M. Hyde et al., "Morphometric Assessment of Pulmonary Toxicity in the Rodent Lung," Toxicologic Pathology, vol. 19, No. 4 (Part I), pp. 428-446 (1991).
E. Weibel, Stereological Methods, vol. 1, Practical Methods for Biological Morphometry, Academic Press, Inc. (1979).
