Method for the processing of radiological images in...

Image analysis – Applications – Biomedical applications


Details

U.S. Classification: C378S004000, C378S037000
Type: Reexamination Certificate
Status: active
Patent Number: 07853064

ABSTRACT:
In an image-processing method for the detection of radiological signs in series of 3D data, an algorithm detects radiological signs in a digital volume according to their contrast. The algorithm is applied either to reconstructed slices or directly to the series of projections, and is built from linear differential filters for signal analysis. It is used to color the detected radiological signs, or to enhance their intensity, according to their degree of malignancy.
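
As an illustration of the contrast-based detection described in the abstract, the following is a minimal Python sketch that applies a linear differential filter (here a Laplacian of Gaussian from scipy.ndimage, an assumed choice) to a 3D volume, thresholds the response to flag candidate signs, and enhances their intensity. The function names, parameters, and the thresholding and enhancement steps are illustrative assumptions, not the patented method.

import numpy as np
from scipy import ndimage


def detect_radiological_signs(volume, sigma=1.0, contrast_threshold=0.05):
    """Flag high-contrast structures in a 3D volume (stack of slices or projections).

    volume: 3D NumPy array normalized to [0, 1].
    Returns a boolean candidate mask and the filter response map.
    """
    # Linear differential filter: a negated Laplacian of Gaussian responds
    # strongly to small, bright, high-contrast structures (assumed filter choice).
    response = -ndimage.gaussian_laplace(volume.astype(np.float64), sigma=sigma)
    mask = response > contrast_threshold
    return mask, response


def enhance_detections(volume, mask, response, gain=2.0):
    """Raise the intensity of detected voxels in proportion to the filter response
    (a stand-in for the malignancy-dependent enhancement mentioned in the abstract)."""
    enhanced = volume.astype(np.float64).copy()
    enhanced[mask] = np.clip(enhanced[mask] + gain * response[mask], 0.0, 1.0)
    return enhanced


if __name__ == "__main__":
    # Synthetic example: a noisy volume containing one bright point-like sign.
    rng = np.random.default_rng(0)
    vol = np.clip(rng.normal(0.3, 0.02, size=(8, 128, 128)), 0.0, 1.0)
    vol[4, 60:62, 60:62] += 0.4  # simulated microcalcification
    mask, resp = detect_radiological_signs(vol)
    out = enhance_detections(vol, mask, resp)
    print("candidate voxels:", int(mask.sum()))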

REFERENCES:
patent: 5930330 (1999-07-01), Wolfe et al.
patent: 7218766 (2007-05-01), Eberhard et al.
patent: 7683338 (2010-03-01), Ueno et al.
patent: 2863749 (2005-06-01), None
Peters G. et al., "Reconstruction-independent 3D CAD for calcification detection in digital breast tomosynthesis using fuzzy particles," Progress in Pattern Recognition, Image Analysis and Applications, 10th Iberoamerican Congress on Pattern Recognition, CIARP 2005 (Lecture Notes in Computer Science vol. 3773), Berlin, Germany, 2005, pp. 400-408, ISBN: 3-540-29850-9.
