Method of multi-scale reconstruction of the image of the structure of a body


Details

36441318, 36441313, G06F 15/42

Patent

active

052414710

ABSTRACT:
To reconstruct the structure of a body with iterative algorithms, the first iterations are carried out at a lower resolution in order to limit the number of calculations. The full-resolution image can then be reconstructed with the total calculation time reduced to one-third or even one-quarter. The image reconstructed in this way is of better quality than images reconstructed using only a single resolution.
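The coarse-to-fine idea described in the abstract can be illustrated with a minimal sketch. This is not the patented algorithm; it assumes a simple box-blur forward model and plain Landweber (gradient-descent) iterations, with the early iterations run at half resolution and the result upsampled to initialize the full-resolution iterations. All function names here are hypothetical.

```python
# Hypothetical sketch (not the patented method): iterative image
# reconstruction where early iterations run at lower resolution.
import numpy as np

def blur(img):
    # Crude 3x3 box blur as a stand-in for the forward model A.
    out = np.copy(img)
    n, m = img.shape
    out[1:-1, 1:-1] = sum(
        img[1 + di:n - 1 + di, 1 + dj:m - 1 + dj]
        for di in (-1, 0, 1) for dj in (-1, 0, 1)
    ) / 9.0
    return out

def landweber(y, x0, steps, lr=0.5):
    # Gradient descent on ||A x - y||^2; the box blur is treated
    # as (approximately) symmetric, so A stands in for A^T.
    x = x0
    for _ in range(steps):
        x = x - lr * blur(blur(x) - y)
    return x

def downsample(img):
    # 2x2 block averaging (assumes even dimensions).
    return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                   + img[0::2, 1::2] + img[1::2, 1::2])

def upsample(img):
    # Nearest-neighbour 2x upsampling.
    return np.kron(img, np.ones((2, 2)))

def multiscale_reconstruct(y, coarse_steps=30, fine_steps=10):
    # Cheap iterations at half resolution, then refinement at
    # full resolution starting from the upsampled coarse estimate.
    y_coarse = downsample(y)
    x_coarse = landweber(y_coarse, np.zeros_like(y_coarse), coarse_steps)
    return landweber(y, upsample(x_coarse), fine_steps)
```

Each half-resolution iteration touches a quarter of the pixels, so front-loading most iterations at the coarse scale is consistent with the claimed reduction of calculation time to one-third or one-quarter.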

REFERENCES:
patent: 4602348 (1986-07-01), Hart
patent: 4682290 (1987-07-01), Tan et al.
patent: 4751643 (1988-06-01), Lorensen et al.
patent: 4791567 (1988-12-01), Cline et al.
patent: 4821213 (1989-04-01), Cline et al.
patent: 4835712 (1989-05-01), Drebin et al.
patent: 4866612 (1989-09-01), Takagi et al.
patent: 4868764 (1989-09-01), Richards
patent: 4879668 (1989-11-01), Cline et al.
patent: 4914589 (1990-04-01), Crawford
patent: 4931959 (1990-06-01), Honda et al.
patent: 4952922 (1990-08-01), Griffin et al.
patent: 4984157 (1991-01-01), Cline et al.
patent: 4984160 (1991-01-01), Saint-Felix et al.
patent: 4985834 (1991-01-01), Cline et al.


Profile ID: LFUS-PAI-O-2302509
