System and method for non-rigid multi-modal registration on...

Image analysis – Image transformation or preprocessing – Changing the image coordinates


Details

Classification: C382S168000, C382S260000, C345S651000

Type: Reexamination Certificate

Status: active

Patent number: 07813592

ABSTRACT:
A method for non-rigid multi-modal registration of digitized images includes providing a reference image and an alignment image acquired from different imaging modalities to a graphics processing unit (GPU), initializing a deformation field for registering said reference image and said alignment image, computing marginal and joint intensity histograms of the reference image and the alignment image as registered by said deformation field, computing gradients of the reference and registered alignment images and of their respective marginal and joint intensity histograms, smoothing said histograms and gradients using Gaussian filters, calculating a new deformation field using said smoothed gradients, and registering said alignment image to said reference image using said deformation field.
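To make the pipeline in the abstract concrete, the following Python/NumPy sketch walks through the same steps on the CPU: build and Gaussian-smooth the joint and marginal intensity histograms, form an intensity-derivative force from them, chain it through the spatial image gradient, and use the smoothed result to update the deformation field. This is only an illustration, not the patented GPU implementation; the function names (warp, mi_force, register), the bin count, step size, smoothing sigmas, and the Viola/Wells-style pointwise mutual-information derivative are all assumptions made here for the sketch.

    import numpy as np
    from scipy.ndimage import gaussian_filter, map_coordinates

    def normalize(img):
        """Rescale intensities to [0, 1] so both modalities share a histogram range."""
        img = img.astype(float)
        return (img - img.min()) / (np.ptp(img) + 1e-12)

    def warp(image, field):
        """Resample `image` through a dense displacement field of shape (2, H, W)."""
        h, w = image.shape
        yy, xx = np.meshgrid(np.arange(h, dtype=float),
                             np.arange(w, dtype=float), indexing="ij")
        coords = np.stack([yy + field[0], xx + field[1]])
        return map_coordinates(image, coords, order=1, mode="nearest")

    def mi_force(ref, mov, bins=64, hist_sigma=1.5):
        """Per-pixel force from a pointwise mutual-information derivative.

        Builds the joint and marginal intensity histograms, smooths them with a
        Gaussian (a Parzen-window stand-in), and chains the derivative of the
        log-likelihood ratio through the spatial gradient of the moving image.
        """
        r = np.clip((ref * bins).astype(int), 0, bins - 1)
        m = np.clip((mov * bins).astype(int), 0, bins - 1)
        joint = np.zeros((bins, bins))
        np.add.at(joint, (r.ravel(), m.ravel()), 1.0)
        joint = gaussian_filter(joint, hist_sigma) + 1e-8  # smoothed joint histogram
        joint /= joint.sum()
        p_m = joint.sum(axis=0)                            # marginal of moving image
        L = np.log(joint) - np.log(p_m)[None, :]           # pointwise MI contribution
        dL_dm = np.gradient(L, axis=1) * bins              # d/d(moving-image intensity)
        w_pix = dL_dm[r, m]                                # per-pixel lookup
        gy, gx = np.gradient(mov)                          # spatial image gradient
        return np.stack([w_pix * gy, w_pix * gx])

    def register(ref, mov, iters=200, step=0.5, fluid_sigma=2.0, elastic_sigma=1.0):
        """Gradient ascent on mutual information with Gaussian regularization."""
        ref, mov = normalize(ref), normalize(mov)
        field = np.zeros((2,) + ref.shape)
        for _ in range(iters):
            warped = warp(mov, field)
            force = mi_force(ref, warped)
            # smooth the update (fluid-like) and the field itself (elastic-like)
            force = gaussian_filter(force, sigma=(0, fluid_sigma, fluid_sigma))
            field += step * force
            field = gaussian_filter(field, sigma=(0, elastic_sigma, elastic_sigma))
        return field, warp(mov, field)

Each stage of this loop (histogram accumulation, Gaussian smoothing, gradient evaluation, and the field update) is an independent data-parallel pass over the image, which is presumably what makes the method in the abstract amenable to a GPU; the step size and smoothing parameters above would need tuning for real image pairs.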

REFERENCES:
patent: 7280710 (2007-10-01), Castro-Pareja et al.
patent: 7639896 (2009-12-01), Sun et al.
patent: 7653264 (2010-01-01), Hero et al.
patent: 2006/0110071 (2006-05-01), Ong et al.


Profile ID: LFUS-PAI-O-4158178
