Common aperture fused reflective/thermal emitted sensor and...

Radiant energy – Infrared-to-visible imaging – Including detector array

Reexamination Certificate


Details

C250S330000


active

06781127


FIELD OF THE INVENTION
This invention relates to an integrated system that simultaneously senses, with co-registration, thermally emitted radiation and reflected radiation in the visible/NIR/SWIR from the same view, and fuses these sensing modalities to provide important visual information to an observer, or to an automated image processing system for interpreting the scene.
BACKGROUND OF THE INVENTION
For the most part, electromagnetic radiation sensed in an outdoor or indoor scene in the visible, near infrared (NIR) and shortwave infrared (SWIR) spectra results from reflection, while radiation sensed at wavelengths from 3-15 microns mostly results from thermal emission. There are a number of exceptions, such as visible emission from the sun and the presence of significant reflection in the 3-5 micron midwave infrared (MWIR) spectrum, but the complementarity of reflection and thermal emission is generally acknowledged with respect to the visible/NIR/SWIR and wavelengths above 3 microns.
The advantages of having sensing capability for both reflected radiation and thermal emission have been noted in several recent patents. In U.S. Pat. No. 5,534,696 a sight apparatus is designed so that a viewer can better observe thermal IR imagery in the context of a direct visible view of a scene. In U.S. Pat. No. 6,020,994 an integrated apparatus includes the capability for an observer to switch between direct visible view and thermal IR for wide and narrow fields of view. In U.S. Pat. No. 5,944,653 visible and thermal IR views are boresighted within an endoscopic system. U.S. Pat. No. 5,808,350 teaches a design for integrating IR, visible and NIR sensing in the same focal plane array.
A number of computational algorithms for image fusion have already been developed for visible and thermal IR imagery. See for example:
(1) A. Toet, L. van Ruyven, and J. Valeton. Merging thermal and visual images by a contrast pyramid. Optical Engineering, 28(7):789-792, 1989.
(2) A. Waxman, A. Gove, D. Fay, J. Racamato, J. Carrick, M. Seibert, and E. Savoye. Color night vision: Opponent processing in the fusion of visible and IR imagery. Neural Networks, 10(1):1-6, 1997.
(3) P. Burt and R. Kolczynski. Enhanced image capture through fusion. Proceedings of the IEEE 4th International Conference on Computer Vision, pages 173-182, 1993.
(4) D. Scribner, P. Warren, J. Schuler, M. Satyshur, and M. Kruer. Infrared color vision: An approach to sensor fusion. Optics and Photonics News, August 1998.
(5) H. Li, B. S. Manjunath, and S. K. Mitra. Multi-sensor image fusion using the wavelet transform. Proceedings of the IEEE International Conference on Image Processing, pages 51-55, 1994.
(6) D. Socolinsky and L. B. Wolff. Visualizing local contrast for multispectral imagery. Pending U.S. patent application, 1998.
(7) D. A. Socolinsky and L. B. Wolff. Optimal grayscale visualization of local contrast in multispectral imagery. Proceedings of the DARPA Image Understanding Workshop, pages 761-766, Monterey, November 1998.
(8) D. A. Socolinsky and L. B. Wolff. A new paradigm for multispectral image visualization and data fusion. Proceedings of CVPR '99, Fort Collins, June 1999.
(9) A. Toet. Hierarchical image fusion. Machine Vision and Applications, pages 1-11, March 1990.
(10) A. Toet. New false color mapping for image fusion. Optical Engineering, 35(3):650-658, 1996.
(11) H. A. MacLeod. Thin Film Optical Filters. Institute of Physics Publishing, 3rd edition, March 2001.
References (2), (4) and (10) have proposed psychophysically motivated image fusion, including the use of neural network approaches. References (3) and (5) develop wavelet image fusion methods. References (1) and (9) develop hierarchical image fusion algorithms. References (6), (7) and (8) develop image fusion algorithms that combine first-order contrast.
SUMMARY OF THE INVENTION
The imaging modalities of visible/NIR/SWIR and of thermal IR reveal complementary physical information with respect to one another for most typical scenes; visible/NIR/SWIR imagery senses reflected light radiation while thermal IR imagery senses mostly thermally emitted radiation. Fusing these imaging modalities using optics, sensor hardware and image processing algorithms can provide large advantages for human visual enhancement and automated image understanding.
This invention relates to a sensor system design that integrates optics, sensing hardware and computational processing to achieve optimum utilization of the complementary information provided by the fusion of visible/NIR/SWIR and thermal IR imagery. This is accomplished through accurate co-registration of the respective modalities, followed by either optimum presentation/visualization to an observer or output of accurately co-registered information to an automated image understanding system. In the absence of a monolithic device that can simultaneously sense visible/NIR/SWIR and thermal IR at a pixel, two separate sensing arrays must be brought into exact alignment such that corresponding pixels view exactly the same scene element. Previous inventions, although sometimes citing common optical systems, neither achieve nor emphasize the importance of accurate co-registration of reflective and thermally emitted imagery.
Boresighted sensing attempts to image the same scene with two different imaging modalities placed side-by-side. Although the sensors are in close proximity, their slightly different viewing orientations and magnifications make co-registration dependent upon the external 3-D depth of scene elements, which is almost always unknown and changes from scene to scene. Single-window systems suffer the same co-registration problems because they require separate focusing optics for the respective focal plane sensing arrays. Apart from ever-present differences in magnification and distortion, separate focusing optics always create a small stereo baseline between the focal plane arrays, which means that co-registration will not 'track' with depth in a scene.
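The stereo-baseline problem described above can be made concrete with a little arithmetic: for two parallel boresighted sensors, the parallax shift of a scene point at depth Z is approximately f·b/Z, so the pixel misregistration varies with depth. The sketch below is purely illustrative and all numbers in it are hypothetical, not taken from the patent.

```python
# Depth-dependent misregistration between two boresighted sensors
# separated by a stereo baseline: parallax shift ~ f * b / Z.
# All parameter values below are hypothetical.

def parallax_shift_pixels(focal_length_mm, baseline_mm, depth_m, pixel_pitch_um):
    """Approximate pixel disparity between two parallel boresighted sensors."""
    depth_mm = depth_m * 1000.0
    shift_mm = focal_length_mm * baseline_mm / depth_mm
    return shift_mm / (pixel_pitch_um / 1000.0)

# Hypothetical 50 mm lenses separated by 60 mm, 15 um pixels:
for depth in (5.0, 50.0, 500.0):
    px = parallax_shift_pixels(50.0, 60.0, depth, 15.0)
    print(f"depth {depth:6.1f} m -> misregistration {px:6.2f} px")
```

At close range the shift is tens of pixels, while at long range it nearly vanishes, which is exactly why a fixed warp cannot co-register boresighted sensors across an entire scene.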
The accurate co-registration between a subspectrum of visible/NIR/SWIR and a subspectrum of thermal IR for the first time enables the application of computational fusion algorithms such as those described by References (1)-(10) listed above, which produce composite visualizations of dual reflective/thermal IR imagery. Accurate co-registration also enables automated image understanding algorithms to perform computations including optic flow, tracking, biometric recognition, and automatic target recognition.
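Once the two modalities are pixel-aligned, even the simplest per-pixel combination is well defined. The sketch below is a minimal illustration using a plain weighted average; it is not one of the cited fusion algorithms, and the image data are hypothetical.

```python
# Minimal pixel-level fusion of two co-registered grayscale images,
# represented as nested lists of floats in [0, 1].  This is only a
# weighted average, shown to illustrate that exact co-registration
# makes per-pixel fusion well defined.

def fuse(visible, thermal, w=0.5):
    """Weighted per-pixel fusion of two co-registered images."""
    return [
        [w * v + (1.0 - w) * t for v, t in zip(vrow, trow)]
        for vrow, trow in zip(visible, thermal)
    ]

vis = [[0.2, 0.8], [0.5, 0.1]]   # hypothetical reflective band
ir  = [[0.9, 0.3], [0.4, 0.7]]   # hypothetical thermal band
print(fuse(vis, ir))             # each fused pixel is the mean of the two bands
```

Replacing the averaging rule with a contrast-pyramid or wavelet combination yields the more sophisticated fusions of the cited references, but all of them presuppose the pixel correspondence established here.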
A way to achieve accurate co-registration independent of depth in a scene is for all focusing optics to be common to both focal plane arrays. This can be achieved by using a single objective lens at the front end of the apparatus, through which all sensed radiation is focused onto the respective focal plane arrays. A dichroic beamsplitter merely directs the appropriate subspectrum of incident radiation onto the corresponding sensing array and is optically afocal.
With depth-independent co-registration, the co-registration mapping between the two focal plane arrays is an affine linear transformation of the following form:

\[
\begin{pmatrix} X_2 \\ Y_2 \\ 1 \end{pmatrix}
=
\begin{pmatrix} A & B & -T_x \\ C & D & -T_y \\ 0 & 0 & 1 \end{pmatrix}
\begin{pmatrix} X_1 \\ Y_1 \\ 1 \end{pmatrix}
\]
where image coordinates (X_1, Y_1) on the 1st image plane are mapped to image coordinates (X_2, Y_2) on the 2nd image plane. The parameters T_x and T_y are respectively the translations in x and y, while the upper-left 2×2 submatrix can be decomposed as the product of a rotation by angle θ and magnifications S_x and S_y in x and y respectively, according to:
\[
\begin{pmatrix} A & B \\ C & D \end{pmatrix}
=
\begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\begin{pmatrix} S_x & 0 \\ 0 & S_y \end{pmatrix}
\]
The scaling parameters account for differences in physical horizontal and vertical pixel size between the two focal plane arrays, while the rotation and translation parameters account for the corresponding relative rotation and translation of the two focal plane arrays with respect to one another.
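The decomposition above can be sketched directly in code: the 2×2 block is a rotation by θ composed with per-axis scalings S_x and S_y, and the translation entries of the 3×3 matrix are -T_x and -T_y. The function name and all parameter values in the usage line are hypothetical illustrations, not part of the patent.

```python
# Sketch of the depth-independent affine co-registration mapping:
# (X2, Y2) = R(theta) * diag(Sx, Sy) * (X1, Y1) - (Tx, Ty).
import math

def coregistration_map(theta, sx, sy, tx, ty):
    """Return a function mapping (x1, y1) on array 1 to (x2, y2) on array 2."""
    # Expand rotation(theta) * diag(sx, sy) into the A, B, C, D entries.
    a = math.cos(theta) * sx
    b = -math.sin(theta) * sy
    c = math.sin(theta) * sx
    d = math.cos(theta) * sy
    def apply(x1, y1):
        return (a * x1 + b * y1 - tx, c * x1 + d * y1 - ty)
    return apply

# Hypothetical parameters: 90-degree relative rotation, equal pixel sizes,
# no translation.  The point (1, 0) maps onto (0, 1) up to float rounding.
f = coregistration_map(math.pi / 2, 1.0, 1.0, 0.0, 0.0)
print(f(1.0, 0.0))
```

In practice the five parameters would be estimated once per assembled unit (for example from calibration targets) and then applied to every frame, since depth independence means the same mapping holds for all scenes.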
One of th
