Image analysis – Applications – 3-d or stereo imaging analysis
Reexamination Certificate
2001-01-26
2003-04-29
Johnson, Timothy M. (Department: 2625)
C356S370000
active
06556706
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention is directed to three-dimensional surface profile imaging, and more particularly to a method and apparatus for three-dimensional imaging that uses color ranging to conduct surface profile measurement.
2. Description of the Related Art
A three-dimensional surface profile imaging method and apparatus described in U.S. Pat. No. 5,675,407 ("the '407 patent"), the disclosure of which is incorporated herein by reference in its entirety, conducts imaging by projecting light through a linear variable wavelength filter (LVWF), thereby projecting light having a known, spatially distributed wavelength spectrum on the objects being imaged. The LVWF is a rectangular optical glass plate coated with a color-filtering film that gradually varies in color (i.e., wavelength). If the color spectrum of a LVWF is within the visible light region, one edge of the filter rectangle may correspond to the shortest visible wavelength (i.e., blue or violet) while the opposite edge may correspond to the longest visible wavelength (i.e., red). The wavelength of light passing through the coated color-filtering layer is linearly proportional to the distance between the position on the filter glass where the light passes and the blue or red edge. Consequently, the color of the light is directly related to the angle θ, shown in FIG. 1, at which the light leaves the rainbow projector and LVWF.
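To make the linear color coding concrete, the following sketch maps a detected wavelength back to a projection angle by straight linear interpolation across the filter. It is a minimal illustration, not code from the patent: the 400-700 nm filter edges match the visible-spectrum LVWF referenced elsewhere in the description, while the angular span of the projector fan is a hypothetical parameter chosen only for the example.

# Minimal sketch (not from the patent): linear mapping between the wavelength
# passed by the LVWF and the projection angle theta of the corresponding light plane.
# theta_min_deg/theta_max_deg are hypothetical projector parameters for illustration.

def wavelength_to_angle(wavelength_nm,
                        lambda_min=400.0, lambda_max=700.0,      # filter edges (nm)
                        theta_min_deg=30.0, theta_max_deg=60.0): # assumed fan limits
    """Return the projection angle (degrees) for a given detected wavelength."""
    # Normalized position across the filter: 0.0 at the short-wavelength edge,
    # 1.0 at the long-wavelength edge.
    t = (wavelength_nm - lambda_min) / (lambda_max - lambda_min)
    return theta_min_deg + t * (theta_max_deg - theta_min_deg)

# Example: a ray detected at 550 nm (green) maps to the middle of the fan.
print(wavelength_to_angle(550.0))  # -> 45.0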
Referring to
FIGS. 1 and 2
in more detail, the imaging method and apparatus is based on the triangulation principle and the relationship between a light projector
100
that projects through the LVWF
101
, a camera
102
, and the object or scene being imaged
104
. As shown in
FIG. 1
, a triangle is uniquely defined by the angles theta (&thgr;) and alpha (&agr;), and the length of the baseline (B). With known values for &thgr;, &agr;, and B, the distance (i.e., the range R) between the camera
102
and a point Q on the object's surface can be easily calculated. Because the baseline B is predetermined by the relative positions of the light projector
100
and the camera
102
, and the value of &agr; can be calculated from the camera's geometry, the key to the triangulation method is to determine the projection angle, &thgr;, from an image captured by the camera
102
and more particularly to determine all &thgr; angles corresponding to all the visible points on an object's surface in order to obtain a full-frame 3D image in one snapshot.
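As one way to see the triangulation relationship at work, the sketch below computes the range R from θ, α, and B using the law of sines. It is a generic restatement of the geometry suggested by FIG. 1 under assumed conventions, not code taken from the patent.

import math

# Minimal sketch (assumed geometry): the projector, the camera, and surface point Q
# form a triangle whose base is the baseline B. With projection angle theta at the
# projector and viewing angle alpha at the camera, the interior angle at Q is
# pi - theta - alpha, and the law of sines gives the camera-to-Q range R.

def range_from_triangulation(theta_rad, alpha_rad, baseline):
    """Return the distance R from the camera to point Q."""
    return baseline * math.sin(theta_rad) / math.sin(theta_rad + alpha_rad)

# Example with arbitrary numbers: theta = 50 deg, alpha = 60 deg, baseline = 0.5 m.
print(range_from_triangulation(math.radians(50), math.radians(60), 0.5))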
FIG. 2 is a more detailed version of FIG. 1 and illustrates the manner in which all visible points on the object's surface 104 are obtained via the triangulation method. As can be seen in the Figure, the light projector 100 generates a fan beam of light 200. The fan beam 200 is broad-spectrum light (i.e., white light) which passes through the LVWF 101 to illuminate one or more three-dimensional objects 104 in the scene with a pattern of light rays possessing a rainbow-like spectrum distribution. The fan beam of light 200 is composed of multiple vertical planes of light 202, or "light sheets," each plane having a given projection angle and wavelength. Because of the fixed geometric relationship among the light source 100, the lens of the camera 102, and the LVWF 101, there exists a one-to-one correspondence between the projection angle (θ) of the vertical plane of light and the wavelength (λ) of the light ray. Note that although the wavelength variations are shown in FIG. 2 to occur from side to side across the object 104 being imaged, it will be understood by those skilled in the art that the variations in wavelength could also be made from top to bottom across the object 104 or scene being imaged.
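The one-to-one correspondence between projection angle and wavelength can be pictured as a simple table built on the projector side. The sketch below enumerates a hypothetical set of light sheets and records the wavelength assigned to each projection angle; the number of planes and the angular limits are illustrative assumptions, not values from the patent.

# Minimal sketch (illustrative assumptions): enumerate the vertical light planes of the
# fan beam and record the wavelength the LVWF assigns to each projection angle.

def build_plane_table(num_planes=256,
                      theta_min_deg=30.0, theta_max_deg=60.0,
                      lambda_min=400.0, lambda_max=700.0):
    """Return a list of (theta_deg, wavelength_nm) pairs, one per light sheet."""
    table = []
    for i in range(num_planes):
        t = i / (num_planes - 1)
        theta = theta_min_deg + t * (theta_max_deg - theta_min_deg)
        wavelength = lambda_min + t * (lambda_max - lambda_min)
        table.append((theta, wavelength))
    return table

planes = build_plane_table()
print(planes[0], planes[-1])  # shortest wavelength at one edge, longest at the other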
The light reflected from the object 104 surface is then detected by the camera 102. If a visible-spectrum-range LVWF (400-700 nm) is used, the color detected by the camera pixels is determined by the proportions of its primary color components: red, green, and blue (RGB). The color spectrum of each pixel has a one-to-one correspondence with the projection angle (θ) of the plane of light due to the fixed geometry of the camera 102 lens and the LVWF 101 characteristics. Therefore, the color of light received by the camera 102 can be used to determine the angle θ at which that light left the light projector 100 through the LVWF 101.
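One simple way to recover θ from the detected color is to convert each pixel's RGB value to a hue and map the hue to a projection angle through a calibration curve. The sketch below uses a plain linear hue-to-angle mapping, which is an assumption made for illustration rather than the calibration procedure of the patent.

import colorsys

# Minimal sketch (illustrative, not the patent's calibration): estimate the projection
# angle theta of the light plane that illuminated a pixel from the pixel's RGB color.
# A real system would use a measured hue-to-angle calibration table; a linear mapping
# over a hypothetical angular span stands in for that calibration here.

def rgb_to_projection_angle(r, g, b, theta_min_deg=30.0, theta_max_deg=60.0):
    """Return an estimated projection angle (degrees) for an (r, g, b) pixel in [0, 1]."""
    hue, _, _ = colorsys.rgb_to_hsv(r, g, b)   # hue in [0, 1): red = 0.0, blue ~ 0.67
    # Assume the rainbow projection runs from blue/violet (hue ~ 0.75) to red (hue 0.0).
    t = 1.0 - min(hue / 0.75, 1.0)             # 0 at the blue edge, 1 at the red edge
    return theta_min_deg + t * (theta_max_deg - theta_min_deg)

print(rgb_to_projection_angle(0.0, 1.0, 0.0))  # a green pixel falls near mid-span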
As described above, the angle α is determined by the physical relationship between the camera 102 and the coordinates of each pixel on the camera's imaging plane. The baseline B between the camera's 102 focal point and the center of the cylindrical lens of the light projector 100 is fixed and known. Given the values of angles α and θ, together with the known baseline length B, all necessary information is provided to easily determine the full frame of three-dimensional range values (x, y, z) for any and every visible spot on the surface of the objects 104 seen by the camera 102.
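The viewing angle α itself follows from the camera geometry alone. The sketch below derives it for an assumed pinhole camera whose baseline is taken to lie along the sensor's horizontal axis, with focal length f and pixel coordinates (u, v) in the same units; these geometric conventions are assumptions made for illustration.

import math

# Minimal sketch (assumed pinhole geometry): the viewing direction of pixel (u, v) is
# the ray (u, v, f) through the camera's focal point. If the baseline to the projector
# lies along the +u axis, alpha is the angle between that ray and the baseline
# direction. Units of u, v, and f must match (e.g., millimeters).

def viewing_angle_alpha(u, v, f):
    """Return alpha (radians): angle between the pixel's viewing ray and the baseline."""
    ray_norm = math.sqrt(u * u + v * v + f * f)
    return math.acos(u / ray_norm)

# Example: the principal point (u = 0, v = 0) views at 90 degrees to the baseline.
print(math.degrees(viewing_angle_alpha(0.0, 0.0, 25.0)))  # -> 90.0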
As shown in FIG. 3, given the projection angle θ, the three-dimensional algorithm for determining the (x, y, z) coordinates of any surface spot Q(x, y, z) on a three-dimensional object is given below based on the following triangulation principle:

x = B·u / (f·ctg θ − u),
y = B·v / (f·ctg θ − u),
z = B·f / (f·ctg θ − u)    (1)

where (u, v) are the coordinates of the corresponding pixel on the camera's image plane, f is the focal length of the camera lens, B is the baseline length, and ctg denotes the cotangent.
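Read literally, equation (1) scales the pixel ray (u, v, f) by the common factor B/(f·ctg θ − u). The sketch below simply evaluates it for one pixel; the sample numbers are arbitrary and the symbol conventions are the ones stated above, not values from the patent.

import math

# Minimal sketch: evaluate triangulation equation (1) for one pixel.
# B is the baseline, f the focal length, (u, v) the pixel's image-plane coordinates
# (same units as f), and theta the projection angle recovered from the pixel's color.

def surface_point(u, v, f, baseline, theta_rad):
    """Return (x, y, z) for the surface spot Q seen at pixel (u, v)."""
    scale = baseline / (f / math.tan(theta_rad) - u)   # B / (f*ctg(theta) - u)
    return (scale * u, scale * v, scale * f)

# Example with arbitrary numbers: f = 25 mm, baseline = 500 mm, theta = 50 degrees.
print(surface_point(u=3.0, v=-2.0, f=25.0, baseline=500.0, theta_rad=math.radians(50)))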
As a result, the three-dimensional imaging system described above can capture full-frame, high spatial resolution three-dimensional images using a standard camera, such as a charge-coupled device camera, in real time without relying on any moving parts. Further, because the imaging system does not rely on a laser, it does not pose any hazard to the eyes when used in clinical applications. Also, because the wavelength of the light projected onto the object surface continuously varies, there is no theoretical limitation on the measurement accuracy that can be achieved by the system. The actual accuracy of a specific system will depend on system implementation and will be affected primarily by limiting factors such as the optical system design, the quality and resolution of the camera, the light spectral emission of the light source projector, the noise level and resolution of the frame grabber, the calibration algorithms, and the three-dimensional imaging processing algorithms.
To prevent the surface color of the object being imaged from affecting the imaging results, the system may obtain an image of the object under normal light conditions before projecting the filtered light onto the object. The image obtained under normal light conditions is then subtracted from the image obtained under LVWF light conditions to eliminate the effects of the object color on the image.
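The compensation step described above amounts to a per-pixel subtraction of the ambient-light frame from the LVWF-illuminated frame. The sketch below shows that operation on arrays of RGB values; the use of NumPy and 8-bit images is an implementation assumption, not something specified in the patent.

import numpy as np

# Minimal sketch (assumes 8-bit RGB frames as NumPy arrays): subtract the frame taken
# under normal (ambient) lighting from the frame taken under LVWF illumination, so the
# remaining color is dominated by the projected rainbow pattern rather than the
# object's own surface color.

def compensate_object_color(lvwf_frame, ambient_frame):
    """Return the LVWF frame minus the ambient frame, clipped to the valid 8-bit range."""
    diff = lvwf_frame.astype(np.int16) - ambient_frame.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

# Example with tiny synthetic 1x1 "frames".
lvwf = np.array([[[120, 200, 90]]], dtype=np.uint8)
ambient = np.array([[[40, 60, 30]]], dtype=np.uint8)
print(compensate_object_color(lvwf, ambient))  # -> [[[ 80 140  60]]]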
Even when the system compensates for the color of the object, however, the spectral power distribution and the RGB value of each pixel may still vary with the reflection characteristics of the object's surface when light is projected onto the object through the LVWF, particularly if the object is not white and/or not uniformly colored.
There is a need for a surface profile imaging method and apparatus that is able to generate consistent RGB values regardless of the reflection characteristics of the surface being imaged.
SUMMARY OF THE INVENTION
Accordingly, the present invention is directed to a method and apparatus for three-dimensional surface imaging that avoids variations in the RGB value of each pixel due to the reflection characteristics of the object's surface. More particularly, a light source in the system illuminates an object or scene with a light pattern having a spatially varying wavelength and composed of at least one light plane. The light plane corresponds to at least one angle at which the light of that wavelength is emitted and contains only a single spectral component.
By imposing a single spectral light condition on the light source, the RGB values of each pixel will remain consistent regardless of the reflection characteristics of the surface being imaged.
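To illustrate why a single spectral component keeps the detected color stable, the sketch below scales a pixel's RGB response by different surface reflectance factors and shows that the normalized (chromaticity) values do not change. This is a generic illustration of the idea under assumed numbers, not a procedure taken from the patent.

# Minimal sketch (generic illustration): when a light plane carries a single spectral
# component, a surface with different reflectance mainly rescales the pixel's RGB
# response by a common factor, so the normalized color (chromaticity) stays the same.

def chromaticity(r, g, b):
    """Return (r, g, b) normalized so the components sum to 1."""
    total = r + g + b
    return (r / total, g / total, b / total)

base_response = (0.10, 0.60, 0.30)           # hypothetical response to one spectral line
for reflectance in (0.2, 0.5, 0.9):          # darker or brighter surfaces
    scaled = tuple(reflectance * c for c in base_response)
    print(reflectance, chromaticity(*scaled))  # chromaticity is identical each time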
Fish Paul W.
Geng Z. Jason
Johnson Timothy M.
Nichols Steven L.
Rader & Fishman & Grauer, PLLC