Method and system for three-dimensional imaging using light...

Optics: measuring and testing – Shape or surface configuration – Triangulation

Reexamination Certificate

Details

Classification: C356S602000, C356S610000
Type: Reexamination Certificate
Status: active
Patent number: 06700669


BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention is directed to three-dimensional surface profile imaging, and more particularly to a method and apparatus for three-dimensional imaging that uses color ranging to conduct surface profile measurement.
2. Description of the Related Art
A three-dimensional surface profile imaging method and apparatus described in U.S. Pat. No. 5,675,407 ("the '407 patent"), the disclosure of which is incorporated herein by reference in its entirety, conducts imaging by projecting light through an optical filter, such as a linear variable wavelength filter (LVWF), thereby projecting light having a known, spatially distributed wavelength spectrum on the objects being imaged. The LVWF is a rectangular optical glass plate coated with a color-filtering film that gradually varies in color (i.e., wavelength). If the color spectrum of an LVWF is within the visible light region, one edge of the filter rectangle may correspond to the shortest visible wavelength (i.e., blue or violet) while the opposite edge may correspond to the longest visible wavelength (i.e., red). The wavelength of light passing through the coated color-filtering layer is linearly proportional to the distance between the position on the filter glass where the light passes and the blue or red edge. Consequently, the color of the light is directly related to the angle θ, shown in FIG. 1, at which the light leaves the rainbow projector and LVWF.
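By way of illustration, the linear position-to-wavelength relationship described above can be sketched as a simple mapping. The filter width and the exact wavelength endpoints below are assumed values for illustration only, not figures taken from the '407 patent.

```python
# Sketch of the linear LVWF mapping described above.
# The filter width and the visible-range endpoints are assumed values.

LAMBDA_MIN_NM = 400.0    # wavelength at the blue/violet edge (assumed)
LAMBDA_MAX_NM = 700.0    # wavelength at the red edge (assumed)
FILTER_WIDTH_MM = 50.0   # physical width of the LVWF (assumed)

def wavelength_at_position(x_mm: float) -> float:
    """Wavelength transmitted at distance x_mm from the blue edge of the filter."""
    frac = x_mm / FILTER_WIDTH_MM
    return LAMBDA_MIN_NM + frac * (LAMBDA_MAX_NM - LAMBDA_MIN_NM)

def position_for_wavelength(lambda_nm: float) -> float:
    """Inverse mapping: filter position that transmits a given wavelength."""
    frac = (lambda_nm - LAMBDA_MIN_NM) / (LAMBDA_MAX_NM - LAMBDA_MIN_NM)
    return frac * FILTER_WIDTH_MM
```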
Referring to FIG. 1 in more detail, the imaging method and apparatus is based on the triangulation principle and the relationship between a light projector 100 having the LVWF, a camera 102, and the object being imaged 104. As shown in FIG. 1, a triangle is uniquely defined by the angles theta (θ) and alpha (α), and the length of the baseline (B). With known values for θ, α, and B, the distance (i.e., the range R) between the camera 102 and a point Q on the object's surface can be easily calculated. Because the baseline B is predetermined by the relative positions of the light projector 100 and the camera 102, and the value of α can be calculated from the camera's geometry, the key to the triangulation method is to determine the projection angle, θ, from an image captured by the camera 102, and more particularly to determine all θ angles corresponding to all the visible points on an object's surface in order to obtain a full-frame 3D image in one snapshot.
FIG. 2 is a more detailed version of FIG. 1 and illustrates the manner in which all visible points on the object's surface 104 are obtained via the triangulation method. As can be seen in the Figure, the light projector 100 generates a fan beam of light 200. The fan beam 200 is broad-spectrum light (i.e., white light) which passes through the LVWF to illuminate one or more three-dimensional objects 104 in the scene with a pattern of light rays possessing a rainbow-like spectrum distribution. The fan beam of light 200 is composed of multiple vertical planes of light, or "light sheets", each plane having a given projection angle and wavelength. Because of the fixed geometric relationship among the light source 100, the lens of the camera 102, and the LVWF, there exists a one-to-one correspondence between the projection angle (θ) of a vertical plane of light and the wavelength (λ) of the light ray. Note that although the wavelength variations are shown in FIG. 2 to occur from side to side across the object 104 being imaged, it will be understood by those skilled in the art that the variations in wavelength could also be made from top to bottom across the object 104 being imaged.
The light reflected from the surface of the object 104 is then detected by the camera 102. If an LVWF covering the visible spectrum range (400-700 nm) is used, the color detected by the camera pixels is determined by the proportions of its primary red, green, and blue (RGB) components. The color spectrum of each pixel has a one-to-one correspondence with the projection angle (θ) of the plane of light due to the fixed geometry of the camera 102 lens and the LVWF characteristics. Therefore, the color of light received by the camera 102 can be used to determine the angle θ at which that light left the rainbow light projector. Other spectrum ranges can also be used in similar fashion; the implementation is straightforward to those skilled in the art.
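As a rough illustration of this one-to-one color-to-angle correspondence, the following sketch interpolates a projection angle from a detected wavelength using a hypothetical calibration table. The table values, the use of linear interpolation, and the prior conversion of a pixel's RGB values to a dominant wavelength are all assumptions; in practice the mapping would come from calibrating the projector/LVWF geometry.

```python
import numpy as np

# Hypothetical calibration table: transmitted wavelength (nm) versus the
# projection angle theta (radians) of the corresponding light sheet.
CALIB_LAMBDA_NM = np.array([400.0, 475.0, 550.0, 625.0, 700.0])   # assumed values
CALIB_THETA_RAD = np.radians([30.0, 37.5, 45.0, 52.5, 60.0])      # assumed values

def theta_from_wavelength(lambda_nm: float) -> float:
    """Interpolate the projection angle for a wavelength recovered from a pixel's color."""
    return float(np.interp(lambda_nm, CALIB_LAMBDA_NM, CALIB_THETA_RAD))
```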
As described above, the angle α is determined by the physical relationship between the camera 102 and the coordinates of each pixel on the camera's imaging plane. The baseline B between the camera's 102 focal point and the center of the cylindrical lens of the light projector 100 is fixed and known. Given the values for angles α and θ, together with the known baseline length B, all necessary information is provided to easily determine the full frame of three-dimensional range values (x,y,z) for any and every visible spot on the surface of the objects seen by the camera 102.
As shown in FIG. 3, given the projection angle θ, the three-dimensional algorithm for determining the (x,y,z) coordinates of any surface spot Q(x,y,z) on a three-dimensional object is given below, based on the following triangulation principle:

$$x = \frac{B}{f\cot\theta - u}\,u, \qquad y = \frac{B}{f\cot\theta - u}\,v, \qquad z = \frac{B}{f\cot\theta - u}\,f \qquad (1)$$
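A minimal numerical sketch of equation (1) follows. Here (u, v) are taken to be the image-plane coordinates of the pixel observing Q and f the camera focal length; those readings, and consistent units for u, v, f, and B, are assumptions, since the excerpt does not define the symbols.

```python
import math

def surface_point(theta: float, u: float, v: float, B: float, f: float):
    """Evaluate equation (1) for a surface point Q.

    theta : projection angle of the light sheet (radians)
    u, v  : image-plane coordinates of the pixel seeing Q (assumed meaning)
    B     : baseline between light projector and camera
    f     : camera focal length (assumed meaning)
    """
    denom = f / math.tan(theta) - u      # f*cot(theta) - u
    scale = B / denom
    return scale * u, scale * v, scale * f
```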
As a result, the three-dimensional imaging system described above can capture full-frame, high spatial resolution three-dimensional images using a standard camera, such as a charge-coupled device camera, in real time without relying on any moving parts. Further, because the imaging system does not rely on a laser, it does not pose any hazard to the eyes when used in clinical applications. Also, because the wavelength of the light projected onto the object surface continuously varies, there is no theoretical limitation on the measurement accuracy that can be achieved by the system. The actual accuracy of a specific system will depend on system implementation and will be affected primarily by limiting factors such as the optical system design, the quality and resolution of the camera, the light spectral emission of the light source projector, the noise level and resolution of the frame grabber, the calibration algorithms, and the three-dimensional image processing algorithms.
To prevent ambient light on the object being imaged from affecting the imaging results, the system may obtain an image of the object under normal light conditions before projecting the filtered light onto the object. The image obtained under normal light conditions is then subtracted from the image obtained under LVWF light conditions to eliminate the effects of the ambient light on the image.
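A minimal sketch of this ambient-light subtraction, assuming two registered 8-bit frames of the same scene:

```python
import numpy as np

def remove_ambient(rainbow_image: np.ndarray, ambient_image: np.ndarray) -> np.ndarray:
    """Subtract the normal-light image from the image taken under LVWF projection.

    Both inputs are assumed to be registered uint8 arrays of identical shape.
    """
    diff = rainbow_image.astype(np.int16) - ambient_image.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```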
Referring back to FIG. 1, the triangulation algorithm used in the imaging system is based on the following formula, with reference to FIG. 3:

$$R = \frac{\sin\theta}{\sin\alpha}\,B, \qquad (2)$$
where (x_p, y_p) is the location of the rainbow light projector, (x_c, y_c) is the location of the imaging sensor, B is the baseline between the rainbow projector and the imaging sensor (CCD), α = π − θ − β, O is a surface point on the object in the scene, and R is the three-dimensional range, that is, the distance between (x_c, y_c) and O.
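A short sketch of equation (2), using the relation α = π − θ − β given above; the angle units (radians) and the parameter naming are assumptions of this illustration.

```python
import math

def range_from_angles(theta: float, beta: float, B: float) -> float:
    """Evaluate equation (2): R = sin(theta) / sin(alpha) * B, with alpha = pi - theta - beta.

    theta : projection angle at the rainbow projector (radians)
    beta  : angle at the imaging sensor (radians)
    B     : baseline between projector and imaging sensor
    """
    alpha = math.pi - theta - beta
    return math.sin(theta) / math.sin(alpha) * B
```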
Note that all of the variables θ, α, and β in equation (2) may introduce error into the three-dimensional range calculation. In the following error sensitivity analysis, considered with reference to FIG. 4, it is assumed that the coordinate of the camera's focal point in the world coordinate system can be obtained precisely through camera calibration. The full derivative of R is given by:

$$dR = \frac{\cos\theta}{\sin\alpha}\,B\,d\theta \;-\; \frac{\sin\theta\cos\alpha}{\sin^{2}\alpha}\,B\,d\alpha \;+\; \frac{\sin\theta}{\sin\alpha}\,dB$$
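The total differential above can be evaluated numerically to estimate how small uncertainties in θ, α, and B propagate into the range. The sketch below is a first-order illustration only; the example input values and uncertainties are placeholders.

```python
import math

def range_error(theta: float, alpha: float, B: float,
                d_theta: float, d_alpha: float, d_B: float) -> float:
    """First-order range error dR from the total differential of equation (2)."""
    dR = ((math.cos(theta) / math.sin(alpha)) * B * d_theta
          - (math.sin(theta) * math.cos(alpha) / math.sin(alpha) ** 2) * B * d_alpha
          + (math.sin(theta) / math.sin(alpha)) * d_B)
    return dR

# Example: sensitivity to a 0.1-degree error in theta for an assumed 0.3 m baseline.
print(range_error(theta=math.radians(45), alpha=math.radians(60), B=0.3,
                  d_theta=math.radians(0.1), d_alpha=0.0, d_B=0.0))
```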
