CMOS-compatible three-dimensional image sensor IC

Classification: Optics: measuring and testing – Range or remote distance finding – With photodetection
U.S. Classes: C356S005080, C356S005040
Reexamination Certificate (active), Patent No. 06323942
Filed: 1999-09-22; Issued: 2001-11-27
Examiner: Buczinski, Stephen C. (Department: 3662)
Assignee: Canesta, Inc.
FIELD OF THE INVENTION
The invention relates generally to range finder type image sensors, and more particularly to such sensors as may be implemented on a single integrated circuit using CMOS fabrication.
BACKGROUND OF THE INVENTION
Electronic circuits that provide a measure of distance from the circuit to an object are known in the art, and may be exemplified by system 10 of FIG. 1. In the generalized system of FIG. 1, imaging circuitry within system 10 is used to approximate the distance (e.g., Z1, Z2, Z3) to an object 20, the top portion of which is shown more distant from system 10 than is the bottom portion. Typically system 10 will include a light source 30 whose light output is focused by a lens 40 and directed toward the object to be imaged, here object 20. Other prior art systems do not provide an active light source 30 and instead rely upon, and indeed require, ambient light reflected by the object of interest.
Various fractions of the light from source 30 may be reflected by surface portions of object 20 and focused by a lens 50. This return light falls upon various detector devices 60, e.g., photodiodes or the like, in an array on an integrated circuit (IC) 70. Devices 60 produce a rendering of the luminosity of an object (e.g., 20) in the scene from which distance data is to be inferred. In some applications devices 60 might be charge coupled devices (CCDs) or even arrays of CMOS devices.
CCDs typically are configured in a so-called bucket-brigade whereby light-detected charge from a first CCD is serially coupled to an adjacent CCD, whose output in turn is coupled to a third CCD, and so on. This bucket-brigade configuration precludes fabricating processing circuitry on the same IC that contains the CCD array. Further, CCDs provide a serial readout as opposed to a random readout. For example, if a CCD range finder system were used in a digital zoom lens application, even though most of the relevant data would be provided by a few of the CCDs in the array, it would nonetheless be necessary to read out the entire array to gain access to the relevant data, a time-consuming process. In still and some motion photography applications, CCD-based systems might still find utility.
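By way of a rough illustration (not drawn from the patent itself), the readout-cost difference can be sketched as follows; the array size and the pixel address are arbitrary example values.

```python
# Sketch: serial (bucket-brigade) readout vs. random (addressed) readout.
# Array size and the pixel of interest are hypothetical illustration values.

ROWS, COLS = 480, 640
array = [[0] * COLS for _ in range(ROWS)]  # stand-in for per-pixel detected charge

def ccd_read_pixel(arr, row, col):
    """Serial readout: every pixel must be shifted out before the one value
    of interest can be used, so the cost is always a full-array scan."""
    reads, wanted = 0, None
    for r in range(ROWS):
        for c in range(COLS):
            value = arr[r][c]
            reads += 1
            if (r, c) == (row, col):
                wanted = value
    return wanted, reads

def cmos_read_pixel(arr, row, col):
    """Random access: the addressed pixel is read directly."""
    return arr[row][col], 1

_, ccd_cost = ccd_read_pixel(array, 10, 20)
_, cmos_cost = cmos_read_pixel(array, 10, 20)
print(ccd_cost, cmos_cost)  # 307200 reads vs. 1 read
```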
As noted, the upper portion of object 20 is intentionally shown more distant than the lower portion, which is to say distance Z3 > Z2 > Z1. In a range finder autofocus camera environment, devices 60 approximate the average distance from the camera (e.g., from Z=0) to object 20 by examining relative luminosity data obtained from the object. In FIG. 1, the upper portion of object 20 is darker than the lower portion, and presumably is more distant than the lower portion. In a more complicated scene, the focal distance to an object or subject standing against a background would be approximated by distinguishing the subject from the background by a change in luminosity. In a range finding binocular application, the field of view is sufficiently small that all objects in focus are at substantially the same distance. In the various applications, circuits 80, 90, 100 within system 10 would assist in this signal processing. As noted, if IC 70 includes CCDs 60, other processing circuitry such as 80, 90, 100 is formed off-chip.
Unfortunately, reflected luminosity data does not provide a truly accurate rendering of distance because the reflectivity of the object is unknown. Thus, a distant object with a shiny surface may reflect as much light as (or perhaps more than) a closer object with a dull finish.
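A short numeric sketch of this ambiguity, assuming a simple inverse-square falloff model for the outgoing and returning light; the reflectivity and distance values are hypothetical.

```python
# Sketch of the luminosity ambiguity described above. The 1/Z**4 falloff
# model, reflectivities, and distances are illustrative assumptions only.

def received_luminosity(emitted: float, reflectivity: float, distance_m: float) -> float:
    """Light reaching the detector after one round trip (arbitrary units)."""
    return emitted * reflectivity / distance_m ** 4  # 1/Z^2 out, 1/Z^2 back

shiny_far = received_luminosity(emitted=1.0, reflectivity=0.90, distance_m=2.0)
dull_near = received_luminosity(emitted=1.0, reflectivity=0.05, distance_m=1.0)

print(f"{shiny_far:.4f} vs {dull_near:.4f}")
# ~0.056 vs 0.050: the more distant shiny surface returns slightly MORE light,
# so luminosity alone cannot order the two surfaces by distance.
```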
Other focusing systems are known in the art. Infrared (IR) autofocus systems for use in cameras or binoculars produce a single distance value that is an average or a minimum distance to all targets within the field of view. Other camera autofocus systems often require mechanical focusing of the lens onto the subject to determine distance. At best these prior art focus systems can focus a lens onto a single object in a field of view, but cannot simultaneously measure distance for all objects in the field of view.
In general, a reproduction or approximation of original luminosity values in a scene permits the human visual system to understand what objects were present in the scene and to estimate their relative locations stereoscopically. For non-stereoscopic images such as those rendered on an ordinary television screen, the human brain assesses apparent size, distance, and shape of objects using past experience. Specialized computer programs can approximate object distance under special conditions. Stereoscopic images allow a human observer to judge the distance of an object more accurately. However, it is challenging for a computer program to judge object distance from a stereoscopic image: errors are often present, and the required signal processing requires specialized hardware and computation. Stereoscopic images are at best an indirect way to produce a three-dimensional image suitable for direct computer use.
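For context (this relation is standard stereo geometry, not part of the patent disclosure), a stereo algorithm must invert the depth-from-disparity relation Z = f·B/d, which is sensitive to small matching errors at long range; the camera parameters below are hypothetical.

```python
# Sketch: depth from stereo disparity, Z = f * B / d (pinhole model).
# Focal length (pixels), baseline (m), and disparities are hypothetical values.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole-stereo relation; error grows rapidly as disparity shrinks."""
    if disparity_px <= 0:
        raise ValueError("point at infinity or mismatched correspondence")
    return focal_px * baseline_m / disparity_px

print(depth_from_disparity(800, 0.10, 4.0))  # 20.0 m
print(depth_from_disparity(800, 0.10, 3.0))  # ~26.7 m: a 1-pixel matching error shifts the estimate by ~33%
```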
Many applications require directly obtaining a three-dimensional rendering of a scene, but in practice it is difficult to accurately extract distance and velocity data along a viewing axis from luminosity measurements. Nonetheless, many applications require accurate distance and velocity tracking, for example an assembly-line welding robot that must determine the precise distance and speed of the object to be welded. The necessary distance measurements may be erroneous due to varying lighting conditions and the other shortcomings noted above. Such applications would benefit from a system that could directly capture three-dimensional imagery.
Although specialized three-dimensional imaging systems exist in the nuclear magnetic resonance and scanning laser tomography fields, such systems require substantial equipment expenditures. Further, these systems are obtrusive and are dedicated to specific tasks, e.g., imaging internal body organs.
In other applications, scanning laser range finding systems raster scan an image by using mirrors to deflect a laser beam in the x-axis and perhaps the y-axis plane. The angle of deflection of each mirror is used to determine the coordinate of the image pixel being sampled. Such systems require precision detection of the angle of each mirror to determine which pixel is currently being sampled. Understandably, having to provide precision moving mechanical parts adds bulk, complexity, and cost to such range finding systems. Further, because these systems sample each pixel sequentially, the number of complete image frames that can be sampled per unit time is limited.
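A back-of-the-envelope sketch of the frame-rate ceiling imposed by sequential, mirror-steered sampling; the resolution and per-pixel dwell time below are hypothetical illustration values, not figures from the patent.

```python
# Sketch: frame-rate ceiling for a sequentially sampled (raster-scanned) range finder.
# Resolution and per-pixel dwell time are hypothetical illustration values.

def max_frame_rate(rows: int, cols: int, dwell_s: float) -> float:
    """Pixels are sampled one after another, so one frame takes at least
    rows * cols * dwell_s seconds regardless of how fast the mirrors settle."""
    return 1.0 / (rows * cols * dwell_s)

print(max_frame_rate(200, 200, 5e-6))  # 5 us per pixel at 200x200 -> at most 5 frames/s
```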
In summation, there is a need for a system that can produce direct three-dimensional imaging. Preferably, such a system should be implementable on a single IC that includes both detectors and circuitry to process detection signals. Such a single-IC system should be implementable using CMOS fabrication techniques, should require few discrete components, and should have no moving components. Optionally, the system should be able to output data from the detectors in a non-sequential or random fashion.
The present invention provides such a system.
SUMMARY OF THE PRESENT INVENTION
The present invention provides a system that measures distance and velocity data in real time using time-of-flight (TOF) data rather than relying upon luminosity data. The system is CMOS-compatible and provides such three-dimensional imaging without requiring moving parts. The system may be fabricated on a single IC containing both a two-dimensional array of CMOS-compatible pixel detectors that sense photon light energy and processing circuitry. A microprocessor on the IC continuously triggers a light source, preferably an LED or laser, whose light output pulses are at least partially reflected by points on the surface of the object to be imaged.
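The underlying time-of-flight relation is Z = c·Δt/2 for a round trip. The sketch below, which is an illustration of that relation rather than the patent's circuit-level method, shows how per-pixel distance and axial velocity would follow from measured round-trip times; the timing values and frame interval are hypothetical.

```python
# Sketch: distance and velocity from per-pixel time-of-flight measurements.
# Round-trip times and the frame interval are hypothetical illustration values.

C = 299_792_458.0  # speed of light, m/s

def distance_from_tof(round_trip_s: float) -> float:
    """Z = c * t / 2: the light pulse covers the pixel-to-object path twice."""
    return C * round_trip_s / 2.0

def velocity_from_frames(z_prev_m: float, z_curr_m: float, frame_dt_s: float) -> float:
    """Axial velocity from the change in measured distance between two frames."""
    return (z_curr_m - z_prev_m) / frame_dt_s

t1, t2 = 20.0e-9, 19.8e-9                 # round-trip times in two successive frames
z1, z2 = distance_from_tof(t1), distance_from_tof(t2)
print(z1, z2)                             # ~3.00 m and ~2.97 m
print(velocity_from_frames(z1, z2, 0.033))  # ~-0.9 m/s, i.e., the object is approaching
```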
An optical system including an off-chip filter and lens that focuses incoming light ensures that each pixel detector in the array receives light only from a single point on the surface of the imaged object, e.g., all optical paths are equal