CMOS-compatible three-dimensional image sensing using...

Optics: measuring and testing – Range or remote distance finding – With photodetection

Reexamination Certificate

Details

Classification codes: C356S005040, C356S005080, C356S005090, C356S005100

Type: Reexamination Certificate

Status: active

Patent number: 06587186

FIELD OF THE INVENTION
The invention relates generally to range finder type image sensors, and more particularly to such sensors as may be implemented on a single integrated circuit using CMOS fabrication, and especially to reducing power consumption of systems utilizing such sensors.
BACKGROUND OF THE INVENTION
Electronic circuits that provide a measure of distance from the circuit to an object are known in the art, and may be exemplified by system 10 of FIG. 1. In the generalized system of FIG. 1, imaging circuitry within system 10 is used to approximate the distance (e.g., Z1, Z2, Z3) to an object 20, the top portion of which is shown more distant from system 10 than is the bottom portion. Typically system 10 will include a light source 30 whose light output is focused by a lens 40 and directed toward the object to be imaged, here object 20. Other prior art systems do not provide an active light source 30 and instead rely upon, and indeed require, ambient light reflected by the object of interest.
Various fractions of the light from source 30 may be reflected by surface portions of object 20 and focused by a lens 50. This return light falls upon various detector devices 60, e.g., photodiodes or the like, arranged in an array on an integrated circuit (IC) 70. Devices 60 produce a rendering of the luminosity of an object (e.g., 20) in the scene, from which distance data is to be inferred. In some applications devices 60 might be charge-coupled devices (CCDs) or even arrays of CMOS devices.
CCDs typically are configured in a so-called bucket-brigade arrangement whereby charge detected by a first CCD is serially coupled to an adjacent CCD, whose output in turn is coupled to a third CCD, and so on. This bucket-brigade configuration precludes fabricating processing circuitry on the same IC containing the CCD array. Further, CCDs provide a serial readout as opposed to a random readout. For example, if a CCD range finder system were used in a digital zoom lens application, even though most of the relevant data would be provided by a few of the CCDs in the array, it would nonetheless be necessary to read out the entire array to gain access to the relevant data, a time-consuming process. In still and some motion photography applications, CCD-based systems might still find utility.
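To make the readout contrast concrete, the following Python sketch (illustrative only, not drawn from the patent) compares the access cost of a serial, bucket-brigade-style readout, where every pixel must be shifted out, with a random-access readout that addresses only a region of interest; the array dimensions and the region of interest are hypothetical.
```python
# Illustrative sketch (not from the patent): serial vs. random-access readout cost.
# Array size and region of interest are hypothetical values.

ARRAY_WIDTH, ARRAY_HEIGHT = 640, 480                       # assumed sensor dimensions
frame = [[0] * ARRAY_WIDTH for _ in range(ARRAY_HEIGHT)]   # stand-in pixel data

def serial_readout(pixels):
    """CCD-style readout: every pixel must be shifted out, even if only a few matter."""
    out = []
    for row in pixels:
        for value in row:
            out.append(value)          # all WIDTH * HEIGHT transfers happen
    return out

def random_readout(pixels, coords):
    """CMOS-style readout: address only the pixels of interest."""
    return [pixels[y][x] for (x, y) in coords]

# e.g., a digital-zoom window needs only a small block of pixels
roi = [(x, y) for y in range(100, 110) for x in range(200, 210)]
print(len(serial_readout(frame)))       # 307200 transfers for the full array
print(len(random_readout(frame, roi)))  # 100 transfers for the region of interest
```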
As noted, the upper portion of object 20 is intentionally shown more distant than the lower portion, which is to say distance Z3>Z2>Z1. In a range finder autofocus camera environment, one might try to have devices 60 approximate the average distance from the camera (e.g., from Z=0) to object 20 by examining relative luminosity data obtained from the object. In some applications, e.g., range finding binoculars, the field of view is sufficiently small that all objects in focus will be at substantially the same distance. But in general, luminosity-based systems do not work well. For example, in FIG. 1 the upper portion of object 20 is shown darker than the lower portion, and presumably is more distant than the lower portion. But in the real world, the more distant portion of an object could instead be shinier or brighter (e.g., reflect more optical energy) than a closer but darker portion of an object. In a complicated scene, it can be very difficult to approximate the focal distance to an object or subject standing against a background using change in luminosity to distinguish the subject from the background. In such applications, circuits 80, 90, 100 within system 10 in FIG. 1 would assist in this signal processing. As noted, if IC 70 includes CCDs 60, other processing circuitry such as 80, 90, 100 is formed off-chip.
Unfortunately, reflected luminosity data does not provide a truly accurate rendering of distance because the reflectivity of the object is unknown. Thus, a distant object with a shiny surface may reflect as much light as (or perhaps more than) a closer object with a dull finish.
Other focusing systems are known in the art. Infrared (IR) autofocus systems for use in cameras or binoculars produce a single distance value that is an average or a minimum distance to all targets within the field of view. Other camera autofocus systems often require mechanical focusing of the lens onto the subject to determine distance. At best these prior art focus systems can focus a lens onto a single object in a field of view, but cannot simultaneously measure distance for all objects in the field of view.
In general, a reproduction or approximation of original luminosity values in a scene permits the human visual system to understand what objects were present in the scene and to estimate their relative locations stereoscopically. For non-stereoscopic images such as those rendered on an ordinary television screen, the human brain assesses apparent size, distance and shape of objects using past experience. Specialized computer programs can approximate object distance under special conditions.
Stereoscopic images allow a human observer to more accurately judge the distance of an object. However, it is challenging for a computer program to judge object distance from a stereoscopic image. Errors are often present, and the necessary signal processing demands specialized hardware and computation. Stereoscopic images are at best an indirect way to produce a three-dimensional image suitable for direct computer use.
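For context, stereo range estimation is essentially triangulation: given a focal length f, a baseline B, and a measured pixel disparity d, depth follows as Z ≈ f·B/d. The sketch below is a hypothetical illustration (not part of the patent, and the numbers are made up); it highlights that the distance is obtained only indirectly, after the correspondence problem has been solved to yield d.
```python
# Hypothetical stereo triangulation sketch; values are illustrative only.
def stereo_depth(focal_length_px, baseline_m, disparity_px):
    """Estimate depth Z from pixel disparity: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (correspondence failed?)")
    return focal_length_px * baseline_m / disparity_px

# e.g., f = 800 px, B = 0.06 m, measured disparity = 12 px
print(stereo_depth(800, 0.06, 12))   # -> 4.0 m
```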
Many applications require directly obtaining a three-dimensional rendering of a scene. But in practice it is difficult to accurately extract distance and velocity data along a viewing axis from luminosity measurements. Nonetheless many applications require accurate distance and velocity tracking, for example an assembly line welding robot that must determine the precise distance and speed of the object to be welded. The necessary distance measurements may be erroneous due to varying lighting conditions and other shortcomings noted above. Such applications would benefit from a system that could directly capture three-dimensional imagery.
Although specialized three-dimensional imaging systems exist in the nuclear magnetic resonance and scanning laser tomography fields, such systems require substantial equipment expenditures. Further, these systems are obtrusive and are dedicated to specific tasks, e.g., imaging internal body organs.
In other applications, scanning laser range finding systems raster scan an image by using mirrors to deflect a laser beam in the x-axis and perhaps the y-axis plane. The angle of deflection of each mirror is used to determine the coordinate of the image pixel being sampled. Such systems require precision detection of the angle of each mirror to determine which pixel is currently being sampled. Understandably, having to provide precision moving mechanical parts adds bulk, complexity, and cost to such range finding systems. Further, because these systems sample each pixel sequentially, the number of complete image frames that can be sampled per unit time is limited.
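As a rough illustration of the bookkeeping such scanners entail (an assumption-laden sketch, not the patent's method or any particular product's), the mapping from measured mirror deflection angles to the pixel currently being sampled might look like the following; the field-of-view and resolution values are hypothetical.
```python
# Illustrative sketch (not from the patent): mapping mirror deflection angles to the
# image pixel currently being sampled. Field of view and resolution are assumed values.

FOV_X_DEG, FOV_Y_DEG = 40.0, 30.0     # assumed scan field of view
RES_X, RES_Y = 320, 240               # assumed pixel resolution

def pixel_from_mirror_angles(theta_x_deg, theta_y_deg):
    """Convert measured mirror deflection angles to the (x, y) pixel being sampled."""
    # Normalize each angle from [-FOV/2, +FOV/2] to [0, 1], then scale to pixel indices.
    nx = (theta_x_deg + FOV_X_DEG / 2) / FOV_X_DEG
    ny = (theta_y_deg + FOV_Y_DEG / 2) / FOV_Y_DEG
    x = min(RES_X - 1, max(0, int(nx * RES_X)))
    y = min(RES_Y - 1, max(0, int(ny * RES_Y)))
    return x, y

print(pixel_from_mirror_angles(0.0, 0.0))      # center of scan -> (160, 120)
print(pixel_from_mirror_angles(-20.0, -15.0))  # extreme deflection -> (0, 0)
```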
In summation, there is a need for a system that can produce direct three-dimensional imaging. Preferably such a system should be implementable on a single IC that includes both detectors and circuitry to process detection signals. Such a single-IC system should be implementable using CMOS fabrication techniques, should require few discrete components, and should have no moving components. Optionally, the system should be able to output data from the detectors in a non-sequential or random fashion. Very preferably, such a system should require relatively low peak light-emitting power so that inexpensive light emitters may be employed.
The present invention provides such a system.
SUMMARY OF THE PRESENT INVENTION
The present invention provides a system that measures distance and velocity data in real time using time-of-flight (TOF) data rather than relying upon luminosity data. The system is CMOS-compatible and provides such three-dimensional imaging without requiring moving parts. The system may be fabricated on a single IC containing both a two-dimensional array of detectors and the circuitry needed to process their detection signals.
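For reference, the basic time-of-flight relation converts the measured round-trip time t of an emitted light pulse into distance as Z = c·t/2. The sketch below is a minimal illustration of that relation only (the timing value is hypothetical), not the patent's pixel-level implementation.
```python
# Minimal time-of-flight sketch; the timing value is hypothetical.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_time_s):
    """Distance to the reflecting surface: light covers 2*Z in the round-trip time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A 20 ns round trip corresponds to roughly 3 m.
print(tof_distance(20e-9))   # ~2.998 m
```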
